In this research, the robust M-estimators for the cubic smoothing splines technique, which avoid the problems of non-normality and error contamination in the data, are compared with the traditional estimation method for the cubic smoothing splines technique. The comparison uses two evaluation criteria (MADE and WASE) across different sample sizes and levels of dispersion to estimate the time-varying coefficient functions for balanced longitudinal data. Such data consist of observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of (m) specified time points, so that the repeated measurements are correlated within subjects and independent across subjects.
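The two comparison criteria can be sketched as follows. This is a minimal illustration, assuming MADE is the median absolute deviation between the estimated and true coefficient curves over the evaluation grid, and WASE is a weighted average squared error with uniform weights by default; the toy curves are not the paper's data.

```python
import numpy as np

def made(est, f_true):
    # Median Absolute Deviation Error over the evaluation grid
    return float(np.median(np.abs(est - f_true)))

def wase(est, f_true, weights=None):
    # Weighted Average Squared Error; uniform weights by default
    w = np.ones_like(est) if weights is None else np.asarray(weights, float)
    return float(np.sum(w * (est - f_true) ** 2) / np.sum(w))

# toy comparison of two hypothetical fits to a sine coefficient function
t = np.linspace(0, np.pi, 101)
f_true = np.sin(t)
fit_a = f_true + 0.01                  # small uniform bias
fit_b = f_true + 0.05 * np.cos(3 * t)  # larger oscillating error
print(made(fit_a, f_true), made(fit_b, f_true))
```

A fit with smaller MADE and WASE is closer to the true coefficient function; the criteria respond differently to outlying errors, which is why both are reported.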
This paper discusses the use of H2 and H∞ robust control approaches for designing control systems. These approaches are applied to elementary control system designs, and their respective implementations, advantages, and drawbacks are introduced. H∞ control synthesis mainly enforces closed-loop stability while covering certain physical constraints and limitations, whereas noise rejection and disturbance attenuation are more naturally expressed as a performance optimization, which corresponds to the H2 control synthesis problem. The paper also applies these two methodologies to multi-plant systems to study the stability and performance of the designed controllers. Simulation results show that the H2 controller tracks a desirable cl
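The H2 performance measure referred to above can be computed from a state-space model via the controllability Gramian. The sketch below is a standard textbook computation, not the paper's own design; the first-order plant is an assumed example.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def h2_norm(A, B, C):
    # Controllability Gramian P solves A P + P A^T + B B^T = 0;
    # the H2 norm of C (sI - A)^{-1} B is sqrt(trace(C P C^T)).
    P = solve_continuous_lyapunov(A, -B @ B.T)
    return float(np.sqrt(np.trace(C @ P @ C.T)))

# assumed first-order stable plant 1/(s + 1)
A = np.array([[-1.0]])
B = np.array([[1.0]])
C = np.array([[1.0]])
print(h2_norm(A, B, C))  # 1/sqrt(2) for this plant
```

An H2 synthesis minimizes this norm for the closed loop, which is why it naturally captures noise rejection and disturbance attenuation.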
In this research, a beam expander for the Nd:YAG laser with a 1.06 μm wavelength was designed and studied at 5X magnification with narrow divergence at room temperature, using ZEMAX to model the system. Its performance was evaluated via the ZEMAX outputs: the spot diagram (RMS spot radius), the ray fan plot, the geometric encircled energy, and the value of the focal shift. The effect of the field of view on these outputs at room temperature was then studied.
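The basic first-order relations behind a 5X expander can be illustrated numerically. The focal lengths and input divergence below are assumed for illustration only, not taken from the paper's design: for a two-lens Keplerian expander the magnification is M = f2/f1, and the output divergence shrinks by the same factor.

```python
# Keplerian beam-expander relations (illustrative values, not the
# paper's actual ZEMAX design): M = f2 / f1, theta_out = theta_in / M.
f1_mm, f2_mm = 20.0, 100.0     # assumed lens focal lengths for 5X
theta_in_mrad = 1.0            # assumed input full-angle divergence

M = f2_mm / f1_mm
theta_out_mrad = theta_in_mrad / M
print(M, theta_out_mrad)  # 5.0, 0.2
```

This divergence reduction is precisely why beam expanders are used ahead of long-path Nd:YAG delivery optics.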
The Weibull distribution is considered a member of the Type-I Generalized Extreme Value (GEV) family of distributions, and it plays a crucial role in modeling extreme events in various fields, such as hydrology, finance, and environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution due to their ability to incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and the linear exponential (LINEX) loss functions; they were compared via Monte Carlo simulation. The performance of these methods is assessed based on their accuracy and computational efficiency in estimati
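The two loss functions lead to different Bayes estimators, which can be sketched from posterior draws. This is a generic illustration with an assumed normal stand-in posterior, not the paper's shrinkage estimators: under squared error loss the Bayes estimator is the posterior mean, while under LINEX loss with shape a it is -(1/a) log E[exp(-a θ)].

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-in posterior draws for a parameter theta (assumed, for illustration)
theta = rng.normal(loc=2.0, scale=0.3, size=100_000)

# squared-error loss -> posterior mean
est_self = theta.mean()

# LINEX loss with shape a -> -(1/a) * log E[exp(-a * theta)]
a = 1.0
est_linex = -np.log(np.mean(np.exp(-a * theta))) / a
print(est_self, est_linex)
```

For a > 0 the LINEX estimator sits below the posterior mean (for a normal posterior, by a·variance/2), penalizing overestimation more heavily than underestimation.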
The two-frequency shell model approach is used to calculate the ground-state matter density distribution and the corresponding root mean square radii of the two-proton 17Ne halo nucleus, with the assumption that the model space of the 15O core nucleus differs from the model space of the two extra loosely bound valence protons. Two different size parameters, bcore and bhalo, of the single-particle wave functions of the harmonic oscillator potential are used. The calculations are carried out for different configurations of the outer halo protons in the 17Ne nucleus, and the structure of this halo nucleus shows that the dominant configuration is when the two halo protons are in the 1d5/2 orbi
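The role of the two size parameters can be illustrated with the simplest harmonic-oscillator orbital. The sketch below numerically integrates the rms radius of a 0s state for two assumed values of b (the values are illustrative, not the paper's fitted bcore and bhalo); a larger b directly produces the extended density tail characteristic of a halo.

```python
import numpy as np

def rms_radius_0s(b, rmax=20.0, n=20001):
    # 0s harmonic-oscillator density profile ~ exp(-r^2 / b^2);
    # <r^2> = ∫ r^4 rho dr / ∫ r^2 rho dr, evaluated on a fine grid.
    r = np.linspace(0.0, rmax, n)
    rho = np.exp(-(r / b) ** 2)
    mean_r2 = np.sum(r ** 4 * rho) / np.sum(r ** 2 * rho)
    return float(np.sqrt(mean_r2))

b_core, b_halo = 1.75, 2.5   # assumed size parameters in fm, for illustration
print(rms_radius_0s(b_core), rms_radius_0s(b_halo))
```

Analytically the 0s rms radius is b·sqrt(3/2), so the halo orbital's radius scales linearly with bhalo, which is how a two-parameter model separates the compact core from the extended valence protons.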
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of available data sources on the World Wide Web. Web-based geospatial data sources can thus be managed by different communities, and the data themselves can vary with respect to quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI) data, such as the data derived from the OpenStreetMap (OSM) project. Useful descriptions o
This paper deals with constructing a fuzzy linear programming model, with an application to the fuel products of the Dura refinery, which comprise seven products that directly affect daily consumption. The model consists of an objective function representing the selling prices of the products, fuzzy production constraints, fuzzy demand constraints, and production-requirement constraints. The WIN QSB program was then used to find the optimal solution.
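The crisp core of such a model can be sketched with a small linear program. The prices and constraints below are an assumed toy instance, not the refinery's seven-product data: the objective represents selling prices to maximize, subject to capacity- and demand-style constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative crisp LP (assumed data, not the Dura refinery model):
# maximize revenue 3*x1 + 5*x2 subject to production constraints.
c = [-3.0, -5.0]                           # linprog minimizes, so negate prices
A_ub = [[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]]
b_ub = [4.0, 12.0, 18.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)                     # optimal plan and revenue
```

In the fuzzy version, the right-hand sides b_ub carry tolerances and the model is typically resolved by maximizing a satisfaction level over those tolerances; the crisp LP above is the limiting case with zero tolerance.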
Different MLP architectures of artificial neural networks have been trained by backpropagation and used to analyze Landsat TM images. Two different approaches have been applied for training: an ordinary approach (for one hidden layer, M-H1-L, and two hidden layers, M-H1-H2-L) and a one-against-all strategy (for one hidden layer, (M-H1-1)xL, and two hidden layers, (M-H1-H2-1)xL). Classification accuracy of up to 90% has been achieved using the one-against-all strategy with the two-hidden-layer architecture. The performance of the one-against-all approach is slightly better than that of the ordinary approach.
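The one-against-all idea above trains one binary classifier per class and classifies by the highest score. A minimal sketch, assuming single-unit logistic classifiers on a toy three-cluster dataset rather than the paper's MLPs and Landsat pixels:

```python
import numpy as np

rng = np.random.default_rng(1)
# three well-separated 2-D clusters as a stand-in for pixel feature vectors
centers = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0]])
X = np.vstack([c + 0.3 * rng.standard_normal((30, 2)) for c in centers])
y = np.repeat([0, 1, 2], 30)

def train_one_vs_all(X, y, n_classes, lr=0.5, epochs=300):
    # one logistic unit per class, trained class-vs-rest by gradient ascent
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    W = np.zeros((n_classes, Xb.shape[1]))
    for k in range(n_classes):
        t = (y == k).astype(float)
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ W[k], -30, 30)))
            W[k] += lr * (t - p) @ Xb / len(Xb)
    return W

W = train_one_vs_all(X, y, 3)
Xb = np.hstack([X, np.ones((len(X), 1))])
pred = np.argmax(Xb @ W.T, axis=1)              # highest score wins
acc = (pred == y).mean()
print(acc)
```

Replacing each logistic unit with an M-H1-1 or M-H1-H2-1 network gives the architectures named in the abstract; the decision rule stays the same argmax over per-class scores.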
The design of sampling plans was and still is one of the most important subjects, because it gives the lowest cost compared with other methods; the lifetime statistical distribution should be known in order to obtain the best estimators for the parameters of the sampling plan and hence the best plan.
This research deals with designing a sampling plan when the lifetime distribution follows the logistic distribution with () as location and shape parameters; this information helps in obtaining the number of groups and the sample size associated with rejecting or accepting the lot.
Experimental results for simulated data show the least number of groups and the sample size needed to reject or accept the lot with a certain probability of
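The acceptance logic behind such a plan can be sketched directly. This is a generic truncated life-test calculation with assumed plan values (n, c, t0) and assumed logistic parameters, not the paper's simulated design: an item fails before the truncation time t0 with probability p = F(t0), and the lot is accepted if at most c of n items fail.

```python
import math

def logistic_cdf(t, mu, s):
    # CDF of the logistic lifetime distribution (location mu, scale s)
    return 1.0 / (1.0 + math.exp(-(t - mu) / s))

def accept_probability(n, c, t0, mu, s):
    # failures before t0 are Binomial(n, p) with p = F(t0);
    # lot is accepted when at most c failures are observed
    p = logistic_cdf(t0, mu, s)
    return sum(math.comb(n, d) * p ** d * (1 - p) ** (n - d)
               for d in range(c + 1))

# assumed plan: n = 20 items, acceptance number c = 2, test time t0 = 1.0,
# logistic lifetime parameters mu = 2.0, s = 0.5
print(accept_probability(20, 2, 1.0, 2.0, 0.5))
```

Designing the plan amounts to searching over n (or the number of groups) and c until this probability meets the required producer's and consumer's risk levels.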
This paper is concerned with finding solutions to free-boundary inverse coefficient problems. Mathematically, we handle a one-dimensional non-homogeneous heat equation subject to initial and boundary conditions as well as non-localized integral observations of zeroth and first-order heat momentum. The direct problem is solved for the temperature distribution and the non-localized integral measurements using the Crank–Nicolson finite difference method. The inverse problem is solved by simultaneously finding the temperature distribution, the time-dependent free-boundary function indicating the location of the moving interface, and the time-wise thermal diffusivity or advection velocities. We reformulate the inverse problem as a non-
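The Crank–Nicolson scheme used for the direct problem can be sketched on the simplest case. This is a minimal illustration on the homogeneous heat equation with fixed boundaries and an assumed diffusivity, not the paper's non-homogeneous free-boundary formulation:

```python
import numpy as np

def crank_nicolson_heat(alpha, L=1.0, T=0.1, nx=51, nt=200):
    # Solve u_t = alpha * u_xx on [0, L], u(0)=u(L)=0,
    # u(x, 0) = sin(pi x / L), with the Crank-Nicolson scheme.
    x = np.linspace(0.0, L, nx)
    dx, dt = x[1] - x[0], T / nt
    r = alpha * dt / (2 * dx ** 2)
    n = nx - 2                                   # interior points only
    I = np.eye(n)
    D = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    A, B = I - r * D, I + r * D                  # (I - rD) u^{m+1} = (I + rD) u^m
    u = np.sin(np.pi * x / L)[1:-1]
    for _ in range(nt):
        u = np.linalg.solve(A, B @ u)
    return x, np.concatenate(([0.0], u, [0.0]))

x, u = crank_nicolson_heat(alpha=1.0)
exact = np.exp(-np.pi ** 2 * 0.1) * np.sin(np.pi * x)
err = np.max(np.abs(u - exact))
print(err)  # small second-order discretization error
```

The scheme is unconditionally stable and second-order accurate in both space and time, which is why it is a common choice for the forward solver inside an inverse-problem iteration.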
Hypercholesterolemia is a predominant risk factor for atherosclerosis and cardiovascular disease (CVD). The World Health Organization (WHO) recommends reducing the intake of cholesterol and saturated fats. On the other hand, limited evidence is available on the benefits of vegetables in the diet for reducing these risk factors, so this research was conducted to compare the hypolipidemic effect of the extracts of two different types of Iraqi peppers: the fruit of the genus Capsicum, traditionally known as red pepper extract (RPE), and Piper nigrum, as black pepper extract (BPE), on different parameters and on the histology of the liver of the experimental animals. The red pepper was extracted by ethyl acetate, while the black pepp