This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to build uncorrelated linear combinations from only a subset of explanatory variables that may suffer from multicollinearity, rather than from the whole set of, say, K of them. The shrinkage forces some coefficients to equal exactly zero by constraining them with a tuning parameter, say t, which balances bias against variance on one side while keeping the percentage of variance explained by the components at an acceptable level. This is demonstrated using the MSE criterion in the regression case and the percentage of explained variance in the principal-component case.
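A minimal sketch of this idea, assuming a LASSO-style soft-thresholding of principal-component loadings; the simulated data, the soft_threshold helper, and the grid of t values are illustrative assumptions, not the study's actual algorithm:

```python
# Soft-threshold the first PC's loadings: small loadings become exactly zero,
# trading a little explained variance for a sparser, more interpretable component.
import numpy as np

def soft_threshold(v, t):
    """Shrink each loading toward zero by t; loadings below t vanish."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 3] = X[:, 0] + 0.05 * rng.normal(size=200)    # induce multicollinearity

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
v = Vt[0]                                          # first PC loadings

for t in (0.0, 0.1, 0.3):
    w = soft_threshold(v, t)
    if np.any(w):
        w = w / np.linalg.norm(w)                  # re-normalise the direction
    explained = np.var(Xc @ w) / np.var(Xc @ v)    # share of PC1 variance kept
    print(f"t={t:.1f}  nonzero loadings={np.count_nonzero(w)}  "
          f"relative explained variance={explained:.3f}")
```

Larger t zeroes out more loadings while the retained share of explained variance shrinks, which is the bias-variance balance the abstract describes.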
Researchers have shown great interest in studying black-box models. This thesis focuses on one of them, the ARMAX model, an important model whose special cases include the AR, MA, ARMA, and ARX models. It combines a time-series component that depends on historical data with a regression component of explanatory variables, in addition to past errors. The importance of the ARMAX model has appeared in many areas of application that touch our daily lives directly. Constructing an ARMAX model involves several traditional stages, beginning with identification…
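A minimal sketch of fitting an ARMAX model with statsmodels; the simulated series, the single exogenous regressor, and the (1, 0, 1) order are illustrative assumptions, not the thesis's data or final model:

```python
# An ARMAX(1,1): SARIMAX with order=(1, 0, 1) plus an exogenous regressor.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)                     # exogenous explanatory variable
e = rng.normal(scale=0.5, size=n)          # white-noise errors
y = np.zeros(n)
for t in range(1, n):
    # y_t = 0.6*y_{t-1} + 1.5*x_t + e_t + 0.4*e_{t-1}
    y[t] = 0.6 * y[t - 1] + 1.5 * x[t] + e[t] + 0.4 * e[t - 1]

model = SARIMAX(y, exog=x, order=(1, 0, 1))
result = model.fit(disp=False)
print(result.params)   # exog coefficient, AR(1), MA(1), error variance
```

The estimated coefficients should land near the generating values (1.5, 0.6, 0.4), which is the identification-then-estimation workflow the abstract outlines.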
An experimental study of a KIA Pride (SAIPA 131) car model at 1:14 scale was carried out in the wind tunnel alongside tests on the real car. Passive flow-control modifications (a vortex generator, a spoiler, and a slice diffuser) were added to the car to reduce the drag force, an undesirable characteristic that increases fuel consumption and toxic exhaust gases. Two types of calculation were used to determine the drag force acting on the car body: first, integrating the pressure values recorded along the pressure taps (for both the wind-tunnel and real-car testing); second, measuring the force with a one-component balance device (wind-tunnel testing). The results show that the average…
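A minimal sketch of the first calculation route, numerically integrating surface pressures recorded at discrete taps; the tap pressures, panel angles, and panel areas below are made-up illustrative values, not the study's measurements:

```python
# Pressure drag as a discrete surface integral over the tapped panels:
# each panel contributes p * A * cos(theta) to the streamwise force,
# where theta is the panel normal's angle to the free-stream direction.
import numpy as np

p     = np.array([120.0, 95.0, 40.0, -15.0, -60.0, -35.0])   # Pa (gauge)
theta = np.radians([10, 30, 60, 100, 150, 175])              # panel angles
area  = np.full(6, 2.0e-3)                                   # m^2 per panel

drag = np.sum(p * area * np.cos(theta))
print(f"pressure drag ≈ {drag:.4f} N")
```

The balance device gives the same quantity directly; comparing the two is the study's cross-check on the pressure-integration estimate.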
... Show MoreMany approaches of different complexity already exist to edge detection in
color images. Nevertheless, the question remains of how different are the results
when employing computational costly techniques instead of simple ones. This
paper presents a comparative study on two approaches to color edge detection to
reduce noise in image. The approaches are based on the Sobel operator and the
Laplace operator. Furthermore, an efficient algorithm for implementing the two
operators is presented. The operators have been applied to real images. The results
are presented in this paper. It is shown that the quality of the results increases by
using second derivative operator (Laplace operator). And noise reduced in a good
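A minimal sketch comparing the two operators with OpenCV; the input file name "car.png", the smoothing step, and the kernel sizes are assumptions, not the paper's specific algorithm:

```python
# First-derivative (Sobel) vs second-derivative (Laplace) edge maps.
import cv2
import numpy as np

img  = cv2.imread("car.png")                       # color input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.GaussianBlur(gray, (3, 3), 0)           # mild denoising first

# Sobel: gradient magnitude from horizontal and vertical responses
gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
sobel = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))

# Laplace: single second-derivative response
laplace = cv2.convertScaleAbs(cv2.Laplacian(gray, cv2.CV_64F, ksize=3))

cv2.imwrite("edges_sobel.png", sobel)
cv2.imwrite("edges_laplace.png", laplace)
```

Running both on the same smoothed input makes the paper's comparison concrete: the Laplacian responds to intensity curvature rather than slope, which is why pre-smoothing matters more for it.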
In this study, mucilage was extracted from Malabar spinach and tested for drag-reducing properties in aqueous liquids flowing through pipelines. Friction produced by liquids flowing in turbulent mode through pipelines increases power consumption. Drag-reducing agents (DRAs) such as polymers, suspended solids, and surfactants are used to reduce power losses. There is demand for natural, biodegradable DRAs, and mucilage is emerging as an attractive alternative to conventional DRAs. A literature review revealed that very little research has been done on the drag-reducing properties of this mucilage, so there is an opportunity to explore potential applications of mucilage from Malabar spinach. An experi…
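A minimal worked example of how drag reduction by a DRA is commonly quantified, as the percentage drop in pressure loss at equal flow rate; the two pressure-drop readings are made-up illustrative values, not the study's data:

```python
# %DR = (dP_without - dP_with) / dP_without * 100, at the same flow rate.
dp_water = 12.4   # kPa pressure drop over the test section, water only
dp_dra   = 9.1    # kPa with mucilage added, same flow rate

drag_reduction = (dp_water - dp_dra) / dp_water * 100.0
print(f"drag reduction ≈ {drag_reduction:.1f} %")   # ≈ 26.6 %
```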
The research aims to examine the effect of integrating the resource consumption accounting (RCA) system with enterprise resource planning (ERP) on both cost reduction and quality improvement. The study questionnaire was distributed to two different groups of respondents as the unit of analysis. The research reached various conclusions, the most important of which is that the integration relationship can help solve the particular difficulties in managing the economic unit's data. Moreover, the integration provides a clear picture of the causal relationships between resources, resource quantities, and associated costs.
A simple, sensitive, and accurate spectrophotometric method has been developed for the determination of salbutamol sulphate (SAB) and isoxsuprine hydrochloride (ISX) in pure form and in pharmaceutical dosage. The method involves oxidation of SAB and ISX with a known excess of N-bromosuccinimide in acidic medium, followed by consumption of the unreacted oxidant in decolorizing Evans blue dye (EB). The response, in the presence of SAB or ISX, was rectilinear over the ranges 1.0–12.0 and 1.0–11.0 µg/mL, with molar absorptivities of 4.21×10⁴ and 2.58×10⁴ L·mol⁻¹·cm⁻¹, respectively. The developed method was successfully applied to the determination of the studied drugs in their pharmaceutical dosage forms, resulting i…
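A minimal worked example of the Beer–Lambert relation A = ε·b·c that connects the molar absorptivities quoted above to a measurable absorbance; the molar mass and the mid-range concentration are illustrative assumptions:

```python
# Predicted absorbance for SAB at a mid-range concentration, A = eps * b * c.
epsilon = 4.21e4        # L mol^-1 cm^-1, value reported for SAB above
b       = 1.0           # cm, standard cuvette path length
mw      = 576.7         # g/mol, assumed molar mass of salbutamol sulphate

c_ug_ml = 6.0                        # µg/mL, assumed mid-range concentration
c_molar = c_ug_ml * 1e-3 / mw        # µg/mL -> g/L -> mol/L

A = epsilon * b * c_molar
print(f"predicted absorbance ≈ {A:.3f}")   # ≈ 0.44, comfortably measurable
```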
The intensity distribution of comet Ison C/2013 is studied through its histogram. The distribution reveals four distinct regions corresponding to the background, tail, coma, and nucleus. A one-dimensional temperature-distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the comet's intensity gradient shows very clearly that the arrows point toward the comet's maximum intensity.
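A minimal sketch of the gradient/quiver analysis described above, applied to a synthetic image with a single bright peak standing in for the comet; the image size, peak location, and arrow thinning are assumptions:

```python
# Gradient arrows of an intensity image point uphill, toward the brightest spot.
import numpy as np
import matplotlib.pyplot as plt

y, x = np.mgrid[0:64, 0:64]
image = np.exp(-((x - 40) ** 2 + (y - 24) ** 2) / 60.0)   # synthetic "nucleus"

gy, gx = np.gradient(image)          # np.gradient returns (d/dy, d/dx)

step = 4                             # thin the arrows for readability
plt.imshow(image, cmap="gray", origin="lower")
plt.quiver(x[::step, ::step], y[::step, ::step],
           gx[::step, ::step], gy[::step, ::step], color="red")
plt.title("Gradient arrows head toward the intensity maximum")
plt.savefig("quiver.png")
```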
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first restate the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using the maximum-likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.
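A minimal sketch of the estimation workflow only; the ESΓ density of Abdulah [1] is not reproduced here, so a plain Gamma log-likelihood stands in for it, and neg_loglik would be replaced by the ESΓ pdf in practice:

```python
# Moment estimators as starting values, refined by numerical maximum likelihood.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(2)
data = rng.gamma(shape=2.5, scale=1.2, size=500)   # "near-Gamma" sample

def neg_loglik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(stats.gamma.logpdf(data, a=shape, scale=scale))

# Gamma moment estimators: mean = k*theta, variance = k*theta^2
m, v = data.mean(), data.var()
start = (m ** 2 / v, v / m)

mle = optimize.minimize(neg_loglik, start, method="Nelder-Mead")
print("moment estimates :", start)
print("MLE estimates    :", mle.x)
```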
In this paper we explain how to use a Bayesian procedure to analyze a multiple linear regression model with missing data in the explanatory variables (X's), as the newly suggested method. We describe some missing-data patterns under the missing-completely-at-random (MCAR) mechanism and compare the Bayesian estimator with the complete-case estimator using a simulation procedure.
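A minimal sketch of the simulation setup and the complete-case baseline it is compared against; the Bayesian step itself is not reproduced, and the sample size, coefficients, and 30% missingness rate are assumptions:

```python
# Simulate a regression, delete covariate rows completely at random (MCAR),
# and contrast complete-case OLS with the full-data estimate.
import numpy as np

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=1.0, size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# MCAR: each row loses its covariates with probability 0.3, independent of data
missing = rng.random(n) < 0.3
keep = ~missing

print("full-data OLS     :", ols(X, y))
print("complete-case OLS :", ols(X[keep], y[keep]))
```

Under MCAR the complete-case estimator stays unbiased but loses precision, which is the inefficiency a Bayesian treatment of the missing X's aims to recover.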
The Dirichlet process is an important fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, and it is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample size; achieves a slow decay rate to its base distribution; has improved convergence and stability; and thrives with a Gaussian base distribution, which performs much better than the Gamma distribution. The performance depends…
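A minimal sketch of a Dirichlet process draw via the standard stick-breaking construction, with a Gaussian base distribution as favoured by the results above; the concentration alpha and the truncation level are assumptions:

```python
# Truncated stick-breaking sample G = sum_k w_k * delta(theta_k) from DP(alpha, G0).
import numpy as np

rng = np.random.default_rng(4)

def dp_sample(alpha=2.0, n_atoms=200, base=lambda size: rng.normal(0, 1, size)):
    betas = rng.beta(1.0, alpha, size=n_atoms)        # stick-breaking fractions
    sticks = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    weights = betas * sticks                          # w_k = beta_k * prod_{j<k}(1-beta_j)
    atoms = base(n_atoms)                             # theta_k ~ G0 (Gaussian here)
    return weights, atoms

w, theta = dp_sample()
order = np.argsort(w)[::-1]
print("top-5 cluster weights:", np.round(w[order][:5], 3))
print("their atom locations :", np.round(theta[order][:5], 3))
```

Smaller alpha concentrates mass on a few atoms (few clusters); larger alpha spreads it out, approaching the base distribution, which matches the decay behaviour the abstract reports.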