This research presents a study applying Principal Component Regression to a subset of the explanatory variables in order to limit the multicollinearity problem among them and to obtain estimates that are more stable than those produced by Ordinary Least Squares. The cost paid, on the other hand, is a small loss in the power of the estimated predictive regression function to explain the essential variation. The researchers propose and apply a numerical formula as an optimal solution and verify its efficiency with a program written by the researchers themselves for this purpose, using several criteria: Cumulative Percentage of Variance, Coefficient of Determination, Variance Inflation Factor, and Estimation Stability.
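The two core computations above, principal component regression and the Variance Inflation Factor criterion, can be sketched as follows. This is a minimal Python illustration under the assumption of standardized predictors, not the researchers' own program:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal Component Regression: regress y on the first
    n_components principal components of the standardized X."""
    # standardize predictors (the VIF diagnostic below assumes this too)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    # principal components via SVD of the standardized design matrix
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    Z = Xs @ Vt[:n_components].T                     # component scores
    gamma = np.linalg.lstsq(Z, y - y.mean(), rcond=None)[0]
    # back-transform component coefficients to the original predictors
    return Vt[:n_components].T @ gamma

def vif(X):
    """Variance Inflation Factor of each predictor: the diagonal of the
    inverse of the predictors' correlation matrix."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    R = np.corrcoef(Xs, rowvar=False)
    return np.diag(np.linalg.inv(R))
```

Dropping the small-variance components is what trades a little explanatory power for the gain in estimation stability that the abstract describes.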
In this research, fuzzy nonparametric methods based on several smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Company for Soft Drinks for the year 2016 (1/1/2016-31/12/2016). A sample of 148 observations was used to build a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the goodness-of-fit (G.O.F.) criterion across the three techniques, the lowest value of this criterion was obtained by the K-Nearest Neighbor method with a Gaussian function.
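A K-Nearest Neighbor smoother with Gaussian kernel weights, the technique the abstract reports as best, can be sketched as follows. This is an illustrative implementation, not the study's code; the neighborhood size k and bandwidth h are hypothetical choices:

```python
import numpy as np

def knn_gaussian_predict(x_train, y_train, x0, k=5, h=1.0):
    """k-nearest-neighbour regression with Gaussian kernel weights:
    average the responses of the k closest training points, each
    weighted by exp(-(d/h)^2 / 2) where d is its distance to x0."""
    d = np.abs(x_train - x0)
    idx = np.argsort(d)[:k]                  # indices of the k nearest points
    w = np.exp(-0.5 * (d[idx] / h) ** 2)     # Gaussian weights
    return np.sum(w * y_train[idx]) / np.sum(w)
```

In a fuzzy nonparametric setting the same smoother would be applied to the endpoints of fuzzy observations (e.g. low/modal/high prices) rather than to crisp values.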
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others.
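A standard way to fit such a finite mixture of regressions is the EM algorithm. The sketch below is illustrative only (not the paper's flexible-mixture implementation): it fits a two-component mixture of simple linear regressions and returns both the component parameters and the estimated membership probabilities, with a deliberately crude initialization:

```python
import numpy as np

def em_mixture_regression(x, y, n_iter=100):
    """EM for a two-component mixture of linear regressions:
    y ~ pi * N(a1 + b1*x, s1^2) + (1 - pi) * N(a2 + b2*x, s2^2).
    Returns (coefficients, variances, mixing weight, memberships)."""
    n = len(x)
    X = np.column_stack([np.ones(n), x])
    # crude initialization of responsibilities by splitting y at its median;
    # real data usually needs better starts (e.g. k-means, multiple restarts)
    r = 0.9 * (y > np.median(y)) + 0.05
    beta, sig2, pi = np.zeros((2, 2)), np.ones(2), 0.5
    for _ in range(n_iter):
        W = np.column_stack([r, 1 - r])
        # M-step: weighted least squares per component
        for j in range(2):
            w = W[:, j]
            XtW = X.T * w
            beta[j] = np.linalg.solve(XtW @ X, XtW @ y)
            resid = y - X @ beta[j]
            sig2[j] = np.sum(w * resid**2) / np.sum(w)
        pi = W[:, 0].mean()
        # E-step: posterior membership probabilities of component 0
        d0 = pi * np.exp(-(y - X @ beta[0])**2 / (2 * sig2[0])) / np.sqrt(sig2[0])
        d1 = (1 - pi) * np.exp(-(y - X @ beta[1])**2 / (2 * sig2[1])) / np.sqrt(sig2[1])
        r = d0 / (d0 + d1)
    return beta, sig2, pi, r
```

The returned responsibilities `r` are what the abstract calls inferred observation membership: each point is assigned to the component with the larger posterior probability.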
Variable selection is an essential and necessary task in the field of statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions of all the parameters are derived. The new variable selection method is tested on four simulated datasets and compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
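The paper's exact Gibbs scheme is not reproduced here; as a hedged illustration of Gibbs-sampler-based variable selection, the sketch below implements a classic spike-and-slab (SSVS-style) sampler with a known error variance. All hyperparameters (tau0, tau1, p, sigma2) are assumed values, not the paper's:

```python
import numpy as np

def ssvs_gibbs(X, y, sigma2=1.0, tau0=0.1, tau1=10.0, p=0.5,
               n_iter=2000, burn=500, seed=0):
    """Spike-and-slab variable selection via Gibbs sampling.
    beta_j | gamma_j=0 ~ N(0, tau0^2) (spike: effectively excluded),
    beta_j | gamma_j=1 ~ N(0, tau1^2) (slab: included),
    gamma_j ~ Bernoulli(p). Returns posterior inclusion probabilities."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    gamma = np.ones(k, dtype=int)
    XtX, Xty = X.T @ X, X.T @ y
    incl = np.zeros(k)
    for t in range(n_iter):
        # draw beta | gamma: Gaussian with precision XtX/sigma2 + D^-1
        d = np.where(gamma == 1, tau1**2, tau0**2)
        A = XtX / sigma2 + np.diag(1.0 / d)
        L = np.linalg.cholesky(A)
        mu = np.linalg.solve(A, Xty / sigma2)
        beta = mu + np.linalg.solve(L.T, rng.normal(size=k))
        # draw gamma_j | beta_j: Bernoulli from the spike/slab density ratio
        log_slab = -0.5 * beta**2 / tau1**2 - np.log(tau1)
        log_spike = -0.5 * beta**2 / tau0**2 - np.log(tau0)
        prob = 1.0 / (1.0 + (1 - p) / p * np.exp(log_spike - log_slab))
        gamma = (rng.random(k) < prob).astype(int)
        if t >= burn:
            incl += gamma
    return incl / (n_iter - burn)
```

Predictors whose posterior inclusion probability stays high across iterations are the ones the sampler selects.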
Encryption of data means translating data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are spread over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine which method is more accurate and yields the highest entropy. The first method is achieved by applying "CAST-128" and
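The entropy measure described above, the number of bits needed per pixel given the gray-level distribution, is the standard Shannon entropy of the image histogram. A minimal sketch, assuming 8-bit frames:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits per pixel) of an 8-bit image:
    H = -sum_g p(g) * log2 p(g) over the 256 gray levels."""
    hist = np.bincount(np.asarray(img, dtype=np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                     # 0 * log(0) is treated as 0
    return float(-np.sum(p * np.log2(p)))
```

A well-encrypted frame should look like uniform noise, so its entropy approaches the maximum of 8 bits per pixel, which is why entropy is used here to rank the two ciphers.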
This work was carried out to develop a national drug product containing 2.5 mg/ml clonazepam as an oral drop, used for the treatment of epilepsy in infants and children.
Several formulations were prepared using an oral drop base, flavor, buffer, sweeteners, and preservatives. Selection of the best formula relied solely on physicochemical testing of the samples.
A stability study was conducted on the product for six months at different temperatures to determine the expiration date and the best storage conditions.
The study yielded an oral drop in the form of a good, clear solution. The expiry date was calculated to be not less than two years.
The wastewater arising from pulp and paper mills is highly polluted and has to be treated before being discharged into rivers. Coagulation-flocculation using natural polymers has grown rapidly in wastewater treatment. In this work, the performance of alum and polyaluminum chloride (PACl), used alone and coupled with Fenugreek mucilage, in the treatment of pulp and paper mill wastewater was studied. The experiments were carried out as jar tests with alum, PACl, and Fenugreek mucilage dosages in the range of 50-2000 mg/L, rapid mixing at 200 rpm for 2 min followed by slow mixing at 40 rpm for 15 min, and a settling time of 30 min. The effectiveness of the Fenugreek mucilage was measured by the reduction of turbidity and Chemical Oxygen Demand
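The effectiveness measure used in such jar tests, the percentage reduction of turbidity or COD, reduces to a simple calculation over the dose scan. The helpers below are purely illustrative, and the values in the test are made-up numbers, not results from this study:

```python
def removal_percent(initial, final):
    """Percentage reduction of a pollution indicator
    (turbidity in NTU or COD in mg/L) after treatment."""
    return 100.0 * (initial - final) / initial

def best_dose(doses, residuals, initial):
    """Pick the coagulant dose (mg/L) giving the highest removal,
    given the residual indicator value measured at each dose."""
    removals = [removal_percent(initial, r) for r in residuals]
    i = max(range(len(doses)), key=lambda j: removals[j])
    return doses[i], removals[i]
```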
This paper is concerned with the statement and proof of the existence theorem for a unique solution (state vector) of coupled nonlinear hyperbolic equations (CNLHEQS) via the Galerkin method (GM) with the Aubin theorem. When the continuous classical boundary control vector (CCBCV) is known, the existence theorem of a CCBOCV with equality and inequality state vector constraints (EIESVC) is stated and proved, and the existence theorem of a unique solution of the adjoint coupled equations (ADCEQS) associated with the state equations is studied. The derivation of the Fréchet derivative of the Hamiltonian is obtained. Finally, the necessary theorem (necessary conditions, NCs) and the sufficient theorem (sufficient conditions, SCs) for optimality of the stat
Abstract:
In this research, we used different methods to estimate the scale parameter of the exponential distribution, such as the maximum likelihood estimator, the moments estimator, and the Bayes estimator under six different cases for the prior distribution of the scale parameter: the Lévy distribution, the Gumbel type-II distribution, the inverse chi-square distribution, the inverse gamma distribution, the improper prior distribution, and the
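For the exponential scale parameter, the maximum likelihood estimator (which coincides with the first-moment estimator) and the Bayes estimator under the conjugate inverse-gamma prior have simple closed forms. The sketch below is illustrative only, with assumed prior hyperparameters alpha and beta, and is not the researchers' program:

```python
import numpy as np

def mle_scale(x):
    """Maximum likelihood (and first-moment) estimator of the
    exponential scale parameter theta: the sample mean."""
    return float(np.mean(x))

def bayes_scale_invgamma(x, alpha=2.0, beta=1.0):
    """Bayes estimator of theta under the conjugate inverse-gamma prior
    IG(alpha, beta) and squared-error loss: the posterior mean,
    (beta + sum x_i) / (alpha + n - 1)."""
    x = np.asarray(x)
    return float((beta + x.sum()) / (alpha + len(x) - 1))
```

The other priors listed (Lévy, Gumbel type-II, inverse chi-square, improper) lead to different posterior means, but the estimation pattern is the same.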
In this study, we present a new steganography method based on quantizing the bands of perceptual color spaces. Four perceptual color spaces are used to test the new method, namely HSL, HSV, Lab, and Luv, with different algorithms used to compute the last two color spaces. The results confirm the validity of this approach as a steganographic method, and an analysis of the effects of the quantization and stegano processes on the quality of the cover image and of the perceptual color-space bands is presented.
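Quantization-based embedding in a single color-space band can be sketched with quantization-index modulation (QIM). This is an illustrative scheme, not necessarily the paper's exact method; delta is an assumed step size, and a real embedder must also keep the quantized values inside the band's valid range:

```python
import numpy as np

def qim_embed(band, bits, delta=8):
    """QIM embedding in one color-space band: each selected value is
    quantized to an even multiple of delta to hide bit 0 and to an odd
    multiple to hide bit 1."""
    flat = band.astype(float).ravel().copy()
    for i, b in enumerate(bits):
        q = np.round(flat[i] / delta)
        if int(q) % 2 != b:
            # shift to the adjacent quantization level of the right parity
            q += 1 if flat[i] / delta >= q else -1
        flat[i] = q * delta
    return flat.reshape(band.shape)

def qim_extract(band, n_bits, delta=8):
    """Recover the embedded bits: the parity of each quantized value."""
    flat = band.ravel()
    return [int(np.round(flat[i] / delta)) % 2 for i in range(n_bits)]
```

The step size delta controls the trade-off the abstract analyzes: a larger delta makes extraction more robust but degrades the cover band's quality more.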
Deep learning algorithms have recently achieved considerable success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple types of images (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning was used, followed by fine-tuning. Pre-trained architectures trained on the well-known ImageNet database were employed: the VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consist of five classes: a SAR image class (houses) and four non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv
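The transfer-learning pattern described (a frozen pre-trained feature extractor plus a newly trained classifier head) can be illustrated in miniature. The "backbone" below is a stand-in random projection, not VGG16, and all dimensions, learning rates, and class counts are assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pre-trained backbone (VGG16's convolutional base
# in the paper); here just a fixed random projection, purely illustrative.
W_frozen = rng.normal(size=(64, 16)) / np.sqrt(64)

def extract_features(x):
    """'Pre-trained' feature extractor; its weights are never updated."""
    return np.maximum(x @ W_frozen, 0.0)              # ReLU feature map

def train_head(feats, labels, n_classes=5, lr=0.5, epochs=500):
    """Train only the new softmax classification head on frozen features."""
    W = np.zeros((feats.shape[1], n_classes))
    onehot = np.eye(n_classes)[labels]
    for _ in range(epochs):
        logits = feats @ W
        logits -= logits.max(axis=1, keepdims=True)    # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * feats.T @ (p - onehot) / len(feats)  # cross-entropy gradient
    return W

def predict(x, W):
    return np.argmax(extract_features(x) @ W, axis=1)
```

Only the head's weights are updated, which is exactly why transfer learning needs far fewer labeled images than training the whole network; fine-tuning would then unfreeze some backbone layers.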