This research deals with a shrinkage method for principal components, similar to the one used in multiple regression known as the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero after restricting them with a "tuning parameter", say (t), which balances the amounts of bias and variance on one side while keeping the percentage of explained variance of these components at an acceptable level on the other. This was demonstrated by the MSE criterion in the regression case and by the percentage of explained variance in the principal-components case.
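The idea above can be illustrated with a minimal sketch: compute the first principal-component loading vector, then apply LASSO-style soft-thresholding with a tuning parameter t so that small loadings are forced to zero. This is an illustrative reconstruction under assumed data, not the paper's exact algorithm.

```python
import numpy as np

def sparse_pc_loadings(X, t):
    """Soft-threshold the first principal-component loading vector.

    LASSO-style shrinkage: loadings with absolute value below t are
    forced to zero, so the component uses only a subset of the K
    explanatory variables. (Illustrative sketch, not the paper's
    exact procedure.)
    """
    Xc = X - X.mean(axis=0)                     # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[0]                                   # first PC loadings
    shrunk = np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    if np.any(shrunk):
        shrunk /= np.linalg.norm(shrunk)        # renormalize to unit length
    # share of total variance explained by the shrunk component
    z = Xc @ shrunk
    explained = z.var() / Xc.var(axis=0).sum()
    return shrunk, explained

rng = np.random.default_rng(0)
# collinear predictors: x2 tracks x1 closely, x3 is independent
x1 = rng.normal(size=200)
X = np.column_stack([x1,
                     x1 + 0.05 * rng.normal(size=200),
                     rng.normal(size=200)])
loadings, ev = sparse_pc_loadings(X, t=0.3)
print(loadings, round(ev, 2))
```

With t = 0.3 the loading on the independent variable x3 is driven to zero, while the explained-variance share of the shrunk component stays well above half.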
Rainwater harvesting could be a possible solution for mitigating the consequences of water scarcity and energy deficiency in Iraq and the Kurdistan Region of Iraq (KRI). This study aims to calculate the water and energy (electricity) saved by rainwater harvesting from rooftops and green areas in Sulaimani city, KRI, Iraq. Various data were acquired from different formal entities in Sulaimani city, and Google Earth and ArcMap 10.4 software were used to digitize and calculate the total rooftop and green areas. The results showed that, for the runoff coefficients used (0.8 and 0.95), the harvested rainwater volumes were 2,901,563 and 12,197,131 m³ over the study periods (2005–2006) and (2019–2020). Moreover, by compa
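The volume calculation behind such studies is commonly V = C · A · P, where C is the runoff coefficient, A the catchment area, and P the rainfall depth. A minimal sketch follows; the areas and rainfall depth are made-up illustrative values, not the study's data.

```python
# Hedged sketch: harvested volume V = C * A * P (runoff coefficient
# times catchment area times rainfall depth).
def harvested_volume(runoff_coeff, area_m2, rainfall_mm):
    """Harvested rainwater volume in cubic metres."""
    return runoff_coeff * area_m2 * (rainfall_mm / 1000.0)

# Illustrative inputs only (not the study's measured areas/rainfall):
roof = harvested_volume(0.95, area_m2=5_000_000, rainfall_mm=650)
green = harvested_volume(0.8, area_m2=1_200_000, rainfall_mm=650)
print(round(roof), round(green))
```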
In this study, the lattice parameters, band structure, and optical characteristics of pure and V-doped ZnO are examined using ultrasoft pseudopotentials (USP) and the generalized gradient approximation (GGA) within first-principles calculations (FPC) based on density functional theory (DFT). The calculations are performed on optimized supercell geometries. After all parameters of the models are optimized, GGA+U is applied to the geometrical structures of all models to compute the total energy. The volume of the doped system grows as the content of the dopant V is increased. Pure and V-doped ZnO are investigated for band structure and energy bandgaps using the Monkhorst–Pack scheme's k-point sampling along the Brillouin-zone path (G-A-H-K-G-M-L-H). In the presence of high V content, the ban
This paper deals with constructing a mixed probability distribution by mixing exponential
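A mixture of exponential components can be sampled by first drawing a component index with the mixing weights and then drawing from that component. The sketch below assumes a two-component mixture with illustrative weights and rates; the paper's exact mixing distribution is not given in this excerpt.

```python
import numpy as np

# Hedged sketch of a two-component exponential mixture (weights and
# rates below are illustrative assumptions, not the paper's values).
def sample_exp_mixture(n, weights, rates, rng):
    comp = rng.choice(len(weights), size=n, p=weights)     # pick component
    return rng.exponential(scale=1.0 / np.asarray(rates)[comp])

rng = np.random.default_rng(1)
x = sample_exp_mixture(100_000, weights=[0.3, 0.7], rates=[1.0, 5.0], rng=rng)
# mixture mean = 0.3 * (1/1) + 0.7 * (1/5) = 0.44
print(round(x.mean(), 2))
```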
Graphene oxide (GO) was functionalized with 4-amino, 3-substituted 1H-1,2,4-triazole-5(4H)-thione (ASTT) to obtain GOT, which was characterized by FT-IR and XRD. Via modification of the working electrode of the SPCE with the prepared nanomaterial (GOT), the effects of scan rate and pH on the determination of amoxicillin (AMOX) were studied using cyclic voltammetry. AMOX showed various responses at pH ranging from 2 to 7, and a sharp increase in the oxidation peaks was observed at pH 3. The formal (midpoint) potential of AMOX was highly pH-dependent. From the effect of scan rate, the surface coverage concentration Γ of the electroactive species, the electron transfer coefficient, and the electron transfer rate constant ket were obtained as 5.39×
Cost estimation is considered one of the important tasks in construction project management; precise estimation of the construction cost affects the success and quality of a construction project. Elemental estimation is a very important stage for the project team because it represents one of the key project elements and helps formulate the basis for construction and engineering strategies and execution plans. Elemental estimation, carried out at an early stage, estimates construction costs from minimal project details and thus gives an indication for the initial design stage of a project. This paper studies the factors that affect elemental cost estimation as well as the rela
Interval methods for the verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef
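The dependency problem mentioned above can be shown in a few lines: evaluating x − x in naive interval arithmetic treats the two occurrences of x as independent, so the result overestimates the exact range {0}. This toy interval class is a sketch for illustration, not a verified-integration library.

```python
# Hedged sketch of the dependency problem in naive interval arithmetic.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # worst-case bounds: lo - hi and hi - lo
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(0.0, 1.0)
print(x - x)   # yields [-1.0, 1.0], although the true range of x - x is {0}
```

Taylor model methods reduce exactly this kind of overestimation by carrying a symbolic polynomial part alongside a small interval remainder.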
This research seeks to shed light on the benefit that intangible assets add to a company. This deserves careful consideration because such assets place the company in a strong competitive position, which stimulates other companies to acquire assets of their own.
Many companies have achieved competitive advantages in the market, some even attaining monopolies, increasing their value, and reaping extraordinary profits as a result of those assets. This calls for measuring such assets, to determine the extent of their contribution to the value added to the company on the one hand, and to make the presentatio
This research involved a rapid, automated, and highly accurate CFIA/MZ technique developed for the estimation of phenylephrine hydrochloride (PHE) in pure form, dosage forms, and a biological sample. The method is based on the oxidative coupling reaction of 2,4-dinitrophenylhydrazine (DNPH) with PHE, in the presence of sodium periodate as an oxidizing agent in alkaline medium, to form a red-colored product at λmax (520 nm). At a flow rate of 4.3 mL.min-1 using distilled water as the carrier, the FIA method proved to be a sensitive and economical analytical tool for the estimation of PHE.
Within the concentration range of 5-300 μg.mL-1 the calibration curve was rectilinear, and the detection limit was 3.252 μg.mL
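A rectilinear calibration curve and a detection limit of this kind are typically obtained by least-squares fitting of response against concentration, with the detection limit taken as 3.3·s/slope. The sketch below uses simulated responses, not the study's measurements of PHE.

```python
import numpy as np

# Hedged sketch: least-squares calibration curve and a detection limit
# computed as 3.3 * residual_sd / slope. All numbers are simulated.
conc = np.array([5.0, 50.0, 100.0, 200.0, 300.0])       # μg/mL
noise = np.array([0.001, -0.001, 0.0005, -0.0005, 0.0])  # illustrative scatter
resp = 0.004 * conc + 0.02 + noise                       # simulated signal
slope, intercept = np.polyfit(conc, resp, 1)
residual_sd = (resp - (slope * conc + intercept)).std(ddof=2)
lod = 3.3 * residual_sd / slope                          # detection limit
print(round(slope, 4), round(lod, 2))
```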
In this research, we present a nonparametric approach to the estimation of a copula density using different kernel density methods. Four copulas were used: the Gaussian, Gumbel, Clayton, and Frank copulas. Through various simulation experiments, we generated the standard bivariate normal distribution at sample sizes (50, 100, 250, and 500), under both high and low dependency. Different kernel methods were used to estimate the probability density function of the copula with the marginals of this bivariate distribution: the mirror-reflection (MR), beta kernel (BK), and transformation kernel (KD) methods; a comparison was then carried out between the three methods across all experiments using the integrated mean squared error. Furthermore, some
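The mirror-reflection estimator named above corrects kernel boundary bias on the unit square by reflecting each pseudo-observation at the boundaries (nine copies in total). A minimal sketch follows; the bandwidth, sample size, and the independence-copula test case are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Hedged sketch of the mirror-reflection (MR) kernel estimator of a
# copula density on [0,1]^2: each observation is reflected at 0 and 1
# in both coordinates to correct edge bias.
def mr_copula_density(u, v, U, V, h=0.1):
    def k(x):  # Gaussian kernel
        return np.exp(-0.5 * x * x) / np.sqrt(2.0 * np.pi)
    total = 0.0
    for ru in (U, -U, 2.0 - U):          # reflections in u
        for rv in (V, -V, 2.0 - V):      # reflections in v
            total += np.sum(k((u - ru) / h) * k((v - rv) / h))
    return total / (len(U) * h * h)

rng = np.random.default_rng(2)
# independence copula: U and V uniform and independent, true density = 1
U, V = rng.uniform(size=2000), rng.uniform(size=2000)
print(mr_copula_density(0.5, 0.5, U, V))
```

For independent uniforms, the estimate at an interior point such as (0.5, 0.5) should lie close to the true density 1.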