In this research, one of the most important and widely used models, the linear mixed model, is applied to the analysis of longitudinal data characterized by repeated measures. The linear mixed model is estimated by two approaches (parametric and nonparametric), which are used to estimate the conditional mean and the marginal mean of the model. A comparison among a number of models is made to obtain the best model for representing the mean wind speed in Iraq. The application concerns 8 meteorological stations in Iraq, selected randomly; monthly wind speed data were taken over ten years and averaged over each month of the corresponding years, giving separate clusters, each containing 12 observations that represent the mean wind speed of one station. The comparison among the best models is carried out using the mean square error (MSE) as the statistical criterion. For the parametric approach, the model with an additional random effect (the second model) is better than the model without the additional random effect (the first model) for all stations in general. For the nonparametric approach, the conditional local mixed model is generally better than the marginal local mixed model in estimating the conditional and marginal means of the mixed model; for the marginal mean in particular, the marginal local mixed model is better for all sampled stations except the fifth station, where the conditional local mixed model outperforms the marginal local mixed model.
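As a minimal illustration of the parametric comparison described above, the sketch below fits two linear mixed models with Python's statsmodels (a stand-in for whatever software the study used): one with a random intercept per station only, and one with an additional random effect (a random slope for month), compared by mean square error. The file name and column names (station, month, wind_speed) are hypothetical.

```python
# Sketch: compare a mixed model with and without an additional random effect by MSE.
# Assumes a long-format table with hypothetical columns: station, month, wind_speed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wind_speed_monthly.csv")  # hypothetical file: one row per station-month mean

# Model 1: random intercept per station only
m1 = smf.mixedlm("wind_speed ~ month", df, groups=df["station"]).fit()

# Model 2: additional random effect (random slope for month within station)
m2 = smf.mixedlm("wind_speed ~ month", df, groups=df["station"], re_formula="~month").fit()

mse1 = (m1.resid ** 2).mean()
mse2 = (m2.resid ** 2).mean()
print(f"MSE model 1 (no additional random effect): {mse1:.4f}")
print(f"MSE model 2 (additional random effect):    {mse2:.4f}")
```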
In light of developments in computer science and modern technologies, the impersonation crime rate has increased. Consequently, face recognition technology and biometric systems have been employed for security purposes in a variety of applications including human-computer interaction, surveillance systems, etc. Building a sophisticated model to tackle impersonation-related crimes is essential. This study proposes Machine Learning (ML) and Deep Learning (DL) classification models, utilizing Viola-Jones, Linear Discriminant Analysis (LDA), Mutual Information (MI), and Analysis of Variance (ANOVA) techniques. The two proposed facial classification systems are J48 with the LDA feature extraction method as input, and a one-dimen
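A rough sketch of such a pipeline, assuming Python with OpenCV and scikit-learn: Viola-Jones detection via a Haar cascade, ANOVA or mutual-information feature scoring, LDA feature extraction, and a CART decision tree as a stand-in for Weka's J48 (C4.5). Dataset loading, image size, and the selection parameter k are hypothetical, not the paper's configuration.

```python
# Sketch of a face-classification pipeline (illustrative, not the paper's exact setup).
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

detector = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_vector(path, size=(64, 64)):
    """Viola-Jones detection, then crop, resize and flatten the first detected face."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    x, y, w, h = boxes[0]                          # take the first detected face
    return cv2.resize(gray[y:y + h, x:x + w], size).ravel()

# Dataset loading omitted; image_paths and labels are hypothetical lists of
# face image files and the corresponding person identities.
X = np.vstack([face_vector(p) for p in image_paths])
y = np.array(labels)

model = make_pipeline(
    SelectKBest(score_func=f_classif, k=500),   # ANOVA scoring; or score_func=mutual_info_classif
    LinearDiscriminantAnalysis(),               # supervised feature extraction
    DecisionTreeClassifier(),                   # CART stand-in for J48
)
model.fit(X, y)
```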
In this article, we aim to define a universal set consisting of the subscripts of the fuzzy differential equation (5), except two of its elements; subsets of that universal set are defined according to certain conditions. The constructed universal set and its subsets are then used to suggest an analytical method that facilitates solving fuzzy initial value problems of any order by using strongly generalized H-differentiability. Valid sets, with graphs, for solutions of fuzzy initial value problems of higher orders are also found.
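For context, a commonly used consequence of strongly generalized H-differentiability (the standard level-set characterization from the literature, not a restatement of the article's equation (5)) is that, writing the α-levels of a fuzzy function as $[y(t)]^{\alpha}=[\underline{y}^{\alpha}(t),\overline{y}^{\alpha}(t)]$, the derivative takes one of two forms:

\[
[y'(t)]^{\alpha} =
\begin{cases}
\bigl[\,\underline{y}^{\alpha\prime}(t),\ \overline{y}^{\alpha\prime}(t)\,\bigr], & \text{if } y \text{ is (i)-differentiable},\\[4pt]
\bigl[\,\overline{y}^{\alpha\prime}(t),\ \underline{y}^{\alpha\prime}(t)\,\bigr], & \text{if } y \text{ is (ii)-differentiable},
\end{cases}
\]

so a fuzzy initial value problem of order n splits into systems of crisp ordinary differential equations for the lower and upper branches, one system per combination of differentiability cases.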
A numerical investigation has been performed to study radiation-affected, steady-state laminar mixed convection induced by a hot inner circular core at varied positions in a horizontal rectangular channel under fully developed flow. To examine the effects of thermal radiation on the thermo-fluid dynamic behavior in this eccentric channel geometry, a generalized body-fitted coordinate system is introduced, while the finite difference method is used to solve the radiative transport equation. The governing equations are the continuity, momentum, and energy equations; these are normalized and solved using the vorticity-stream function formulation. After validating the numerical results for the case without radiation, the detailed radiatio
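As a reference for the solution variables mentioned above, a typical non-dimensional vorticity-stream function formulation of 2-D mixed convection reads as follows (one common form; the paper's exact normalization and radiation source term are not given in the abstract):

\[
\nabla^{2}\psi = -\omega, \qquad
U = \frac{\partial \psi}{\partial Y}, \quad V = -\frac{\partial \psi}{\partial X},
\]
\[
U\frac{\partial \omega}{\partial X} + V\frac{\partial \omega}{\partial Y}
   = \frac{1}{Re}\nabla^{2}\omega + \frac{Gr}{Re^{2}}\frac{\partial \theta}{\partial X},
\qquad
U\frac{\partial \theta}{\partial X} + V\frac{\partial \theta}{\partial Y}
   = \frac{1}{Re\,Pr}\left(\nabla^{2}\theta - \nabla\!\cdot q_{r}\right),
\]

where \(\psi\) is the stream function, \(\omega\) the vorticity, \(\theta\) the dimensionless temperature, and \(q_r\) the non-dimensional radiative flux obtained from the radiative transport equation.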
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method and, for the Bayesian approach, the Markov Chain Monte Carlo (MCMC) method. Calculations are done using the R program. The analysis showed that the linear regression model under the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase over the next two decades
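A compact sketch of the two-approach comparison, on a synthetic yearly series rather than the study's R code and real Iraqi data: ordinary least squares versus a simple random-walk Metropolis sampler with a flat prior, compared by RMSE and MAD of held-out predictions.

```python
# Sketch: frequentist (OLS) vs. Bayesian (Metropolis MCMC) linear regression, compared by RMSE and MAD.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(30, dtype=float)                       # hypothetical yearly time index
y = 8.0 + 0.25 * t + rng.normal(0, 1.0, t.size)      # synthetic unemployment-rate series
X = np.column_stack([np.ones_like(t), t])
train, test = slice(0, 24), slice(24, 30)

# Frequentist: ordinary least squares
beta_ols, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

# Bayesian: random-walk Metropolis, flat prior, Gaussian likelihood (sigma fixed at 1 for brevity)
def log_like(beta):
    r = y[train] - X[train] @ beta
    return -0.5 * np.sum(r ** 2)

beta, draws = beta_ols.copy(), []
for _ in range(20000):
    prop = beta + rng.normal(0, 0.05, 2)
    if np.log(rng.random()) < log_like(prop) - log_like(beta):
        beta = prop
    draws.append(beta)
beta_bayes = np.mean(draws[5000:], axis=0)           # posterior mean after burn-in

for name, b in [("OLS", beta_ols), ("Bayes", beta_bayes)]:
    pred = X[test] @ b
    rmse = np.sqrt(np.mean((y[test] - pred) ** 2))
    mad = np.median(np.abs(y[test] - pred))          # median absolute deviation of prediction errors
    print(f"{name}: RMSE={rmse:.3f}  MAD={mad:.3f}")
```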
Methods of speech recognition have been the subject of several studies over the past decade. Speech recognition has been one of the most exciting areas of signal processing. The mixed transform is a useful tool for speech signal processing; it was developed for its ability to improve feature extraction. Speech recognition includes three important stages: preprocessing, feature extraction, and classification. Recognition accuracy is strongly affected by the feature extraction stage; therefore, different models of the mixed transform for feature extraction were proposed. The recorded isolated word is a 1-D signal, and each 1-D word is converted into a 2-D form. The second step of the word recognizer requires the
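The 1-D to 2-D conversion step can be illustrated as follows. This is a minimal sketch: overlapping frames of the word are stacked into a matrix, and a 2-D discrete cosine transform is applied as a generic stand-in, since the specific mixed transform is not detailed in the abstract; frame length and hop size are arbitrary.

```python
# Sketch: convert a 1-D isolated word into a 2-D form and take a 2-D transform of it.
import numpy as np
from scipy.fft import dctn

def word_to_2d(signal, frame_len=256, hop=128):
    """Stack overlapping frames of a 1-D signal into a 2-D matrix (frames x samples)."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    return np.stack([signal[i * hop:i * hop + frame_len] for i in range(n_frames)])

rng = np.random.default_rng(1)
word = rng.normal(size=8000)           # placeholder for a recorded isolated word
frames = word_to_2d(word)              # 2-D representation of the 1-D word
features = dctn(frames, norm="ortho")  # 2-D transform; low-order coefficients serve as features
print(frames.shape, features.shape)
```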
In this paper, a numerical model for fluid-structure interaction (FSI) analysis is developed for investigating the aeroelastic response of a single wind turbine blade. The Blade Element Momentum (BEM) theory was adopted to calculate the aerodynamic forces, considering the effects of wind shear and tower shadow. The wind turbine blade was modeled as a rotating cantilever beam discretized using the Finite Element Method (FEM) to analyze the deformation and vibration of the blade. The aeroelastic response of the blade was obtained by coupling these aerodynamic and structural models using a coupled BEM-FEM program written in MATLAB. The governing FSI equations of motion are iteratively calculated at each time step, through exchanging data between
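The iterative data exchange described above could be organized, in outline, like the loop below. This is a structural sketch only: compute_bem_loads and fem_step are hypothetical placeholders for the aerodynamic and structural solvers, and the actual program was written in MATLAB, not Python.

```python
# Structural sketch of a partitioned BEM-FEM coupling loop (placeholder solver functions).
import numpy as np

def compute_bem_loads(deflection, azimuth):
    """Hypothetical BEM aerodynamic loads, including wind shear / tower shadow effects."""
    return np.zeros_like(deflection)           # placeholder

def fem_step(loads, state, dt):
    """Hypothetical FEM time step for the rotating cantilever blade; returns the new state."""
    return state                               # placeholder

n_dof, dt, n_steps = 60, 0.01, 1000
state = {"u": np.zeros(n_dof), "v": np.zeros(n_dof)}

for step in range(n_steps):
    azimuth = step * dt * 1.57                 # example rotor speed [rad/s]
    u_prev = state["u"].copy()
    for _ in range(20):                        # sub-iterate until the exchanged data converge
        loads = compute_bem_loads(state["u"], azimuth)
        state = fem_step(loads, state, dt)
        if np.linalg.norm(state["u"] - u_prev) < 1e-8:
            break
        u_prev = state["u"].copy()
```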
This paper deals with how to estimate non-measured points of spatial data when the number of terms (the spatial sample) is small, which is not preferred for the estimation process, because it is known that the larger the data set, the better the estimates of the non-measured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of other, secondary (auxiliary) data that have a strong correlation with the primary (basic) data in order to estimate single non-measured points, as well as to measure the estimation variance. The co-kriging technique has been used in this field to build spatial predictions, and then this idea was applied to real data in th
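For reference, the ordinary co-kriging estimator that combines the primary variable Z with a correlated secondary (auxiliary) variable Y has the standard form (standard geostatistics notation, not taken from the paper itself):

\[
\hat{Z}(s_{0}) \;=\; \sum_{i=1}^{n} \lambda_{i}\, Z(s_{i}) \;+\; \sum_{j=1}^{m} \mu_{j}\, Y(s_{j}),
\qquad
\sum_{i=1}^{n} \lambda_{i} = 1, \quad \sum_{j=1}^{m} \mu_{j} = 0,
\]

with the weights obtained from the co-kriging system built on the direct and cross variograms; the same system also yields the co-kriging (estimation) variance at the unsampled location \(s_0\).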
Economic performance is one of the most important indicators of economic activity. As economic performance improves, the sources of output become more varied, economic growth rates and per capita national income increase, the business environment recovers, investment rates rise, and the effectiveness of the financial and monetary institutions and the credit market grows. This leads to higher employment rates, lower unemployment rates, the elimination of many social problems, and an improvement in the average per capita income as well as in the level of national income.
Input/output tables are a mathematical technique that indicates economic performance
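The computational core of the input/output technique is the Leontief model: given a technical-coefficients matrix A and a final-demand vector d, total output x solves x = Ax + d, i.e. x = (I − A)⁻¹ d. A toy numeric illustration, with figures invented purely for illustration:

```python
# Toy Leontief input-output calculation: x = (I - A)^(-1) d
import numpy as np

A = np.array([[0.2, 0.3],     # hypothetical technical coefficients for two sectors
              [0.4, 0.1]])
d = np.array([100.0, 50.0])   # hypothetical final demand

x = np.linalg.solve(np.eye(2) - A, d)
print("Total output by sector:", x)
```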
There are many methods for the estimation of permeability. In this paper, permeability has been estimated by two methods. The conventional and modified methods are used to calculate the flow zone indicator (FZI). Hydraulic flow units (HU) were identified by the FZI technique. This technique is effective in predicting permeability in uncored intervals/wells. HU is related to FZI and the rock quality index (RQI). All available cores from 7 wells (Su-4, Su-5, Su-7, Su-8, Su-9, Su-12, and Su-14) were used as the database for HU classification. The cumulative probability plot of FZI is used. The plot of core-derived probability of FZI for both the modified and conventional methods indicates 4 HUs (A, B, C and D) for the Nahr Umr Formation
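The FZI workflow referred to above rests on the standard hydraulic-flow-unit relations RQI = 0.0314·√(k/φ), φz = φ/(1 − φ), FZI = RQI/φz; a minimal sketch with made-up core values (not the Su wells' data):

```python
# Sketch: flow zone indicator (FZI) from core porosity/permeability (standard HU relations).
import numpy as np

phi = np.array([0.12, 0.18, 0.22, 0.25])   # hypothetical core porosity (fraction)
k = np.array([5.0, 40.0, 150.0, 600.0])    # hypothetical core permeability (mD)

rqi = 0.0314 * np.sqrt(k / phi)            # rock quality index (microns)
phi_z = phi / (1.0 - phi)                  # normalized (pore-to-matrix) porosity
fzi = rqi / phi_z                          # flow zone indicator

print(np.round(fzi, 2))
# Samples sharing a narrow FZI band belong to the same hydraulic flow unit (HU); in uncored
# wells, permeability can then be back-calculated from log porosity and the HU's mean FZI.
```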
This research aims to study dimension reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide a good estimation of the parameters, so this problem must be dealt with directly. Two approaches were used to address the problem of high-dimensional data: the first is the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and the second is principal component analysis (PCA), the general method used for reducing dimensions. SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
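To make the SIR/PCA distinction concrete, here is a minimal sliced inverse regression implementation alongside PCA on synthetic data (a numpy/scikit-learn sketch; the proposed WSIR weighting is not reproduced here):

```python
# Minimal sliced inverse regression (SIR) vs. PCA on synthetic high-dimensional data.
import numpy as np
from sklearn.decomposition import PCA

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Estimate effective dimension-reduction directions by sliced inverse regression."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    sigma = np.cov(Xc, rowvar=False)
    # Whiten the predictors with the inverse square root of the covariance matrix
    vals, vecs = np.linalg.eigh(sigma)
    root_inv = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ root_inv
    # Slice the response and average the whitened predictors within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original predictor scale
    _, evecs = np.linalg.eigh(M)
    return root_inv @ evecs[:, ::-1][:, :n_dirs]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = X[:, 0] / (0.5 + (X[:, 1] + 1.5) ** 2) + rng.normal(0, 0.1, 500)  # depends on 2 directions

B = sir_directions(X, y)                  # supervised: uses y to find the relevant subspace
pca = PCA(n_components=2).fit(X)          # unsupervised: ignores y entirely
print("SIR directions (columns):\n", np.round(B, 2))
print("PCA components (rows):\n", np.round(pca.components_, 2))
```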