This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates and must therefore be addressed directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general-purpose dimension-reduction method. Both SIR and PCA construct linear combinations of a subset of the original explanatory variables, which may suffer from heterogeneity and from multicollinearity among most of the explanatory variables. The new linear combinations produced by the two methods reduce the explanatory variables to one or more new dimensions, called the effective dimension. The root mean square error (RMSE) was used to compare the methods, and a simulation study was conducted for the comparison. The simulation results showed that the proposed weighted standard SIR method performed best.
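As a concrete illustration of the SIR step, the following is a minimal sketch of classical sliced inverse regression in Python; the slice count, the synthetic single-index data, and all names are illustrative assumptions, not details from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Estimate effective dimension-reduction (EDR) directions via SIR."""
    n, p = X.shape
    # Standardize X: Z = (X - mean) @ Sigma^(-1/2)
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into slices by the sorted response
    slices = np.array_split(np.argsort(y), n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_directions]

# Usage: project X onto the estimated effective dimension
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
index = X @ np.array([1.0, -1.0, 0, 0, 0, 0, 0, 0])
y = index + 0.5 * index**3 + 0.1 * rng.normal(size=500)
X_reduced = X @ sir_directions(X, y)
```

The reduced matrix `X_reduced` can then feed any low-dimensional regression, which is the point of working in the effective dimension.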
A theoretical study of the design of conformal microstrip antennas is introduced in this work. Conformal microstrip antennas are antennas that can be conformed to a certain shape or to any curved surface. They are used in high-speed trains, aircraft, defense and navigation systems, landing gear, various communication systems, and body-wearable devices. Conformal antennas have some advantages, such as wider-angle coverage compared with flat antennas and a low radar cross section (RCS), and they are suitable for use in radomes. The main disadvantage of these antennas is their narrow bandwidth. The FDTD method is extremely useful in simulating complicated structures because it allows direct integration of Maxwell's equations depending o…
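For context on the FDTD method mentioned above, here is a minimal one-dimensional Yee-scheme sketch in Python; the grid size, time-step count, normalized Courant number, and Gaussian source are illustrative choices, not the antenna model used in this work.

```python
import numpy as np

nz, nt = 200, 500          # spatial cells, time steps
ez = np.zeros(nz)          # electric field E_z on integer grid points
hy = np.zeros(nz - 1)      # magnetic field H_y on staggered half-cells
courant = 0.5              # normalized S = c*dt/dz, must be <= 1 for stability

for t in range(nt):
    # Update H from the spatial difference of E (discretized Faraday's law)
    hy += courant * (ez[1:] - ez[:-1])
    # Update E from the spatial difference of H (discretized Ampere's law)
    ez[1:-1] += courant * (hy[1:] - hy[:-1])
    # Soft Gaussian pulse source injected at the grid center
    ez[nz // 2] += np.exp(-((t - 30) / 10) ** 2)
```

The same leapfrog update generalizes to 3-D, which is what makes FDTD attractive for geometrically complicated, curved structures.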
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents expanding the search-space neighborhood toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA is limited by premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the…
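To make the FA mechanics concrete, below is a minimal sketch of the core firefly update (attractiveness decaying with distance, plus a random step) on a toy objective; all parameter values and the sphere function are illustrative, not from the paper.

```python
import numpy as np

def firefly_minimize(f, dim=2, n_fireflies=20, n_iter=100,
                     alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    """Basic firefly algorithm for minimizing a continuous objective f."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, size=(n_fireflies, dim))
    intensity = np.apply_along_axis(f, 1, pos)
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if intensity[j] < intensity[i]:          # j is brighter (lower cost)
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # distance-decayed attraction
                    pos[i] += (beta * (pos[j] - pos[i])
                               + alpha * (rng.random(dim) - 0.5))  # random step
                    intensity[i] = f(pos[i])
    best = np.argmin(intensity)
    return pos[best], intensity[best]

x_best, f_best = firefly_minimize(lambda x: np.sum(x ** 2))
```

The `alpha` random step is the only exploitation knob here, which is exactly why neighborhood search strategies are often grafted onto FA to delay premature convergence.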
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were then applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was therefore performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB)…
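As a hedged illustration of the supervised stage, the sketch below compares the five named classifiers with scikit-learn on synthetic data standing in for the non-public laboratory dataset; every setting is an assumption made for demonstration.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier          # scikit-learn's CART
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the laboratory data
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB": GaussianNB(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```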
Decision-makers in each country work to define a list of internal and external interests, goals, and threats to their countries, according to the nature of their awareness of those interests, goals, and threats.
Hence, Iraq is no exception to this rule: the evaluation of its interests and of its foreign-policy objectives is subject to the pattern of awareness of decision-makers and of the forces that influence the definition of its basic interests, which often see some disagreement over how they are defined, how their importance is evaluated, and how large the threats they face are. Among these interests and threats that have witnessed a difference in the assessment of their…
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). These methods were compared on the basis of the mean square error (MSE) and the mean absolute percentage error (MAPE), and simulation was adopted to select the best of the four methods. The best method was then applied to real data representing the consumption rate of two types of oils, a he…
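For reference, here is a minimal sketch of the classical GM(1,1) recursion with ordinary least-squares parameter estimation, one baseline route among the estimation methods the paper compares; the sample series is illustrative only.

```python
import numpy as np

def gm11_forecast(x0, horizon=3):
    """Fit GM(1,1) to series x0 and forecast `horizon` steps ahead."""
    n = len(x0)
    x1 = np.cumsum(x0)                           # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                # background (mean-generated) values
    B = np.column_stack([-z1, np.ones(n - 1)])   # design matrix of x0[k] = -a*z1[k] + b
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # developing coeff. a, grey input b
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    return np.diff(x1_hat, prepend=0.0)          # inverse AGO restores the series

x0 = np.array([2.67, 3.13, 3.25, 3.36, 3.56, 3.72])
print(gm11_forecast(x0, horizon=2))
```

The ACC, EXP, Mod EXP, and PSO variants replace the least-squares step above with different estimators of the pair (a, b), which is what the MSE/MAPE comparison evaluates.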
Everybody is connected with social media (Facebook, Twitter, LinkedIn, Instagram, etc.), which generate a large quantity of data that traditional applications are inadequate to process. Social media are regarded as an important platform for sharing information, opinions, and knowledge among many subscribers. These basic media attributes also tie big data to many issues, such as data collection, storage, transfer, updating, reviewing, posting, scanning, visualization, and data protection. To deal with all these problems, there is a need for an adequate system that not only prepares the data but also provides meaningful analysis to take advantage of difficult situations relevant to business, proper decision-making, health, social media, sc…
In this paper we study the Bayesian method using the modified exponential growth model, as this model is widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a function that depends on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which in turn leads to correlation between those observations; this problem is called autocorrelation, and the Bayesian method has been used to deal with it.
The goal of this study is to determine the effect of autocorrelation on estimation by the Bayesian method. F…
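To make the setting concrete, the sketch below generates data from the modified exponential growth model y_t = a + b·c^t with AR(1)-autocorrelated errors and fits it by nonlinear least squares; this is an illustrative stand-in, not the paper's Bayesian estimator, and all parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def mod_exp(t, a, b, c):
    """Modified exponential growth model y_t = a + b * c**t."""
    return a + b * c ** t

rng = np.random.default_rng(1)
t = np.arange(30, dtype=float)

# AR(1) errors e_t = rho * e_{t-1} + w_t, inducing the autocorrelation
# between observations that the abstract describes
rho, e = 0.7, np.zeros(30)
for i in range(1, 30):
    e[i] = rho * e[i - 1] + rng.normal(scale=0.2)

y = mod_exp(t, a=10.0, b=-8.0, c=0.9) + e

# Illustrative frequentist fit; a Bayesian treatment would instead place
# one of the three priors on (a, b, c) and summarize the posterior
params, _ = curve_fit(mod_exp, t, y, p0=[8.0, -5.0, 0.8])
print(dict(zip("abc", params)))
```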
The problem in designing a cam is the analysis of the mechanisms and dynamic forces that act on the family of parametric polynomials describing the motion curve. In the present method, two considerations were taken into account in optimizing the cam size: first, the high dynamic loading (such as impact and elastic stress-wave propagation) from the marine machine tool, which is transmitted by the roller follower to the cam surface, varies with time, and causes large contact loads; second, the kinematic features, including acceleration, velocity, boundary conditions, and the unsymmetrical curvature of the cam profile for the motion curve.
In the theoretical solution…
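As an illustration of a parametric polynomial motion curve of the kind the abstract mentions, the sketch below implements the classical 3-4-5 polynomial rise with its velocity and acceleration; this is a representative example only, since the paper's actual polynomial family is not given here.

```python
import numpy as np

def rise_345(theta, beta, h):
    """Follower motion for a 3-4-5 polynomial rise.

    theta: cam angle (rad), beta: rise duration (rad), h: total lift.
    Boundary conditions: zero velocity and zero acceleration at both ends.
    """
    x = theta / beta
    s = h * (10 * x**3 - 15 * x**4 + 6 * x**5)             # displacement
    v = h / beta * (30 * x**2 - 60 * x**3 + 30 * x**4)     # velocity ds/dtheta
    a = h / beta**2 * (60 * x - 180 * x**2 + 120 * x**3)   # acceleration
    return s, v, a

theta = np.linspace(0, np.pi / 2, 50)
s, v, a = rise_345(theta, beta=np.pi / 2, h=0.02)  # 20 mm lift over 90 degrees
```

The finite, smooth acceleration at the boundaries is what keeps contact loads on the roller follower bounded, which is why polynomial families like this are used under high dynamic loading.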
Hard water does not pose a threat to human health but may cause precipitation of soap or scale in boilers. These reactions are caused by high concentrations of Ca and Mg; in industry they are undesirable because of the higher fuel consumption they cause. Electromagnetic polarization water treatment is a method that can be used to increase the precipitation of Ca²⁺ and CO₃²⁻ ions in hard water to form CaCO₃, which decreases the water hardness. This research was conducted by changing the number of coil turns and the voltage of the system. A scanning electron microscope was used for imaging the produced crystals. Results of the investigation indicated that…
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
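To show the MapReduce pattern in miniature, below is a Hadoop-streaming-style mapper/reducer pair that averages values per EEG channel; the "channel,value" record format is a hypothetical stand-in for the paper's actual data layout.

```python
from itertools import groupby

def mapper(lines):
    """Emit (channel, value) pairs from raw 'channel,value' records."""
    for line in lines:
        channel, value = line.strip().split(",")
        yield channel, float(value)

def reducer(pairs):
    """Average the values per channel; Hadoop sorts by key between phases."""
    for channel, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        values = [v for _, v in group]
        yield channel, sum(values) / len(values)

records = ["Fp1,12.5", "Fp2,10.1", "Fp1,13.3", "Fp2,9.7"]
for channel, mean in reducer(mapper(records)):
    print(f"{channel}\t{mean:.2f}")
```

On a real cluster, Hadoop runs many mapper instances in parallel across data partitions and shuffles the sorted key groups to reducers, which is where the reported response-time reduction comes from.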