The idea of carrying out research on incomplete data arose from the circumstances of our dear country and the horrors of war, which led to the loss of much important data across economic, environmental, health, and scientific life. The reasons for the missingness differ: some are beyond the control of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods through simulation. The variables considered were child health and the variables affecting children's health: breastfeeding and maternal health. The maternal health variable contained missing values and was processed in MATLAB R2015a using Principal Component Analysis and probabilistic Principal Component Analysis, after which the methods were compared using the root mean square error. The best method for processing the missing values was the PCA method.
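The PCA-based imputation the abstract compares can be sketched as an iterative low-rank reconstruction: fill the missing entries with column means, reconstruct the matrix at a chosen rank, and replace only the missing entries with the reconstruction until convergence. The data, rank, and iteration count below are illustrative assumptions, not the study's values.

```python
import numpy as np

def pca_impute(X, rank=1, n_iter=50):
    """Iteratively fill NaN entries using a rank-k PCA reconstruction."""
    X = X.copy()
    mask = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[mask] = np.take(col_means, np.where(mask)[1])  # start from column means
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        X_hat = mu + (U[:, :rank] * s[:rank]) @ Vt[:rank]  # rank-k reconstruction
        X[mask] = X_hat[mask]  # only the missing entries are updated
    return X

# Simulated data with a strong one-dimensional structure and 10% missingness
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
truth = t @ np.array([[1.0, 0.8, 0.5]]) + 0.05 * rng.normal(size=(200, 3))
X = truth.copy()
holes = rng.random(X.shape) < 0.1
X[holes] = np.nan

filled = pca_impute(X, rank=1)
# Root mean square error on the held-out (missing) entries, as in the study
rmse = np.sqrt(np.mean((filled[holes] - truth[holes]) ** 2))
```

The RMSE on the imputed entries is computed only over the positions that were deleted, mirroring how the abstract's simulation comparison would score each method.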
NeighShrink is an efficient image-denoising algorithm based on the discrete wavelet transform (DWT). Its disadvantage is that it uses a suboptimal universal threshold and an identical neighbouring window size in all wavelet subbands. Dengwen and Wengang proposed an improved method that can determine an optimal threshold and neighbouring window size for every subband using Stein's unbiased risk estimate (SURE). Its denoising performance is considerably superior to NeighShrink, and it also outperforms SURE-LET, an up-to-date denoising algorithm based on the SURE. In this paper, different wavelet transform families are used with this improved method; the results show that the Haar wavelet has the lowest performance among
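The neighbouring shrinkage rule at the heart of NeighShrink can be sketched on a single subband: each coefficient is scaled by beta = max(0, 1 - lambda^2 / S^2), where S^2 sums the squared coefficients in a small window around it. The toy subband, noise level, and window size below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def neighshrink(coeffs, lam, win=3):
    """Scale each coefficient by max(0, 1 - lam^2 / S^2), where S^2 is the
    sum of squared coefficients in a win x win neighbourhood around it."""
    pad = win // 2
    padded = np.pad(coeffs, pad, mode="edge")
    out = np.zeros_like(coeffs, dtype=float)
    rows, cols = coeffs.shape
    for i in range(rows):
        for j in range(cols):
            window = padded[i:i + win, j:j + win]
            s2 = np.sum(window ** 2)
            beta = max(0.0, 1.0 - lam ** 2 / s2) if s2 > 0 else 0.0
            out[i, j] = beta * coeffs[i, j]
    return out

# Toy subband: small noise-like values plus one strong feature coefficient
subband = np.full((8, 8), 0.1)
subband[4, 4] = 5.0
sigma, n = 0.1, subband.size
lam = sigma * np.sqrt(2 * np.log(n))  # universal threshold (the suboptimal choice)
shrunk = neighshrink(subband, lam)
```

The strong coefficient is nearly preserved because its neighbourhood energy dominates lambda^2, while isolated small coefficients are pushed towards zero; the improvement discussed in the abstract replaces the universal lambda and fixed window with SURE-optimal values per subband.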
A new microphotometer was constructed in our laboratory for the determination of molybdenum(VI) through its catalytic effect on the reaction of hydrogen peroxide with potassium iodide in an acidic medium (0.01 mM H2SO4). A linearity of 97.3% was obtained for the range 5-100 ppm. The repeatability of the results was better than 0.8%, and 0.5 ppm was obtained as the L.U. The method was applied to the determination of molybdenum(VI) in a medicinal sample (Centrum), and its results compared well with those of the conventional method.
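The linearity figure reported above comes from a calibration line fitted over the 5-100 ppm range. A minimal least-squares calibration sketch is shown below; the concentration-response pairs are purely illustrative, not the paper's measurements.

```python
# Hypothetical calibration points (concentration in ppm, instrument response);
# the values are illustrative only, not data from the study.
conc = [5, 20, 40, 60, 80, 100]
resp = [0.051, 0.198, 0.402, 0.595, 0.810, 0.988]

n = len(conc)
mx = sum(conc) / n
my = sum(resp) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, resp)) \
    / sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

# Coefficient of determination R^2 quantifies the calibration linearity
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, resp))
ss_tot = sum((y - my) ** 2 for y in resp)
r_squared = 1 - ss_res / ss_tot

# Predict the concentration of an unknown sample from its measured response
unknown = (0.500 - intercept) / slope
```

Inverting the fitted line, as in the last step, is how an unknown sample such as the medicinal tablet would be quantified from its photometric response.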
The achievements of the art we know today are questioned on grounds that differ from what art knew before, including the dramatic artistic transformations that came to be called modern art.
In view of the enormity of such a topic, its ramifications, and its complexity, it was necessary to confine the subject to the origins of the motives behind the transformations of its first pioneers, and then to examine what resulted from them in the data of vision in composition and drawing exclusively; through this exploration, we came to recognize the vitality of change from the art of its time.
And by examining the prevailing contemporary philosophical concepts, their new standards, and their epistemological role in contemporary life, since they includ
Insurance activity has become one of the vital foundations on which the international economy depends; its presence has helped develop economic resources, among which the human resource is the most important. Insurance companies play the biggest role in protecting this resource and minimizing the impact of the dangers that threaten it. Humans have worked hard to rid themselves of dangers and their harm, and have devised many ways to prevent them; risk management is one of these human creations, intended to build a society with fewer negative risk impacts.
On this basis, th
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
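When the right-hand side b_i is random, a standard chance-constrained treatment requires P(a_i x <= b_i) >= alpha and replaces the random b_i by its (1 - alpha)-quantile to obtain a deterministic equivalent. The one-variable model and normal parameters below are an illustrative assumption, not the research's specific problem.

```python
from statistics import NormalDist

# Minimal chance-constrained example: maximize x subject to
# P(2x <= b) >= alpha, where b ~ N(mu, sigma^2).
mu, sigma, alpha = 100.0, 10.0, 0.95

# Deterministic equivalent: 2x <= mu + sigma * Phi^{-1}(1 - alpha),
# i.e. the random right-hand side is replaced by its (1 - alpha)-quantile.
b_eff = mu + sigma * NormalDist().inv_cdf(1 - alpha)
x_star = b_eff / 2.0

# Achieved probability that the original random constraint holds at x_star
prob = 1 - NormalDist(mu, sigma).cdf(2 * x_star)
```

Because alpha > 0.5, the effective right-hand side b_eff lies below the mean, so the optimal x is more conservative than the solution of the deterministic problem with b = mu.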
The study aims to analyze the content of computer textbooks for the preparatory stage according to logical thinking. The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of analysis. A content-analysis tool designed around the mental processes employed during logical thinking was used to obtain the study results. The findings revealed that logical thinking skills constituted 52% of the fourth preparatory textbook and 47% of the fifth preparatory textbook.
The large number of failures in an electrical power plant leads to sudden stoppages of work. In some cases, the reserve materials necessary for maintenance are not available, which interrupts power generation in the plant unit. The present study deals with the determination of the availability of the generator in unit 5 of the Al-Dourra electric power plant. In order to evaluate this generator's availability performance, a wide range of studies was conducted to gather accurate information at the level of detail considered suitable to achieve the aim of the availability analysis. The Weibull distribution is used to perform the reliability analysis via Minitab 17, and Artificial Neural Networks (ANNs) by approaching o
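A Weibull reliability analysis of the kind the study performs in Minitab rests on a few closed-form quantities. The sketch below uses illustrative shape and scale parameters (beta, eta), not the values fitted to the Al-Dourra generator data.

```python
import math

# Illustrative Weibull parameters: beta is the shape, eta the scale (hours).
# These are assumptions for the sketch, NOT the study's fitted values.
beta, eta = 1.5, 1000.0

def reliability(t):
    """R(t) = exp(-(t/eta)^beta): probability the unit survives past time t."""
    return math.exp(-((t / eta) ** beta))

def hazard(t):
    """h(t) = (beta/eta) * (t/eta)^(beta-1): instantaneous failure rate."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# Mean time between failures: MTBF = eta * Gamma(1 + 1/beta)
mtbf = eta * math.gamma(1 + 1 / beta)
```

With beta > 1 the hazard rate increases over time, the wear-out regime that typically motivates preventive maintenance planning for generating units.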
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide resources on demand. Cloud computing has changed the way organizations manage resources owing to its robustness, low cost, and pervasive nature. Data security is usually realized using methods such as encryption. However, data privacy is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system, share it with others, and cause system information leakage. Security policies are also considered to be int
Secure storage of confidential medical information is critical for healthcare organizations seeking to protect patient privacy and comply with regulatory requirements. This paper presents a new scheme for the secure storage of medical data using Chaskey cryptography and blockchain technology. The system uses Chaskey encryption to ensure the integrity and confidentiality of medical data, and blockchain technology to provide a scalable and decentralized storage solution. The system also uses Bflow segmentation and vertical segmentation technologies to enhance scalability and manage the stored data. In addition, the system uses smart contracts to enforce access-control policies and other security measures. The description of the system detailing and p
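The blockchain side of such a scheme rests on chaining each block's hash to its predecessor so that any tampering with a stored record invalidates every later block. The sketch below illustrates that integrity chain only; SHA-256 stands in for the scheme's primitive, since Chaskey (a lightweight MAC) is not in the Python standard library, and the record layout is hypothetical.

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash (SHA-256 here
    is a stand-in for the scheme's primitive, not Chaskey itself)."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # genesis block links to an all-zero hash
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    """Recompute every hash; any edited record or broken link fails."""
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or blk["hash"] != block_hash(blk["record"], prev):
            return False
        prev = blk["hash"]
    return True

chain = build_chain(["patient A: scan report", "patient B: lab results"])
```

Verification recomputes each hash from the stored record and the previous link, which is why altering one medical record silently is impossible without recomputing the entire suffix of the chain.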
Spatial data analysis is performed in order to remove skewness, a measure of the asymmetry of the probability distribution. It also improves the normality, a key statistical concept derived from the bell-shaped normal distribution, of properties such as porosity, permeability, and saturation, which can be visualized using histograms. Three steps of spatial analysis are involved here: exploratory data analysis, variogram analysis, and finally distribution of the properties using geostatistical algorithms. The Mishrif Formation (unit MB1) in the Nasiriya Oil Field was chosen in order to analyze and model the data for the first eight wells. The field is an anticline structure with a northwest-south
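The skewness removal described in the exploratory step can be sketched directly: compute the sample skewness (third standardized moment) and apply a log transform, which pulls a right-skewed property such as permeability towards normality. The lognormal draws below are synthetic stand-ins, not data from the Mishrif Formation wells.

```python
import math
import random

def skewness(xs):
    """Sample skewness: the third standardized moment of the data."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

# Synthetic right-skewed "permeability-like" values (lognormal draws);
# illustrative only, not the field data.
random.seed(1)
perm = [math.exp(random.gauss(2.0, 1.0)) for _ in range(2000)]

raw_skew = skewness(perm)                         # strongly positive
log_skew = skewness([math.log(x) for x in perm])  # near zero after log transform
```

A skewness near zero after transformation is exactly what the histogram check in the exploratory data analysis step is meant to confirm before variogram modelling.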