The specific activities of the natural radionuclides U-238, Th-232, and K-40 in 14 soil samples, collected from different sites at Al-Mustansiriyah University at two depths (topsoil "surface" and 20 cm depth), were investigated using a gamma-ray spectrometer with a 3"x3" NaI(Tl) scintillation detector.
Analysis of the energy spectra of the soil samples shows specific activities ranging from 16.08 to 51.11 Bq/kg for U-238, from 14.79 to 52.29 Bq/kg for Th-232, and from 191.08 to 377.64 Bq/kg for K-40, with average values of 29.37, 34.14, and 289.62 Bq/kg for U-238, Th-232, and K-40, respectively. The radiation hazard parameters of the natural radionuclides, namely the radium equivalent activity (Raeq), the gamma absorbed dose rate (Dγ), the annual effective dose rate (Eγ), and the internal and external hazard indices (Hin, Hex), were also calculated. The maximum specific activities and hazard parameters were found in the soil sample gathered from the Literature College center. All calculated specific activity values were within the range of worldwide averages and below the global permissible limits, indicating that the university's soils are safe for both students and staff.
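The hazard parameters named above follow standard closed-form expressions with UNSCEAR-style coefficients. A minimal Python sketch, using the average activities reported above (the coefficients are the commonly cited ones, not necessarily the exact values used in this study), could look like:

```python
# Average specific activities from the abstract, all in Bq/kg
A_U, A_Th, A_K = 29.37, 34.14, 289.62

# Radium equivalent activity (Bq/kg); commonly compared against 370 Bq/kg
Ra_eq = A_U + 1.43 * A_Th + 0.077 * A_K

# Absorbed gamma dose rate in air (nGy/h)
D = 0.462 * A_U + 0.604 * A_Th + 0.0417 * A_K

# Annual effective dose (mSv/y): 8760 h/y, 0.2 outdoor occupancy factor,
# 0.7 Sv/Gy dose conversion factor
E = D * 8760 * 0.2 * 0.7 * 1e-6

# External and internal hazard indices (should both be below 1)
H_ex = A_U / 370 + A_Th / 259 + A_K / 4810
H_in = A_U / 185 + A_Th / 259 + A_K / 4810

print(round(Ra_eq, 2), round(D, 2), round(H_ex, 3), round(H_in, 3))
```

With the reported averages, Raeq and both hazard indices come out well below the 370 Bq/kg and unity limits, consistent with the abstract's conclusion.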
Diabetes is one of the fastest-growing chronic diseases, affecting millions of people around the world. Diagnosis of diabetes, its prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, the KNN, and the Random Forest. We conducted two experiments: the first experiment used all 12 features of the dataset, where the Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five att
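A hedged sketch of the first experiment's classifier comparison, substituting synthetic data for the non-public patient records; the scikit-learn defaults below are illustrative assumptions, not the authors' exact configuration:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Stand-in for 1000 patient records with 12 features (synthetic data)
X, y = make_classification(n_samples=1000, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

classifiers = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "KNN": KNeighborsClassifier(),
    "RandomForest": RandomForestClassifier(random_state=0),
}
# Fit each classifier and score it on the held-out split
scores = {name: accuracy_score(y_te, clf.fit(X_tr, y_tr).predict(X_te))
          for name, clf in classifiers.items()}
print(scores)
```

The same loop can be rerun on a reduced feature subset to reproduce the second experiment's setup.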
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface. Secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
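The two observations above (a shadow is a local drop in incident light, hence in reflected light) suggest a minimal thresholding sketch; the threshold factor `k` and the multiplicative relighting below are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def shadow_mask(gray, k=0.5):
    # Flag pixels far below the global mean intensity as shadow
    # candidates (a shadow is a local decrease in incident light).
    return gray < k * gray.mean()

def remove_shadow(gray, mask):
    # Relight shadow pixels by scaling them up to the mean level of
    # the lit region (a simple multiplicative correction).
    out = gray.astype(float).copy()
    out[mask] *= out[~mask].mean() / out[mask].mean()
    return np.clip(out, 0.0, 255.0)

# Toy image: a lit background with a darker "shadow" patch
img = np.full((10, 10), 200.0)
img[2:6, 2:7] = 50.0            # 4x5 shadow region
mask = shadow_mask(img)
restored = remove_shadow(img, mask)
```

Real shadow segmentation would work on color channels and local neighborhoods, but the structure of detect-then-correct is the same.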
Eye detection is used in many applications, such as pattern recognition, biometrics, surveillance systems, and many others. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, any observed geometric shape is perceptually "meaningful" if its repetition number is very small in an image with random distribution. To achieve this goal, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera
The penalized least squares method is a popular approach to high-dimensional data, where the number of explanatory variables is larger than the sample size. Its desirable properties include high prediction accuracy and simultaneous estimation and variable selection. The penalized least squares method yields a sparse model, that is, a model with few variables that can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator and
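As an illustration of the robustness issue, the sketch below contrasts an ordinary penalized estimator (Lasso) with a robust alternative built on the Huber loss; the synthetic data, penalty values, and the choice of `HuberRegressor` are demonstration assumptions, not the paper's estimator:

```python
import numpy as np
from sklearn.linear_model import Lasso, HuberRegressor

rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]          # sparse true coefficients
y = X @ beta + 0.1 * rng.normal(size=n)
y[:5] += 20.0                        # gross outliers in the response

# Ordinary penalized least squares: sensitive to the outliers
lasso = Lasso(alpha=0.1).fit(X, y)
# Robust loss (Huber) with a penalty: downweights the outliers
huber = HuberRegressor(alpha=0.01).fit(X, y)

print(np.round(lasso.coef_[:3], 2), np.round(huber.coef_[:3], 2))
```

The robust fit typically recovers the sparse coefficients much more accurately when outlying observations are present.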
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
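To make the wavelet-compression idea concrete, here is a self-contained one-level Haar DWT sketch (not the paper's MCT/GHM construction) in which small detail coefficients are zeroed to achieve crude compression:

```python
import numpy as np

def haar_dwt(x):
    # One-level Haar DWT: approximation (a) and detail (d) coefficients
    pairs = x.reshape(-1, 2)
    a = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    d = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    # Inverse transform: interleave reconstructed sample pairs
    out = np.empty(2 * a.size)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

# Toy "speech" signal: a 5 Hz sine sampled at 256 points
t = np.linspace(0.0, 1.0, 256, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t)

a, d = haar_dwt(signal)
d[np.abs(d) < 0.05] = 0.0        # zero small details = crude compression
recon = haar_idwt(a, d)
```

Zeroing (or coarsely quantizing) the small detail coefficients is what yields the storage savings, at the cost of a small, controllable reconstruction error.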
The Taylor series is defined by the f and g series. The solution to the satellite's equation of motion is expanded to generate a Taylor series through the coefficients f and g. In this study, the orbit equation in a perifocal system is solved using the Taylor series as a function of time. A MATLAB program was designed to apply the results to a geocentric satellite in low orbit (height from perigee hp = 622 km). The input parameters were the initial distance from perigee, the initial time, eccentricity, true anomaly, position, and velocity. The output parameters were the final distance from perigee and the final time. The results of radial distance versus time were plotted for dissimilar times in
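The leading terms of the Lagrange f and g series can be sketched as follows; this toy propagator keeps only the first term of each series, so it is valid only for small time steps, unlike the full expansion described above:

```python
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def fg_propagate(r0, v0, dt):
    # Leading terms of the Lagrange series:
    #   f ~ 1 - mu*dt^2 / (2*r0^3),   g ~ dt - mu*dt^3 / (6*r0^3)
    # so that r(t0 + dt) ~ f*r0 + g*v0.
    r0n = np.linalg.norm(r0)
    u = MU / r0n**3
    f = 1.0 - 0.5 * u * dt**2
    g = dt - u * dt**3 / 6.0
    return f * r0 + g * v0

# Toy check: circular LEO at 7000 km radius, propagated 10 s
r0 = np.array([7000.0, 0.0, 0.0])
v0 = np.array([0.0, np.sqrt(MU / 7000.0), 0.0])  # circular speed
r = fg_propagate(r0, v0, 10.0)
```

For a circular orbit the radial distance should stay essentially constant over a short step, which makes a convenient sanity check for the truncated series.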
Accurate prediction of river water quality parameters is essential for environmental protection and sustainable agricultural resource management. This study presents a novel framework for estimating potential salinity in river water in arid and semi-arid regions by integrating a kernel extreme learning machine (KELM) with a boosted salp swarm algorithm based on differential evolution (KELM-BSSADE). A dataset of 336 samples, including bicarbonate, calcium, pH, total dissolved solids, and sodium adsorption ratio, was collected from the Idenak station in Iran and used for the modelling. Results demonstrated that KELM-BSSADE outperformed models such as deep random vector funct
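The KELM core (leaving aside the BSSADE hyperparameter optimizer) reduces to a closed-form kernel ridge-style solve; a minimal regression sketch, with illustrative `C` and `gamma` values standing in for the optimized ones, might look like:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Pairwise squared distances, then the Gaussian (RBF) kernel
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KELM:
    """Kernel extreme learning machine (regression):
    beta = (K + I/C)^(-1) y, prediction = K(x, X_train) @ beta."""
    def __init__(self, C=1e4, gamma=10.0):
        self.C, self.gamma = C, gamma
    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.beta = np.linalg.solve(K + np.eye(len(X)) / self.C, y)
        return self
    def predict(self, Xq):
        return rbf_kernel(Xq, self.X, self.gamma) @ self.beta

# Toy regression: learn a sine curve in place of the salinity data
Xtr = np.linspace(0.0, 1.0, 30)[:, None]
ytr = np.sin(2 * np.pi * Xtr[:, 0])
model = KELM().fit(Xtr, ytr)
pred = model.predict(Xtr)
```

In the paper's framework, a metaheuristic such as BSSADE would search over `C` and `gamma` instead of fixing them by hand.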
The research aims to examine banking stress tests, one of the modern and important tools in managing banking risks, by applying the equations of that tool to the sample. The banking sector is considered one of the sectors most vulnerable to sudden and rapid changes in an unstable economic environment. Therefore, it is necessary to establish a dedicated risk management section to reduce the risks of the banking business that negatively affect its performance. The research concluded that there is a direct relationship between stress tests and risk management, as stress tests are an essential tool in risk management. They are also considered a unified approach to managing bank risks that helps the bank to
The two most popular well-known count regression models are the Poisson and negative binomial regression models. Poisson regression is a generalized linear model form of regression analysis used to model count data and contingency tables. Poisson regression assumes that the response variable Y has a Poisson distribution and that the logarithm of its expected value can be modeled by a linear combination of unknown parameters. Negative binomial regression is similar to regular multiple regression except that the dependent variable (Y) is an observed count that follows the negative binomial distribution. This research studies some factors affecting divorce using Poisson and negative binomial regression models. The factors are unemplo
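A Poisson regression with log link can be fitted by Newton-Raphson in a few lines; this sketch uses synthetic count data, not the divorce dataset studied in the paper:

```python
import numpy as np

def poisson_regression(X, y, iters=25):
    # Poisson GLM with log link, E[y] = exp(X @ beta),
    # fitted by Newton-Raphson on the log-likelihood.
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)             # current mean estimates
        grad = X.T @ (y - mu)             # score vector
        hess = X.T @ (X * mu[:, None])    # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one factor
true_beta = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ true_beta))                 # simulated counts
beta_hat = poisson_regression(X, y)
```

A negative binomial fit adds a dispersion parameter to handle overdispersed counts, but the same iterative scheme applies.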