This research aims to distinguish reef environments from non-reef environments. The Oligocene–Miocene succession in western Iraq was selected as a case study, represented by the reefal limestone facies of the Anah Formation (Late Oligocene), deposited in reef to back-reef environments; the dolomitic limestone of the Euphrates Formation (Early Miocene), deposited in open-sea environments; and the gypsiferous marly limestone of the Fatha Formation (Middle Miocene), deposited in a lagoonal environment. The content of rare earth elements (REEs) (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Er, Ho, Tm, Yb, Lu, and Y) in the reef facies appears to be much lower than that in the non-reef facies. The open-sea facies have a low REE content because they represent a transitional phase. All investigated facies have ΣREE values lower than the Post-Archean Australian Shale (PAAS). The lagoonal facies shows an average ΣREE higher than the Global Standard of Carbonate Rocks (GSCR), whereas the reefal and open-sea facies contain lower values. The Y/Ho, Y/Dy, and Er/Nd ratios were used as distinctive indicators for facies diagnosis: the reefal facies have PAAS-normalized Y/Ho, Y/Dy, and Er/Nd values higher than 1. In contrast, the non-reef (lagoonal) facies have PAAS-normalized Y/Ho and Y/Dy values lower than 1 but Er/Nd higher than 1, while in the open-sea facies the PAAS-normalized Y/Ho and Y/Dy values are close to 1 but Er/Nd is higher than 1.
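The PAAS-normalized ratio criterion above can be sketched as follows. This is a generic illustration, not the study's workflow: the PAAS concentrations (ppm) are approximate values after Taylor & McLennan (1985), and the sample analysis is hypothetical, not measured data from the Anah Formation.

```python
# PAAS reference concentrations in ppm (approximate, Taylor & McLennan 1985)
PAAS = {"Y": 27.0, "Ho": 0.99, "Dy": 4.68, "Er": 2.85, "Nd": 33.9}

def paas_normalized_ratio(sample, num, den):
    """(num/den in the sample) divided by (num/den in PAAS)."""
    return (sample[num] / sample[den]) / (PAAS[num] / PAAS[den])

# Hypothetical reefal limestone analysis (ppm) -- illustrative only
sample = {"Y": 12.0, "Ho": 0.30, "Dy": 1.4, "Er": 0.95, "Nd": 8.0}

for num, den in [("Y", "Ho"), ("Y", "Dy"), ("Er", "Nd")]:
    r = paas_normalized_ratio(sample, num, den)
    flag = "reef-like (>1)" if r > 1 else "non-reef-like (<=1)"
    print(f"({num}/{den}) PAAS-normalized = {r:.2f}  {flag}")
```

With all three normalized ratios above 1, this hypothetical sample would plot in the reefal field under the criterion described in the abstract.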
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as …
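To make the entropy-discretization step concrete, here is a minimal sketch of information-gain-based binary discretization of a numeric attribute. It is a generic textbook version, not the paper's multi-resolution algorithm, and the data points are made up.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Return the cut point minimizing the weighted entropy of the two bins."""
    pairs = sorted(zip(values, labels))
    best_cut, best_h = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # cannot cut between equal attribute values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if h < best_h:
            best_h = h
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2
    return best_cut, best_h

values = [1.0, 1.2, 1.9, 3.5, 3.7, 4.0]   # numeric attribute (illustrative)
labels = ["a", "a", "a", "b", "b", "b"]   # class labels
cut, h = best_split(values, labels)
print(f"cut at {cut}, weighted entropy {h:.3f}")
```

Here the classes separate perfectly, so the best cut (midway between 1.9 and 3.5) drives the weighted entropy to zero; the recursive variant of this split is the core of common entropy-discretization schemes.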
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available, and the increasingly complex requirements of software systems whose functions must adapt to changing needs in order to gain users' trust, such an approach is needed within a continuous software engineering process. This need drives new challenges in the requirements engineering discipline. The problem addressed in this study was data discrepancies in the existing method, which hampered the needs-elicitation process, so that the developed software ultimately contained discrepancies and could not meet the need …
This paper compares the tree regression model with the negative binomial regression model. These models represent two types of statistical methods: a non-parametric method, tree regression, which aims to divide the data set into subgroups; and a parametric method, negative binomial regression, which is usually used with medical data, especially for large sample sizes. The methods were compared according to mean squared error (MSE), using simulation experiments with different sample …
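The MSE criterion used for the comparison can be sketched as below. The predictions are illustrative stand-ins for fitted values from the two models; no actual tree or negative binomial fitting is shown.

```python
def mse(y_true, y_pred):
    """Mean squared error between observed and fitted values."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y = [3, 0, 2, 5, 1, 4]                      # observed counts (made up)
pred_tree = [2.5, 0.5, 2.0, 4.5, 1.0, 4.0]  # hypothetical tree-regression fit
pred_nb = [2.0, 1.0, 2.0, 4.0, 2.0, 3.0]    # hypothetical negative binomial fit

print("tree MSE:", mse(y, pred_tree))
print("NB   MSE:", mse(y, pred_nb))
```

The model with the lower MSE over repeated simulated samples is preferred, which is the decision rule the abstract describes.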
Image compression is a suitable technique to reduce the storage space of an image, increase the available storage on a device, and speed up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law condition to distinguish uniform blocks (i.e., blocks of low, constant detail) from non-uniform blocks in the original image. All elements in the bitmap of each uniform block are then set to zero, after which a lossless method, the Run-Length method, is used to further compress the bits representing the bitmaps of these uniform blocks. Via this simple idea, the result is improved …
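The pipeline described above can be sketched per block as follows: standard AMBTC encoding, a Weber's-law-style uniformity test, bitmap zeroing for uniform blocks, and run-length coding of the bitmap. The Weber threshold (0.02) and the exact uniformity criterion are assumptions for illustration, not the paper's parameters.

```python
def ambtc_block(block):
    """AMBTC: encode a flat pixel list as (low_mean, high_mean, bitmap)."""
    mean = sum(block) / len(block)
    bitmap = [1 if p >= mean else 0 for p in block]
    highs = [p for p, b in zip(block, bitmap) if b]
    lows = [p for p, b in zip(block, bitmap) if not b]
    high = sum(highs) / len(highs) if highs else mean
    low = sum(lows) / len(lows) if lows else mean
    return low, high, bitmap

def is_uniform(block, weber=0.02):
    """Weber-style test: intensity spread small relative to the background."""
    mean = sum(block) / len(block)
    return mean == 0 or (max(block) - min(block)) / mean <= weber

def run_length_encode(bits):
    """Lossless run-length coding of a bitmap as (value, run) pairs."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((prev, count))
            count = 1
    runs.append((bits[-1], count))
    return runs

flat_block = [120, 121, 120, 121] * 4   # nearly constant: "uniform"
edge_block = [20] * 8 + [200] * 8       # strong edge: "non-uniform"

for blk in (flat_block, edge_block):
    low, high, bitmap = ambtc_block(blk)
    if is_uniform(blk):
        bitmap = [0] * len(bitmap)      # zero the bitmap of uniform blocks
    print(is_uniform(blk), run_length_encode(bitmap))
```

A zeroed 16-pixel bitmap collapses to the single run (0, 16), which is where the extra compression for uniform blocks comes from.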
A polyaniline organic semiconductor polymer was prepared by oxidative polymerization, adding hydrochloric acid at a concentration of 0.1 M and potassium persulfate at a concentration of 0.2 M to 0.1 M aniline at room temperature. The polymer was deposited on a glass substrate. Its structural and optical properties were studied through UV-VIS, IR, and XRD measurements, and the films were operated as sensors for H2SO4 and HCl acid vapors.
Gypseous soils are spread over several regions of the world, including Iraq, where they cover more than 28.6% [1] of the country's surface area. These soils, with their high gypsum content, cause various problems in construction and strategic projects. As water flows through the soil mass, the permeability and chemical composition of these soils vary over time due to the solubility and leaching of gypsum. In this study, a soil with 36% gypsum content was taken from a location about 100 km (62 mi) southwest of Baghdad, sampled from a depth of 0.5–1 m below the natural ground surface, and mixed with 3%, 6%, and 9% of copolymer and styrene-butadiene rubber to improve the …
Adsorption techniques are widely used to remove certain classes of pollutants from wastewater, and phenolic compounds are one such problematic group. Na-Y zeolite was synthesized from locally available Iraqi kaolin clay. The prepared zeolite was characterized by XRD and by surface-area measurement using N2 adsorption. Both the synthetic Na-Y zeolite and the kaolin clay were tested for adsorption of 4-nitrophenol in batch-mode experiments. Maximum removal efficiencies of 90% and 80% were obtained using the prepared zeolite and the kaolin clay, respectively. Kinetics and equilibrium adsorption isotherms were investigated; these investigations showed that both the Langmuir and Freundlich isotherms fit the experimental data quite well. On the …
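Fitting the two isotherms mentioned above is conventionally done through their linearized forms, as sketched below. The equilibrium data (Ce in mg/L, qe in mg/g) are made-up illustrative points, not the paper's measurements.

```python
from math import exp, log

def linfit(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

Ce = [2.0, 5.0, 10.0, 20.0, 40.0]   # equilibrium concentration (mg/L)
qe = [8.0, 15.0, 22.0, 28.0, 32.0]  # amount adsorbed (mg/g)

# Langmuir linearization: Ce/qe = (1/qm) * Ce + 1/(KL * qm)
slope, intercept = linfit(Ce, [c / q for c, q in zip(Ce, qe)])
qm = 1 / slope           # monolayer capacity (mg/g)
KL = slope / intercept   # Langmuir constant (L/mg)
print(f"Langmuir: qm = {qm:.1f} mg/g, KL = {KL:.3f} L/mg")

# Freundlich linearization: ln(qe) = ln(KF) + (1/n) * ln(Ce)
slope_f, intercept_f = linfit([log(c) for c in Ce], [log(q) for q in qe])
print(f"Freundlich: KF = {exp(intercept_f):.2f}, 1/n = {slope_f:.2f}")
```

Goodness of fit for each linearization (e.g., the correlation coefficient of the straight line) is what supports statements like "both isotherms fit the data quite well".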
The assessment of data quality from different sources is a key challenge in supporting effective geospatial data integration and promoting collaboration in mapping projects. This paper presents a methodology for assessing positional and shape quality for authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and for Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on measuring discrepancies among the datasets, addressing positional accuracy and shape fidelity using standard procedures as well as directional statistics. Line-feature comparison has been undertaken …
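One standard directional-statistics summary of the kind the abstract refers to is the circular mean and mean resultant length of azimuth discrepancies between matched line features. The sketch below uses hypothetical per-feature azimuth differences; it illustrates the statistic, not the paper's full procedure.

```python
from math import atan2, cos, degrees, radians, sin, sqrt

def circular_mean_and_R(angles_deg):
    """Mean direction (degrees) and mean resultant length R in [0, 1]."""
    s = sum(sin(radians(a)) for a in angles_deg)
    c = sum(cos(radians(a)) for a in angles_deg)
    n = len(angles_deg)
    mean = degrees(atan2(s, c)) % 360
    R = sqrt(s * s + c * c) / n
    return mean, R

# Hypothetical per-feature azimuth differences (e.g., OSM minus OS), degrees
diffs = [3.0, 355.0, 2.0, 358.0, 5.0]
mean, R = circular_mean_and_R(diffs)
print(f"mean directional offset = {mean:.1f} deg, concentration R = {R:.3f}")
```

An R near 1 indicates that the discrepancies cluster around one direction (a systematic offset between datasets), whereas an R near 0 indicates directionally random disagreement; a naive arithmetic mean of angles would mishandle values straddling 0/360 degrees, which is why the circular form is used.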