In this paper, the bowtie method was applied by a multidisciplinary team in the Federal Board of Supreme Audit (FBSA) to manage corruption risks threatening the Iraqi construction sector. Corruption in Iraq is a widespread phenomenon that threatens to degrade society and halt economic development, so it must be reduced through appropriate strategies. A total of eleven corruption risks were identified by the parties involved, analyzed using a probability-impact matrix, and ranked by priority. Bowtie analysis was then conducted on the four highest-scoring risks causing corruption in the planning stage. The number and effectiveness of the existing proactive measures …
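As a hedged illustration of the probability-impact ranking step the abstract describes, the sketch below scores and sorts a small risk register. The risk names and scores are invented for illustration; they are not the FBSA team's actual register.

```python
# Hypothetical risk register: (risk name, probability, impact), all values
# illustrative, not taken from the paper.
risks = [
    ("weak oversight of contract award", 0.8, 0.9),
    ("inflated cost estimates",          0.6, 0.7),
    ("nepotism in contractor selection", 0.7, 0.8),
    ("falsified progress reports",       0.4, 0.5),
]

def score(risk):
    """Risk score = probability x impact, as in a standard P-I matrix."""
    _, probability, impact = risk
    return probability * impact

# Rank risks from highest to lowest score; the top entries would be the
# candidates for a detailed bowtie analysis.
ranked = sorted(risks, key=score, reverse=True)
for name, p, i in ranked:
    print(f"{name}: {p * i:.2f}")
```

In a real assessment the probability and impact values would come from the team's qualitative scales (e.g. 1-5 bands) rather than raw fractions, but the ranking logic is the same.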
A novel design and implementation of a cognitive methodology for on-line auto-tuning of a robust PID controller in a real heating system is presented in this paper. The aim of the proposed work is to construct a cognitive control methodology that gives an optimal control signal to the heating system, achieving fast and precise search efficiency in finding the on-line optimal PID controller parameters and, with them, the optimal output temperature response. The cognitive methodology (CM) consists of three engines: a breeding engine based on the Routh-Hurwitz stability criterion, a search engine based on particle swarm optimization (PSO), and an aggregation knowledge engine based on a cultural algorithm (CA).
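To illustrate the role of the PSO search engine, the following minimal sketch tunes PID gains on a toy first-order heating plant. The plant model, cost function, and PSO settings are all assumptions made here for illustration; they do not reproduce the paper's cognitive methodology or its real heating system.

```python
import random

def simulate_cost(kp, ki, kd, setpoint=50.0, steps=400, dt=0.05,
                  gain=2.0, tau=4.0):
    """Integral-of-absolute-error cost of a PID loop on a toy first-order
    plant dT/dt = (-T + gain*u) / tau, integrated with Euler steps."""
    temp, integ, prev_err, cost = 20.0, 0.0, setpoint - 20.0, 0.0
    for _ in range(steps):
        err = setpoint - temp
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        u = max(0.0, min(u, 100.0))            # actuator saturation
        temp += dt * (-temp + gain * u) / tau  # Euler step of the plant
        prev_err = err
        cost += abs(err) * dt
    return cost

def pso_tune(n_particles=15, iters=30, seed=1):
    """Plain global-best PSO over (Kp, Ki, Kd); bounds are illustrative."""
    random.seed(seed)
    bounds = [(0.0, 10.0), (0.0, 5.0), (0.0, 2.0)]
    pos = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0, 0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [simulate_cost(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(3):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = max(lo, min(pos[i][d] + vel[i][d], hi))
            c = simulate_cost(*pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

best_gains, best_cost = pso_tune()
```

The paper's method additionally screens candidate gains through a Routh-Hurwitz stability check before evaluation and shares knowledge via a cultural algorithm; this sketch shows only the bare PSO search loop.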
The data preprocessing step is important in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used to preprocess web server logs are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as data cleansing, user identification, and session identification.
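The three sub-phases can be sketched as below. The log format, the IP-based user identification, and the 30-minute session timeout are common conventions assumed here for illustration, not details taken from the study.

```python
from datetime import datetime, timedelta

# Hypothetical (ip, timestamp, page) log entries for illustration.
raw_log = [
    ("10.0.0.1", "2023-05-01 10:00:02", "/index.html"),
    ("10.0.0.1", "2023-05-01 10:01:10", "/style.css"),     # asset: noise
    ("10.0.0.1", "2023-05-01 10:05:40", "/products.html"),
    ("10.0.0.2", "2023-05-01 10:06:00", "/index.html"),
    ("10.0.0.1", "2023-05-01 11:00:00", "/contact.html"),  # after long gap
]

def cleanse(entries):
    """Data cleansing: drop requests for static assets that carry no
    navigational meaning."""
    noise = (".css", ".js", ".png", ".gif", ".ico")
    return [e for e in entries if not e[2].endswith(noise)]

def sessionize(entries, timeout=timedelta(minutes=30)):
    """User identification (keyed by IP here) plus session identification:
    a gap longer than `timeout` starts a new session for that user."""
    sessions, last_seen = {}, {}
    for ip, ts, page in entries:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if ip not in last_seen or t - last_seen[ip] > timeout:
            sessions.setdefault(ip, []).append([])
        sessions[ip][-1].append(page)
        last_seen[ip] = t
    return sessions

sessions = sessionize(cleanse(raw_log))
```

Real deployments typically refine user identification with user-agent and referrer fields, since one IP can hide many users behind a proxy; the timeout heuristic above is the simplest of the session-identification methods such studies compare.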
This paper assesses the impact of changes and fluctuations in bank deposits on the money supply in Iraq. The research constructs an Error Correction Model (ECM) using monthly time series data from 2010 to 2015. The analysis begins with the Phillips-Perron unit root test to ascertain the stationarity of the time series and the Engle-Granger cointegration test to examine the existence of a long-term relationship. Nonparametric regression functions are then estimated using two methods: smoothing spline and M-smoothing. The results indicate that M-smoothing is the more effective approach, achieving the shortest adjustment period and the highest adjustment ratio for short-term disturbances, thereby facilitating a return to the long-run equilibrium.
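The ECM logic rests on the Engle-Granger two-step procedure: regress the series in levels to obtain the error-correction term (ECT), then regress the first differences on the lagged ECT, whose coefficient is the speed of adjustment back to equilibrium. The sketch below runs the two steps on synthetic deposit and money-supply series; the data, noise levels, and simple one-regressor second step are illustrative assumptions, not the paper's estimates.

```python
import random

def ols2(x, y):
    """Simple-regression intercept and slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Synthetic cointegrated monthly series (72 months, purely illustrative).
random.seed(0)
deposits = [100 + 0.5 * t + random.gauss(0, 0.3) for t in range(72)]
money = [20 + 2.0 * d + random.gauss(0, 1) for d in deposits]

# Step 1: long-run relation in levels; residuals form the ECT.
a, b = ols2(deposits, money)
ect = [m - (a + b * d) for d, m in zip(deposits, money)]

# Step 2: short-run dynamics. Regressing the differences of money on the
# lagged ECT alone reads off the adjustment speed gamma; a full ECM would
# also include the differences of deposits (and possibly more lags).
dy = [money[t] - money[t - 1] for t in range(1, 72)]
ect_lag = ect[:-1]
_, gamma = ols2(ect_lag, dy)
# gamma < 0 means deviations from the long-run relation are corrected,
# i.e. the system pulls back toward equilibrium.
```

The paper's contribution is to estimate the second-step regression function nonparametrically (smoothing spline vs. M-smoothing) rather than by the plain least-squares line used in this sketch.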
The purpose of this paper is to build a simulation model using HEC-RAS software to represent water movement in the main river of Basra City (southern Iraq), known as the Siraji-Khoura River. The main objective of the simulation is to detect areas where the water cycle is interrupted at some stations along the river stream, as this river has become an outlet for sewage disposal, causing pollution, weakening some sections of the river, and obstructing the water exchange between this river and the Shatt al-Arab River. Field survey data of the river and its banks were used to derive the grades and the longitudinal and cross sections of the river; these data included three-dimensional coordinates …
The research problem lies in the failure of existing models of the communication process to represent the interaction and networking that take place through social media. The research therefore sought to build a network model of communication based on the specific characteristics of social media platforms, in order to reach a generalization of how networking operates in cyberspace.
The researcher followed the analytical survey approach, first describing existing communication models in order to build a networked communication model that represents the flow of post-reactive communication. The model has therefore been named "Nebula - Sadeem," after the concept of post-space and cosmic g…
Big data of different types, such as texts and images, are rapidly generated from the internet and other applications. Dealing with such data using traditional methods is impractical, since it varies in size, type, and processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, because only meaningful information is analyzed and extracted. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, it discusses how the revolution of data analytics based on artificial intelligence algorithms might provide …
An integrated lithofacies and mineralogical assemblage was used to describe a depositional model and sequence stratigraphic framework for the Maastrichtian–Danian succession in the Western Desert of Iraq and eastern Jordan. Fifteen lithofacies types were grouped into three associations recognized on a distally steepened ramp characterized by a distinct increase in gradient and paleobathymetric deepening westward. The clay minerals are dominated by smectite and palygorskite, with trace amounts of kaolinite, sepiolite, illite, and chlorite, while quartz, calcite, dolomite, opal-CT (cristobalite–tridymite), and apatite are the main nonclay minerals. The wide dominance of smectite in the Western Phosphatic Basin of Iraq …
A database is an organized collection of data, arranged and distributed so that users can access the stored information easily and conveniently. In the era of big data, however, traditional methods of data analytics may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study, showing a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
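The MapReduce pattern the abstract relies on can be sketched as below: a mapper emits key-value pairs, a shuffle groups them by key, and a reducer aggregates each group. The toy "EEG-like" records and the per-channel averaging are illustrative assumptions; the paper's actual Hadoop job and EEG format are not reproduced here.

```python
from collections import defaultdict

# Hypothetical (channel, amplitude) samples standing in for EEG records.
records = [
    ("Fp1", 4.2), ("Fp2", 3.8), ("Fp1", 5.0),
    ("Cz", 1.1), ("Fp2", 4.0), ("Cz", 0.9),
]

def mapper(record):
    """Map phase: emit (key, value) pairs; here the key is the channel."""
    channel, value = record
    yield channel, value

def shuffle(pairs):
    """Shuffle phase: group all values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    """Reduce phase: aggregate each group; here, mean amplitude."""
    return key, sum(values) / len(values)

pairs = [kv for rec in records for kv in mapper(rec)]
result = dict(reducer(k, vs) for k, vs in shuffle(pairs).items())
```

On an actual Hadoop cluster the map and reduce functions run in parallel across nodes, with the framework handling the shuffle over the network, which is where the reported response-time reduction would come from.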