This research addresses the practical side through case studies of construction projects across the various Iraqi governorates. It includes a field survey to identify the impact of parametric costs on construction projects, to compare the findings with those reached during the analysis, and to assess their validity and accuracy, as well as personal interviews to establish the actual state of construction projects. After comparing and measuring field data from construction projects in both the public and private sectors, the results showed a correlation of 97.8% between the expected and actual cost change, which means the data can be adopted in the research study for integrating parametric costs into a predictive model for future study. Changes in the parametric costs of construction projects substantially impact their time, cost, and quality and are a major barrier to their execution, necessitating research, analysis, and the development of the most effective solutions. The study aims to identify the parametric cost accurately through iterative tests and continuous improvements by presenting literature describing the history and characteristics of parametric cost methodologies and identifying each methodology's limitations, strengths, and weaknesses, in order to promote a better understanding of their best practices and their use for managing project cost.
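As a rough illustration of how such agreement can be quantified, the sketch below computes a Pearson correlation between expected and actual cost changes; the project figures are hypothetical placeholders, not data from the study.

```python
# Minimal sketch (not the authors' code): correlating expected vs. actual cost change.
# The figures below are hypothetical placeholders, not data from the study.
import numpy as np

expected_change = np.array([4.2, 7.8, 3.1, 9.5, 6.0])   # % expected cost change per project (hypothetical)
actual_change   = np.array([4.5, 7.4, 3.3, 9.9, 5.8])   # % actual cost change per project (hypothetical)

# Pearson correlation coefficient between predicted and observed changes
r = np.corrcoef(expected_change, actual_change)[0, 1]
print(f"correlation = {r:.1%}")  # a value near 97.8% would match the agreement reported in the study
```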
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlight
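For positional similarity, one common check is the great-circle offset between an OSM feature and a reference counterpart; the sketch below assumes that setup, with hypothetical coordinates.

```python
# Minimal sketch, assuming positional accuracy is assessed by comparing OSM point
# coordinates against reference (surveyed) coordinates; the coordinates are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6_371_000.0  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Positional offset of one OSM feature from its reference counterpart (hypothetical coordinates)
offset = haversine_m(33.3152, 44.3661, 33.3153, 44.3663)
print(f"positional offset: {offset:.1f} m")
```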
Predicting permeability is a cornerstone of petroleum reservoir engineering, playing a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predict permeability in oil reservoirs, underscoring their growing importance in addressing traditional prediction challenges. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential. Neural networks, with their ability to uncover complicated patterns within large datasets, emerge as a powerful alternative. The Quanti-Elan model was used in this study to combine several well logs for estimating mineral volumes, porosity, and water saturation. This model goes be
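As a hedged illustration only (not the Quanti-Elan workflow or the paper's network), the sketch below trains a small neural network on synthetic data to map assumed log-derived inputs such as porosity, water saturation, and shale volume to permeability.

```python
# Illustrative sketch, not the study's model: a small neural network mapping
# assumed well-log-derived inputs [porosity, water saturation, shale volume]
# to permeability. All data below are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform([0.05, 0.1, 0.0], [0.35, 0.9, 0.5], size=(200, 3))  # synthetic [phi, Sw, Vsh]
y = 10 ** (8 * X[:, 0] - 2 * X[:, 1] - 3 * X[:, 2])                 # synthetic permeability (mD)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
model.fit(X, np.log10(y))                       # train on log-permeability for numerical stability
pred_md = 10 ** model.predict([[0.22, 0.35, 0.1]])
print(f"predicted permeability ~ {pred_md[0]:.1f} mD")
```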
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the
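The paper's algorithm is not reproduced here; the sketch below is a loose, assumed illustration of a colony-style scan in which artificial "ants" wander a device-connection graph and deposit pheromone on devices whose traffic deviates from the fleet average. Device names, features, and thresholds are hypothetical.

```python
# Hedged sketch of a swarm-style scan (not the paper's algorithm): ants random-walk
# the connection graph and deposit pheromone proportional to each visited device's
# traffic anomaly; high-pheromone devices are flagged. Everything below is synthetic.
import random
from statistics import mean, pstdev

traffic = {"node-A": 1.0, "node-B": 1.1, "node-C": 0.9, "node-D": 6.5, "node-E": 1.2}  # GB/h, synthetic
links = {"node-A": ["node-B", "node-C"], "node-B": ["node-A", "node-D"],
         "node-C": ["node-A", "node-E"], "node-D": ["node-B", "node-E"],
         "node-E": ["node-C", "node-D"]}

mu, sigma = mean(traffic.values()), pstdev(traffic.values())
pheromone = {d: 0.0 for d in traffic}

random.seed(1)
for _ in range(50):                                   # 50 ants
    node = random.choice(list(traffic))
    for _ in range(10):                               # each ant takes 10 steps
        z = abs(traffic[node] - mu) / (sigma or 1.0)  # anomaly score of the visited device
        pheromone[node] += z                          # deposit pheromone proportional to anomaly
        node = random.choice(links[node])

threshold = 2 * mean(pheromone.values())
suspects = [d for d, p in sorted(pheromone.items(), key=lambda kv: -kv[1]) if p > threshold]
print("suspected compromised devices:", suspects)
```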
This paper presents a novel idea: it investigates, for the first time, the rescue effect of the prey together with a fluctuation effect, proposing a modified predator-prey model that takes a non-autonomous form. An approximation method is then utilized to convert the non-autonomous model into an autonomous one, simplifying the mathematical analysis and the study of the dynamical behaviors. Some theoretical properties of the proposed autonomous model, such as boundedness, stability, and the Kolmogorov conditions, are studied. This paper's analytical results demonstrate that the dynamic behaviors are globally stable and that the rescue effect improves the likelihood of coexistence compared to when there is no rescue impact. Furthermore, numerical simul
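The exact system studied is not reproduced here; the sketch below assumes a Lotka-Volterra-type model with a constant prey rescue (immigration) term and illustrative parameters, simply to show how such coexistence behaviour can be simulated numerically.

```python
# A minimal numerical sketch, NOT the paper's exact system: a Lotka-Volterra-type
# predator-prey model with a constant prey "rescue" (immigration) term r0.
# Parameter values and initial conditions are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d, r0 = 1.0, 0.5, 0.5, 0.6, 0.2   # growth, predation, conversion, death, rescue rate

def rhs(t, y):
    x, p = y                                # x: prey density, p: predator density
    dx = a * x - b * x * p + r0             # rescue term r0 keeps prey from extinction
    dp = c * x * p - d * p
    return [dx, dp]

sol = solve_ivp(rhs, (0.0, 50.0), [0.5, 0.5], max_step=0.1)
x_end, p_end = sol.y[:, -1]
print(f"prey ~ {x_end:.3f}, predator ~ {p_end:.3f} at t = 50")
```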
The hydrodynamics of a co-current downflow bubble column has been investigated with an air–water system. A Perspex bubble column of 5 cm diameter and 1.5 m height is used as the test contactor, with nozzles of 7, 8, and 9 mm diameter for air-water distribution. The column is fitted with three electro-resistivity needle probes for bubble detection.
Experimental work is carried out with air flow rates from 0.09 to 0.45 m³/hr and liquid flow rates from 0.65 to 1.1 m³/hr in order to study the effects of superficial gas velocity, nozzle diameter, and liquid flow rate on the hydrodynamic characteristics, viz. gas hold-up, bubble diameter, and bubble velocity, using two measurement methods: direct height measurements for air-wa
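Using the column diameter and air flow range reported above, the superficial gas velocity follows directly from U_g = Q/A with A = πD²/4, as in the short sketch below; the calculation is standard and the values are taken from the text.

```python
# Small sketch: superficial gas velocity U_g = Q / A for the 5 cm column,
# evaluated at the two ends of the reported air flow range.
from math import pi

D = 0.05                      # column diameter, m
A = pi * D**2 / 4             # cross-sectional area, m^2

for Q_h in (0.09, 0.45):      # air flow rate range, m^3/hr
    Q = Q_h / 3600.0          # convert to m^3/s
    print(f"Q = {Q_h} m^3/hr -> U_g = {Q / A:.3f} m/s")
```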
Pulsed Laser Ablation in Liquid (PLAL) is one technique used to prepare nanoparticle materials. Silver oxide nanoparticles (AgO) were prepared using this technique: a silver target was submerged in ultra-pure water (UPW) at room temperature and then irradiated with a Q-switched Nd:YAG laser of 1064 nm wavelength and 6 ns pulse duration. This preparation method was used to study the effect of laser irradiation on the size of the synthesized nanoparticles by varying the laser pulse energy (1000 mJ, 500 mJ, and 100 mJ), with 500 pulses each time. The nanoparticles are characterized using XRD, SEM, AFM, and UV-Visible spectroscopy. All the structural peaks determined by the XRD
The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as session identification, user identification, and data cleansing.
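A minimal sketch of those sub-phases, under common assumptions (static-resource and bot filtering for cleansing, IP plus user agent for user identification, and a 30-minute inactivity timeout for session identification), is shown below with hypothetical log records.

```python
# Minimal preprocessing sketch under common assumptions; the log records are hypothetical.
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)
STATIC_EXT = (".png", ".jpg", ".css", ".js", ".gif", ".ico")

logs = [  # (ip, user_agent, url, timestamp) -- synthetic examples
    ("10.0.0.1", "Mozilla", "/index.html", "2024-05-01 10:00:00"),
    ("10.0.0.1", "Mozilla", "/logo.png",   "2024-05-01 10:00:01"),
    ("10.0.0.1", "Mozilla", "/products",   "2024-05-01 10:05:00"),
    ("10.0.0.1", "Mozilla", "/checkout",   "2024-05-01 11:00:00"),
    ("10.0.0.2", "Googlebot", "/index.html", "2024-05-01 10:02:00"),
]

# 1. Data cleansing: drop static resources and known crawlers
clean = [r for r in logs
         if not r[2].lower().endswith(STATIC_EXT) and "bot" not in r[1].lower()]

# 2. User identification (IP + user agent) and 3. session identification (inactivity timeout)
sessions = {}                                   # (user, session_no) -> list of urls
last_seen, session_no = {}, {}
for ip, agent, url, ts in sorted(clean, key=lambda r: r[3]):
    user = (ip, agent)
    t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
    if user not in last_seen or t - last_seen[user] > SESSION_TIMEOUT:
        session_no[user] = session_no.get(user, 0) + 1
    last_seen[user] = t
    sessions.setdefault((user, session_no[user]), []).append(url)

for key, urls in sessions.items():
    print(key, urls)
```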
The estimation of the regular regression model requires several assumptions to be satisfied, such as "linearity". One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold-point estimation. However, the MLE is not resistant to violations such as the presence of outliers or heavy-tailed error distributions. The main goal of t
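As an illustration of the threshold idea (not the paper's MLE or its robust alternative), the sketch below fits a two-phase linear regression on synthetic data by scanning candidate threshold points and keeping the one that minimises the total residual sum of squares.

```python
# Illustrative sketch (not the paper's estimator): two-phase linear regression fitted by
# a grid search over candidate threshold points, with ordinary least squares on each side.
# The data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 120)
true_tau = 6.0
y = np.where(x < true_tau, 1.0 + 0.5 * x, 4.0 - 0.7 * (x - true_tau)) + rng.normal(0, 0.2, x.size)

def sse(xs, ys):
    """Residual sum of squares of a simple least-squares line fit."""
    if xs.size < 2:
        return np.inf
    coef = np.polyfit(xs, ys, 1)
    return float(np.sum((ys - np.polyval(coef, xs)) ** 2))

candidates = x[5:-5]                              # keep a few points in each segment
best_tau = min(candidates, key=lambda t: sse(x[x < t], y[x < t]) + sse(x[x >= t], y[x >= t]))
print(f"estimated threshold point ~ {best_tau:.2f} (true value {true_tau})")
```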
Tourism plays an important role in Malaysia's economic development, as it can boost business opportunities in the surrounding economy. Applying data mining to tourism data to predict areas of business opportunity is therefore a good choice. Data mining is a process that takes data as input and produces knowledge as output. The number of people travelling in Asian countries has increased in recent years, and many entrepreneurs start their own businesses, but problems such as investing in the wrong business fields and poor service quality affect their business income. The objective of this paper is to use data mining technology to meet the business needs and customer needs of tourism enterprises and find the most effective
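As a hedged example of one possible data-mining step for such a study (not necessarily the paper's method), the sketch below clusters hypothetical tourist areas by visitor volume and average spend with k-means so that high-opportunity groups can be examined further.

```python
# Hedged sketch of one possible data-mining step (not the paper's method): k-means
# clustering of tourist areas by visitor volume and average spend. Figures are synthetic.
import numpy as np
from sklearn.cluster import KMeans

areas = ["Langkawi", "Penang", "Malacca", "Cameron", "Ipoh", "Kuching"]
X = np.array([[3.6, 410], [3.1, 380], [2.2, 250], [0.9, 190], [0.7, 160], [1.4, 210]])
# columns: annual visitors (millions), average spend per visitor (USD) -- synthetic figures

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for area, label in zip(areas, labels):
    print(f"{area}: cluster {label}")
```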