Delays occur commonly in construction projects, and assessing their impact is sometimes a contentious issue. Several delay analysis methods are available, but no single method can be used universally in all situations. Selecting the proper analysis method depends on a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, which is recognized as one of the most credible methods and is one of the few techniques more likely than others to be accepted by courts. A simple case study is implemented to demonstrate the accuracy and usefulness of the proposed delay analysis model. The results of the study indicate that the outcomes of delay analyses are often not predictable and that each method may yield different results. The study also revealed that, depending on the time and resources available and the accessibility of project control documentation, one method may be more practical or cost-effective than another.
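The core idea of the daily windows method can be sketched in a few lines: the project timeline is examined one day at a time, and each day of critical-path delay is attributed to whichever party's delay event is active on that day. The event record, party names, and attribution rules below are hypothetical simplifications, not the paper's implementation.

```python
# Hedged sketch of daily windows delay attribution. Each calendar day is
# inspected in isolation; a day on which both parties have an active
# critical-path delay is counted as concurrent.

def daily_windows(delay_days):
    """delay_days: dict mapping day number -> set of responsible parties
    whose delays affect the critical path on that day."""
    attribution = {"owner": 0, "contractor": 0, "concurrent": 0}
    for day, parties in sorted(delay_days.items()):
        if parties == {"owner"}:
            attribution["owner"] += 1
        elif parties == {"contractor"}:
            attribution["contractor"] += 1
        elif parties:  # both parties caused delay on the same day
            attribution["concurrent"] += 1
    return attribution

# Hypothetical six-day record of critical-path delays
events = {1: {"owner"}, 2: {"owner"}, 3: {"contractor"},
          4: {"owner", "contractor"}, 5: set(), 6: {"contractor"}}
print(daily_windows(events))  # {'owner': 2, 'contractor': 2, 'concurrent': 1}
```

The day-by-day granularity is what distinguishes this method from whole-window techniques: responsibility is resolved before later events can mask earlier ones.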
To obtain approximate solutions to Riccati matrix differential equations, a new variational iteration approach is proposed, intended to improve the accuracy and increase the convergence rate of the approximate solutions toward the exact solution. This technique was found to give very accurate results in a small number of iterations. In this paper, the modified approaches are derived and used to produce improved solutions, and the convergence of the derived sequence of approximate solutions to the exact solution is stated and proved. Two examples are also solved, which shows the reliability and applicability of the proposed approach.
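As an illustration of the underlying technique (the basic variational iteration method, not the paper's modified scheme), consider the scalar Riccati equation u' = 1 + u², u(0) = 0, whose exact solution is tan(t). Successive approximations come from the correction functional u_{n+1}(t) = u_n(t) − ∫₀ᵗ (u_n'(s) − 1 − u_n(s)²) ds, carried out here with truncated power-series arithmetic:

```python
# Variational iteration for u' = 1 + u^2, u(0) = 0, using coefficient lists
# as truncated power series. Four iterations reproduce the Maclaurin series
# of tan(t) = t + t^3/3 + 2t^5/15 + 17t^7/315 + ...

N = 8  # keep terms up to t^7

def mul(a, b):   # truncated series product
    c = [0.0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

def deriv(a):    # d/dt of a series
    return [(i + 1) * a[i + 1] for i in range(N - 1)] + [0.0]

def integ(a):    # definite integral from 0 to t, as a series in t
    return [0.0] + [a[i] / (i + 1) for i in range(N - 1)]

u = [0.0] * N    # initial guess u_0 = 0
for _ in range(4):
    residual = deriv(u)          # u_n'
    residual[0] -= 1.0           # ... - 1
    uu = mul(u, u)               # ... - u_n^2
    residual = [r - q for r, q in zip(residual, uu)]
    u = [ui - ci for ui, ci in zip(u, integ(residual))]

print(u[1], u[3], u[5], u[7])
```

Each pass corrects one more order of the series, which is the convergence behaviour the paper's modification aims to accelerate.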
This investigation proposes an offline signature identification system that uses rotation compensation based on features saved in a database. The proposed system contains five principal stages: (1) data acquisition, (2) signature data file loading, (3) signature preprocessing, (4) feature extraction, and (5) feature matching. Feature extraction includes determining the center-point coordinates and the rotation-compensation angle (θ), applying the rotation compensation, and determining the discriminating features and statistical condition. In this work, seven essential collections of features are used to characterize a signature: (i) density (D), (ii) average (A), (iii) s
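A hedged sketch of the rotation-compensation step described above: the signature's center point is taken as the centroid of its foreground pixels, the angle θ is estimated from second-order central moments (principal-axis orientation), and every pixel is rotated by −θ about the center. The pixel data and the moment-based angle definition are illustrative assumptions; the paper's exact formulation may differ.

```python
import math

def centroid(points):
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def orientation(points):
    # principal-axis angle from second-order central moments
    cx, cy = centroid(points)
    mu20 = sum((x - cx) ** 2 for x, y in points)
    mu02 = sum((y - cy) ** 2 for x, y in points)
    mu11 = sum((x - cx) * (y - cy) for x, y in points)
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)

def compensate(points):
    # rotate all points by -theta about the centroid
    cx, cy = centroid(points)
    t = -orientation(points)
    return [((x - cx) * math.cos(t) - (y - cy) * math.sin(t),
             (x - cx) * math.sin(t) + (y - cy) * math.cos(t))
            for x, y in points]

# A straight stroke drawn at 30 degrees is detected and normalized
deg30 = math.radians(30)
stroke = [(r * math.cos(deg30), r * math.sin(deg30)) for r in range(-5, 6)]
print(round(math.degrees(orientation(stroke)), 6))  # ≈ 30.0
```

After compensation the stroke's principal axis is horizontal, so features extracted afterwards are insensitive to how the signature was rotated at acquisition time.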
In this work, the electron number density was calculated using a MATLAB program written with the algorithm described here. The electron density was calculated using the Anisimov model in a vacuum environment, and the effect of the spatial coordinates on the electron density was investigated. It was found that distance along the Z-axis affects the electron number density (ne). Many processes within the plasma, such as excitation, ionization, and recombination, can affect the density of electrons. The results show that as the Z-axis distance increases, the electron number density decreases because of the recombination of electrons and ions at large distances from the target and the loss of thermal energy of the electrons in
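The qualitative trend can be illustrated with a toy recombination model (this is not the Anisimov model used in the study): if electron loss along the expansion axis z were dominated by recombination with d(ne)/dz = −α·ne², the density would fall as ne(z) = ne0 / (1 + α·ne0·z), i.e. monotonically decreasing with distance from the target. The values of α and ne0 below are arbitrary illustrative numbers.

```python
# Toy model only: recombination-limited decay of electron number density
# along z. alpha (recombination coefficient) and ne0 (density at the target)
# are hypothetical values, not results from the paper.

def ne(z, ne0=1e19, alpha=1e-19):
    return ne0 / (1.0 + alpha * ne0 * z)

profile = [ne(z) for z in range(0, 10)]
assert all(a > b for a, b in zip(profile, profile[1:]))  # falls with z
```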
In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). Reaction conditions were 2 wt% acid concentration and a 5 ml/g ratio of acid volume to ore weight. Reaction times in the range 2 to 30 minutes (in 2-minute steps) were used to determine the reaction rate constant k from the change in calcite concentration. A further investigation was carried out to determine the activation energy by varying the reaction temperature from 25 to 65 °C. From the kinetic data, it was found that selective leaching was controlled by
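The two-step kinetic workflow described above can be sketched as follows: (1) fit a rate constant k from the decline in calcite concentration over time, and (2) extract an activation energy Ea from the temperature dependence of k via the Arrhenius equation ln k = ln A − Ea/(R·T). All numbers below are synthetic, a first-order rate law is assumed for illustration, and the paper's fitted values are not reproduced.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def slope(xs, ys):
    # least-squares slope of ys against xs
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Step 1: synthetic concentration data C(t) = C0*exp(-k*t), k = 0.12 1/min,
# sampled at 2-minute steps from 2 to 30 minutes as in the study design
times = list(range(2, 32, 2))
C0, k_true = 100.0, 0.12
conc = [C0 * math.exp(-k_true * t) for t in times]
k_fit = slope(times, [math.log(C0 / c) for c in conc])  # ln(C0/C) = k*t

# Step 2: synthetic k(T) obeying Arrhenius with Ea = 40 kJ/mol (hypothetical)
temps = [298.15, 308.15, 318.15, 328.15, 338.15]  # 25 to 65 C
Ea_true, A = 40_000.0, 5.0e5
ks = [A * math.exp(-Ea_true / (R * T)) for T in temps]
Ea_fit = -R * slope([1.0 / T for T in temps], [math.log(kk) for kk in ks])

print(round(k_fit, 3), round(Ea_fit))  # recovers k ≈ 0.12, Ea ≈ 40000 J/mol
```

The sign of the Arrhenius slope is worth noting: plotting ln k against 1/T gives slope −Ea/R, hence the negation when recovering Ea.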
Analyzing X-ray and computed tomography scan (CT scan) images using a convolutional neural network (CNN) is a very interesting subject, especially after the coronavirus disease 2019 (COVID-19) pandemic. In this paper, a study is made of CT-scan images from 423 patients at Al-Kadhimiya (Madenat Al Emammain Al Kadhmain) hospital in Baghdad, Iraq, to diagnose whether or not they have COVID-19 using a CNN. The total data being tested comprises 15,000 CT-scan images chosen in a specific way to give a correct diagnosis. The activation function used in this research is a wavelet function, which differs from the usual CNN activation functions. The convolutional wavelet neural network (CWNN) model proposed in this paper is compared with regular convol
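The distinguishing idea of a CWNN is a standard convolution whose nonlinearity is a wavelet function rather than ReLU or sigmoid. The minimal sketch below uses the Mexican-hat wavelet ψ(x) = (1 − x²)·e^(−x²/2) as an example activation; the paper's exact wavelet, kernel values, and architecture are not reproduced here.

```python
import math

def mexican_hat(x):
    # Mexican-hat (Ricker) wavelet; peaks at psi(0) = 1
    return (1.0 - x * x) * math.exp(-x * x / 2.0)

def conv2d(image, kernel):
    # valid-mode 2D convolution followed by the wavelet activation
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(mexican_hat(s))  # wavelet in place of ReLU/sigmoid
        out.append(row)
    return out

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
edge = [[-1, 1], [-1, 1]]  # illustrative vertical-edge kernel
fmap = conv2d(img, edge)
print(fmap[0])  # zero pre-activations map to psi(0)=1; the edge maps to psi(2)
```

In a full network these feature maps would be pooled and stacked through further layers; only the activation choice differs from an ordinary CNN.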
Deep learning algorithms have recently achieved considerable success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple types of images: Synthetic Aperture Radar (SAR) images and non-SAR images. Transfer learning was used for this classification, followed by fine-tuning, with architectures pre-trained on the well-known ImageNet image database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consist of five classes: a SAR image class (houses) and four non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv
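Schematically, the transfer-learning workflow is: a frozen, pre-trained feature extractor (VGG16 in the paper) turns each image into a feature vector, and only a new, lightweight classifier is trained on those vectors. Running VGG16 requires a deep-learning framework, so the sketch below uses a trivial stand-in extractor and a nearest-centroid head purely to show the structure; all data are synthetic.

```python
# Stand-in for the frozen convolutional base: its "weights" are fixed and
# never trained. Here it just summarizes a flat pixel list with two numbers.
def frozen_extractor(image):
    return [sum(image) / len(image), max(image) - min(image)]

def train_classifier(samples):
    # the new head: a nearest-centroid classifier fit on extracted features
    centroids = {}
    for label, images in samples.items():
        feats = [frozen_extractor(im) for im in images]
        centroids[label] = [sum(f[i] for f in feats) / len(feats)
                            for i in range(len(feats[0]))]
    return centroids

def predict(centroids, image):
    f = frozen_extractor(image)
    return min(centroids, key=lambda lab: sum((a - b) ** 2
               for a, b in zip(f, centroids[lab])))

# Hypothetical two-class toy data standing in for SAR / non-SAR images
data = {"SAR": [[5, 6, 5, 6], [6, 5, 6, 5]],
        "non-SAR": [[0, 9, 1, 8], [1, 8, 0, 9]]}
model = train_classifier(data)
print(predict(model, [5, 5, 6, 6]))
```

The design point is that only the small head is trained; the expensive representation learning was already done on ImageNet, which is why the approach works with modest datasets.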
A database is a collection of data organized and distributed in a way that allows users to access the stored data simply and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
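The Map-Reduce pattern itself can be sketched in a few lines: a map step emits (key, value) pairs from raw records, a shuffle groups the pairs by key, and a reduce step aggregates each group. In the paper this runs on a Hadoop cluster; the EEG-style records and per-channel averaging below are synthetic illustrations.

```python
from collections import defaultdict

def map_phase(records):
    # emit (channel, reading) pairs from raw EEG-style records
    for channel, reading in records:
        yield channel, reading

def shuffle(pairs):
    # group values by key, as the framework does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # aggregate each group: here, the average amplitude per channel
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

eeg = [("Fp1", 10.0), ("Fp2", 20.0), ("Fp1", 30.0), ("Fp2", 40.0)]
print(reduce_phase(shuffle(map_phase(eeg))))  # {'Fp1': 20.0, 'Fp2': 30.0}
```

Because map and reduce touch each record independently, the framework can split the work across many nodes, which is the source of the response-time reduction reported above.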
Molecular barcoding has been widely recognized as a powerful tool for the identification of organisms during the past decade. The aim of this study is to use a molecular approach to identify diatoms from environmental DNA. The diatom specimens were taken from the Tigris River. Extraction of the environmental DNA (eDNA) and analysis of the sequences using the next-generation sequencing (NGS) method showed that the epipelic diatom genera with the highest percentages included Achnanthidium minutissimum (Kützing) Czarnecki, 1994 (21.1%), Cocconeis placentula Ehrenberg, 1838 (21.3%), and Nitzschia palea (Kützing) W. Smith, 1856 (16.3%).
Five species of diatoms: Achnanthidium
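The relative abundances quoted above are obtained from NGS output by dividing each taxon's read count by the total read count. The counts below are hypothetical, chosen only so the computed percentages match those reported in the abstract.

```python
def relative_abundance(read_counts):
    # percentage of total sequencing reads assigned to each taxon
    total = sum(read_counts.values())
    return {taxon: round(100.0 * n / total, 1)
            for taxon, n in read_counts.items()}

# Hypothetical read counts summing to 10,000
counts = {"Achnanthidium minutissimum": 2110,
          "Cocconeis placentula": 2130,
          "Nitzschia palea": 1630,
          "others": 4130}
print(relative_abundance(counts))
```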