Each project management system aims to complete the project within its identified objectives: budget, time, and quality. Completing the project within the defined deadline requires careful scheduling that is prepared early. Because repetitive construction projects are unique in nature, time contingency and project uncertainty must be accounted for to achieve accurate scheduling. The schedule should be integrated and flexible enough to accommodate changes without adversely affecting the construction project's total completion time. Planning and scheduling methods for repetitive projects are effective and essential, since such projects are composed of identical production units; however, they need continuous development because execution methods keep evolving. This study develops a mathematical model for forecasting repetitive construction projects using the Support Vector Machine (SVM) technique. The WEKA 3.9.1 (©2016) software was used to develop the mathematical model. The factors affecting the planning and scheduling of repetitive projects were identified through a questionnaire whose results were analyzed using SPSS V22. Three accuracy measures, the correlation coefficient (R), Root Mean Square Error (RMSE), and Mean Absolute Error (MAE), were used to check the mathematical model by comparing actual values with predicted values. The results showed that the SVM predictions were more precise than those calculated by conventional methods, and the best generalization was obtained with R = 97%, MAE = 3.6%, and RMSE = 7%.
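The abstract summarizes rather than reproduces the model, so the following is a minimal sketch of the SVM forecasting and evaluation step under stated assumptions: the questionnaire-derived scheduling factors are assumed to be available as a numeric feature matrix, synthetic data stand in for the real project records, and scikit-learn's SVR is used in place of the WEKA 3.9.1 implementation.

```python
# Minimal sketch of SVM-based forecasting with R, MAE, and RMSE checks.
# Assumptions: the scheduling factors are already encoded as a numeric
# matrix X with a numeric target y; scikit-learn's SVR stands in for WEKA.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((120, 8))                              # 8 hypothetical scheduling factors
y = X @ rng.random(8) + rng.normal(0, 0.05, 120)      # synthetic target values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

r = np.corrcoef(y_te, pred)[0, 1]                     # correlation coefficient R
mae = mean_absolute_error(y_te, pred)                 # Mean Absolute Error
rmse = np.sqrt(mean_squared_error(y_te, pred))        # Root Mean Square Error
print(f"R={r:.3f}  MAE={mae:.3f}  RMSE={rmse:.3f}")
```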
The research entitled "The Mechanisms of build up the dramatic construction in the films of the world of nature - National Geographic's films as a model" derives its importance from the way dramatic construction has departed from its classic style, owing to the evolution of visual presentation and its instruments, and from the specificity of an emerging form of television production represented by nature films, which have begun to occupy an important space on the television map and on channels specialized in this subject. This drove the researcher to study the mechanisms of producing the dramatic construction in this kind of film. This research came in three chapters …
The last few years have witnessed great and increasing use of computer-aided analysis tools in the field of medical image analysis. These tools help radiologists and doctors to consult while making a particular diagnosis. In this study, we used the relationship between statistical measurements, computer vision, and medical images, along with a logistic regression model, to extract breast cancer imaging features. These features were used to distinguish the shape of a mass (fibroid vs. fatty) by examining the regions of interest (ROI) of the mass. The final fit of the logistic regression model showed that the most important variables that clearly affect the shape of breast cancer images are skewness, kurtosis, center of mass, and angle, with an AUCROC of …
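As a hedged illustration of the classification step, the sketch below fits a logistic regression on four hypothetical ROI features (skewness, kurtosis, center of mass, angle) and reports the AUCROC; the synthetic data and the use of scikit-learn are assumptions, not the authors' pipeline.

```python
# Minimal sketch of the logistic-regression step on ROI features.
# Assumptions: the four features have already been extracted; labels and
# data are synthetic placeholders for the real mammography features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))                    # columns: skewness, kurtosis, center_of_mass, angle
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # 1 = fibroid, 0 = fatty (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])   # area under the ROC curve
print("coefficients:", clf.coef_.round(3), " AUCROC:", round(auc, 3))
```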
A botnet is one of many attacks that can execute malicious tasks and that develops continuously. Therefore, the current research introduces a comparison framework, called BotDetectorFW, with classification and complexity improvements for the detection of botnet attacks using the CICIDS2017 dataset, a free online dataset consisting of several attacks with high-dimensional features. Feature selection is a significant step for obtaining the fewest features by eliminating irrelevant ones, and it consequently reduces the detection time. This process is implemented inside BotDetectorFW in two steps: data clustering and five distance measure formulas (cosine, Dice, Driver & Kroeber, overlap, and Pearson correlation) …
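The abstract names the five measures but not their implementation, so the sketch below shows one plausible reading of them applied to binary feature profiles; the Driver & Kroeber formula follows its common binary-similarity definition and, like the pairing with the clustering step, is an assumption.

```python
# Minimal sketch of the five distance/similarity measures named above,
# applied to binary feature-presence vectors (an assumed representation).
import numpy as np

def binary_counts(x, y):
    a = np.sum((x == 1) & (y == 1))   # present in both
    b = np.sum((x == 1) & (y == 0))   # only in x
    c = np.sum((x == 0) & (y == 1))   # only in y
    return a, b, c

def cosine(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

def dice(x, y):
    a, b, c = binary_counts(x, y)
    return 2 * a / (2 * a + b + c)

def driver_kroeber(x, y):              # common binary form: a / sqrt((a+b)(a+c))
    a, b, c = binary_counts(x, y)
    return a / np.sqrt((a + b) * (a + c))

def overlap(x, y):
    a, b, c = binary_counts(x, y)
    return a / min(a + b, a + c)

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

x = np.array([1, 1, 0, 1, 0, 1])
y = np.array([1, 0, 0, 1, 1, 1])
for name, fn in [("cosine", cosine), ("dice", dice),
                 ("driver&kroeber", driver_kroeber),
                 ("overlap", overlap), ("pearson", pearson)]:
    print(f"{name}: {fn(x, y):.3f}")
```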
The presented study investigates the scheduling of jobs on a single machine. Each job is processed without interruption and becomes available for processing at time 0. The aim is to find a processing order of the jobs that minimizes the total completion time, the total late work, and the maximum tardiness, which is an NP-hard problem. In the theoretical part of the present work, the mathematical formulation of the examined problem is presented, and a sub-problem of the original multi-objective minimization problem is introduced; the importance of the dominance rule (DR) that can be applied to the problem to improve good solutions is also shown. In the practical part, two …
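In conventional single-machine notation (an assumption, since the abstract does not reproduce the paper's own symbols), the multi-objective problem sketched above can be written as:

```latex
% Conventional notation, assumed rather than taken from the paper:
% p_j processing time, d_j due date, C_j completion time under sequence \sigma,
% T_j = \max(0, C_j - d_j) tardiness, V_j = \min(p_j, T_j) late work.
\min_{\sigma}\ \Bigl(\, \sum_{j=1}^{n} C_j,\ \ \sum_{j=1}^{n} V_j,\ \ T_{\max} \Bigr),
\qquad T_{\max} = \max_{1 \le j \le n} T_j .
```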
The political situation experienced by Iraq before the events of 2003 led to the collapse of its infrastructure. After 2003, rebuilding costs were estimated at 187 (million USD) according to the estimates of basic needs stated in the Five-Year Plan 2010-2014. Given the difficulty of financing projects, the continuous demands of maintenance and operating costs, and the contemporary approaches followed in different countries, the strategic option is for the government to bring in the private sector as a partner in the development process. Since public-private partnership (PPP) is at a germinating stage of development in Iraq, the critical success factors (CSFs) have been studied in the experiences of countries that have implemented this style …
The research problem was to identify the effect of using specification-based costing on the validity of determining and measuring the costs of implementing contracting works, applied to Al-Mansour General Construction Contracting Company, as an appropriate alternative to the traditional costing system currently adopted, which is characterized by many shortcomings and weaknesses that have been reflected in the validity and integrity of the calculations. To solve this problem, the research was based on the premise that applying specification-based costing will result in calculating the cost of the product according to the specifications required by the customer, to meet their wishes properly and without …
The recent advancements in security approaches have significantly increased the ability to identify and mitigate any type of threat or attack in any network infrastructure, such as a software-defined network (SDN), and to protect the internet security architecture against a variety of threats or attacks. Machine learning (ML) and deep learning (DL) are among the most popular techniques for preventing distributed denial-of-service (DDoS) attacks on any kind of network. The objective of this systematic review is to identify, evaluate, and discuss new efforts on ML/DL-based DDoS attack detection strategies in SDN networks. To reach our objective, we conducted a systematic review in which we looked for publications that used ML/DL approaches …
This study is unique in this field, representing a mix of three branches of technology: photometry, spectroscopy, and image processing. The work treats each pixel in the image according to its color, where the color corresponds to a specific wavelength on the RGB line; therefore, any image contains many wavelengths from all its pixels. The results of the study are specific and identify the elements on the surface of a comet's nucleus, giving not only the details but also their mapping on the nucleus. The work considered 12 elements in two comets (Tempel 1 and 67P/Churyumov-Gerasimenko). The elements have strong emission lines in the visible range, which were recognized by our MATLAB program during the treatment of the image. The percentages …
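The paper's MATLAB mapping is not given in the abstract; the sketch below illustrates the pixel-to-wavelength idea with a simple hue-to-wavelength interpolation over the visible range, which is an illustrative assumption rather than the authors' method.

```python
# Minimal sketch: map an RGB pixel to an approximate visible wavelength.
# The linear hue-to-wavelength interpolation (~380-700 nm) is an assumption,
# not the mapping used in the paper's MATLAB program.
import colorsys

def pixel_to_wavelength(r, g, b):
    """Map an 8-bit RGB pixel to an approximate visible wavelength in nm."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    h = min(h, 270.0 / 360.0)          # hue 0 (red) -> ~700 nm, 270 deg (violet) -> ~380 nm
    return 700.0 - (h / (270.0 / 360.0)) * (700.0 - 380.0)

# Example: a predominantly green pixel lands near ~550 nm, which could then
# be compared against catalogued emission lines of the candidate elements.
print(round(pixel_to_wavelength(40, 200, 60), 1), "nm")
```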
Shadow removal is crucial for robot and machine vision, as the accuracy of object detection is greatly influenced by the uncertainty and ambiguity of the visual scene. In this paper, we introduce a new algorithm for shadow detection and removal based on Gaussian equations with different shapes, orientations, and spatial extents. Here, the contrast information of the visual scene is utilized for shadow detection and removal through five consecutive processing stages. In the first stage, contrast filtering is performed to obtain the contrast information of the image. The second stage involves a normalization process that suppresses noise and generates a balanced intensity at a specific position compared to the neighboring intensities …
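The abstract describes only the first two of the five stages, so the sketch below illustrates them under assumptions: an isotropic difference-of-Gaussians filter stands in for the authors' oriented Gaussian shapes, and the local normalization is one plausible reading of the balancing step.

```python
# Minimal sketch of stage 1 (contrast filtering) and stage 2 (normalization).
# Assumptions: isotropic difference-of-Gaussians contrast filter and a local
# energy normalization; the authors' exact Gaussian shapes are not given.
import numpy as np
from scipy.ndimage import gaussian_filter

def contrast_map(gray, sigma_center=1.0, sigma_surround=4.0):
    """Stage 1: centre-surround (difference-of-Gaussians) contrast filtering."""
    return gaussian_filter(gray, sigma_center) - gaussian_filter(gray, sigma_surround)

def normalize_local(contrast, sigma=8.0, eps=1e-6):
    """Stage 2: suppress noise and balance each position against its neighbourhood."""
    local_energy = np.sqrt(gaussian_filter(contrast ** 2, sigma)) + eps
    return contrast / local_energy

gray = np.random.rand(64, 64)          # stand-in for a grayscale scene
balanced = normalize_local(contrast_map(gray))
print(balanced.shape, float(balanced.mean()))
```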