3D models derived from digital photogrammetric techniques have proliferated and evolved to meet the requirements of many applications. The reliability of these models depends primarily on the data-processing workflow and the adopted software solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling package that aims to create well-ordered, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following recommendations from users around the globe, Agisoft PhotoScan has become an important source of precise 3D data for different applications. How reliable that data is for accurate 3D modelling applications is the question that currently needs an answer. In this paper, therefore, the performance of the Agisoft PhotoScan software was assessed and analysed to show its potential for accurate 3D modelling applications. To investigate this, a study was carried out at the University of Baghdad / Al-Jaderia campus using data collected with an airborne metric camera at a flying height of 457 m. Following statistical and shape-validation analysis, the Agisoft results show good potential relative to the research objective and the dataset quality.
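The accuracy assessment described above typically compares model-derived coordinates against surveyed check points. A minimal sketch of that comparison is shown below; the coordinate values are purely illustrative placeholders, not the study's actual check-point data:

```python
import numpy as np

# Hypothetical check points (metres): surveyed ground truth vs. the
# photogrammetric model output. Values are illustrative only.
truth = np.array([[100.0, 200.0, 50.0],
                  [150.0, 180.0, 52.0],
                  [120.0, 240.0, 49.0]])
model = np.array([[100.1, 199.9, 50.2],
                  [149.8, 180.1, 51.9],
                  [120.2, 240.1, 49.1]])

residuals = model - truth
# Per-axis RMSE (X, Y, Z) and the combined 3-D RMSE
rmse_xyz = np.sqrt((residuals ** 2).mean(axis=0))
rmse_3d = np.sqrt((residuals ** 2).sum(axis=1).mean())
```

The per-axis values separate planimetric from height error, which is the usual way photogrammetric block accuracy is reported.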
The study relied on data about the health sector in Iraq in 2006, compiled in cooperation with the Ministry of Health and the Central Bureau of Statistics and Information Technology in 2007. The data included estimates of the population distribution of Baghdad province and the country, based on the 1997 population distribution, and an evaluation of the health sector covering health institutions, health staff, and other health services. The research aims to measure the amount and size of growth (increase and decrease) in health services, compare what was verified in Iraq and in Baghdad, and evaluate how effectively health supplies and services (physical and human) are distributed relative to the size of the population distribution and
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first restate the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using the maximum likelihood and moment estimators. We finally use these estimators to fit the data with the ESΓ distribution.
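The ESΓ distribution is not available in standard libraries, but the two estimation approaches the abstract names can be sketched for the ordinary Gamma case on synthetic near-Gamma data. The closed-form shape estimate below is a well-known approximation to the Gamma maximum-likelihood (digamma) equation; the true parameters and sample are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic near-Gamma data with shape k = 2.0 and scale theta = 1.5
data = rng.gamma(shape=2.0, scale=1.5, size=2000)

# Method-of-moments estimators: mean = k*theta, var = k*theta^2
m, v = data.mean(), data.var()
k_mom, theta_mom = m * m / v, v / m

# Approximate MLE: closed-form approximation to the digamma equation
# for the shape, then theta = mean / k
s = np.log(m) - np.log(data).mean()
k_mle = (3.0 - s + np.sqrt((s - 3.0) ** 2 + 24.0 * s)) / (12.0 * s)
theta_mle = m / k_mle
```

Both estimators should land close to the generating parameters at this sample size, which is the sanity check one would run before fitting the skewed extension.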
Titanium alloys are broadly used in the medical and aerospace sectors. However, they are categorized among the hard-to-machine alloys owing to their higher chemical reactivity and lower thermal conductivity. This research aims to study the impact of the dry end-milling process with an uncoated tool on the surface roughness produced on Ti6Al4V alloy. It also seeks to develop a new hybrid neural model based on training a back-propagation neural network (BPNN) with hybrid particle swarm optimization-gravitational search algorithms (PSO-GS
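The idea of training network weights with a population-based optimizer can be sketched with plain particle swarm optimization alone; the gravitational-search coupling the abstract mentions is omitted, and the toy regression task, network size, and PSO coefficients below are all assumptions for illustration, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy task standing in for the milling data: fit y = x^2 on [-1, 1]
X = np.linspace(-1, 1, 20).reshape(-1, 1)
y = X[:, 0] ** 2

def predict(w, X):
    # A tiny 1-4-1 tanh network, weights flattened into one 13-vector
    W1, b1, W2, b2 = w[:4].reshape(1, 4), w[4:8], w[8:12].reshape(4, 1), w[12]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2)[:, 0] + b2

def mse(w):
    return ((predict(w, X) - y) ** 2).mean()

# Plain PSO over the 13 weights (no gradient descent, no GSA hybrid)
n, dim = 30, 13
pos = rng.uniform(-1, 1, (n, dim))
vel = np.zeros((n, dim))
pbest, pcost = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[pcost.argmin()].copy()
for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    cost = np.array([mse(p) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]
    gbest = pbest[pcost.argmin()].copy()
```

Replacing back-propagation's gradient step with a swarm update like this is what makes the approach attractive for noisy, non-differentiable machining objectives.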
This paper presents a three-dimensional dynamic analysis of a rockfill dam with different foundation depths, considering the dam's connection with both the reservoir bed and the water. ANSYS was used to develop the three-dimensional Finite Element (FE) model of the rockfill dam. The essential objective of this study is to discuss the effects of different foundation depths on the dynamic behaviour of an embankment dam. Four foundation cases were investigated: the dam without a foundation (fixed base), and three different foundation depths. Taking into consideration the changing upstream water level (empty, minimum, and maximum water levels), the results of the three-dimensional F
This study examined technical-analysis indicators and their impact on a group of related stock-trading indices by reviewing the methods and various models used in technical analysis, diagnosing the obstacles and difficulties that participants face in predicting stock prices, and proposing solutions and recommendations, from both a scientific and a practical perspective, to overcome them.
The research community consisted of (25) Iraqi private commercial banks, while the research sample consisted of (3) banks, representing (12%) of the research community. The study used an analytical approach to the financial statements covering the period (1/2/2022-30/4/2022),
The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. Given the scalability and efficiency demands of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as data cleansing, user identification, and session identification.
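The three sub-phases named above can be sketched for Common Log Format records. The cleansing rules (drop malformed lines, non-GET or failed requests, embedded resources, crawler traffic), the IP-plus-agent user heuristic, and the 30-minute session timeout are conventional assumptions, not necessarily the exact rules evaluated in the study:

```python
import re
from datetime import datetime, timedelta

# Combined Log Format: ip ident user [time] "method url proto" status size "ref" "agent"
LOG_RE = re.compile(
    r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

def preprocess(lines, timeout=timedelta(minutes=30)):
    sessions = {}  # (ip, agent) -> list of sessions, each a list of (time, url)
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue                                  # cleansing: malformed line
        ip, ts, method, url, status, agent = m.groups()
        if method != "GET" or not status.startswith("2"):
            continue                                  # cleansing: keep successful GETs
        if re.search(r"\.(css|js|png|jpg|gif|ico)$", url):
            continue                                  # cleansing: embedded resources
        if "bot" in agent.lower():
            continue                                  # cleansing: crawler traffic
        t = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
        user = (ip, agent)                            # user identification heuristic
        runs = sessions.setdefault(user, [])
        if runs and t - runs[-1][-1][0] <= timeout:
            runs[-1].append((t, url))                 # same session continues
        else:
            runs.append([(t, url)])                   # session identification: new session
    return sessions
```

For example, two page hits 10 minutes apart fall into one session, while a hit 50 minutes later opens a second session for the same user.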
A skip list data structure is really just a randomized simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, the data structure uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list: because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes with four pointers each. The search, insert, and delete operations take expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
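The level-by-level descent that gives the expected O(log n) search can be shown with a minimal skip list. This sketch uses one forward pointer per level with coin-flip level promotion (the classic formulation) rather than the four-pointer node layout of the paper's implementation:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        # Promote with probability 1/2, giving expected O(log n) height
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):   # descend level by level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):   # record the rightmost node per level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):              # splice into each touched level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Augmenting each forward pointer with a width count is what turns this structure into one supporting RANK and SEARCH BY RANK in the same expected time.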
The intensity distribution of comet ISON C/2013 is studied by taking its histogram. This distribution reveals four distinct regions, related to the background, tail, coma, and nucleus. A one-dimensional temperature-distribution fit is achieved using two mathematical equations related to the coordinates of the comet's centre. A quiver plot of the comet's intensity gradient shows very clearly that the arrows point toward the comet's maximum intensity.
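The histogram-and-gradient procedure can be sketched on a synthetic stand-in frame; the Gaussian image below is an assumption replacing the actual comet observation, and the quiver rendering itself (e.g. via matplotlib's `quiver`) is left out so the sketch stays purely numerical:

```python
import numpy as np

# Synthetic stand-in for a comet frame: a 2-D Gaussian intensity peak
y, x = np.mgrid[0:128, 0:128]
img = np.exp(-(((x - 64) ** 2 + (y - 64) ** 2) / (2 * 15.0 ** 2)))

# Intensity histogram: in the real data, thresholds between its modes
# separate the background, tail, coma and nucleus regions
counts, edges = np.histogram(img, bins=64)

# Gradient field: quiver arrows drawn from (gx, gy) point toward rising
# intensity, i.e. toward the brightest pixel (the nucleus)
gy, gx = np.gradient(img)
peak = np.unravel_index(np.argmax(img), img.shape)
```

Note that `np.gradient` returns the derivative along axis 0 (rows) first, so the row/column order matters when the field is passed to a quiver plot.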
Big data of different types, such as texts and images, are rapidly generated by the internet and other applications. Dealing with these data using traditional methods is impractical because they come in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, since it analyses and extracts only the meaningful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide