3D models derived from digital photogrammetric techniques have increased massively and evolved to meet the requirements of many applications. The reliability of these models depends essentially on the data processing cycle and the adopted software solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling package that seeks to create orderly, precise 3D content from still images. It works with arbitrary images captured under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important tool for generating precise 3D data for different applications. How reliable this data is for accurate 3D modelling applications is the question that currently needs an answer. Therefore, in this paper, the performance of the Agisoft PhotoScan software was assessed and analyzed to show its potential for accurate 3D modelling applications. To investigate this, a study was carried out at the University of Baghdad / Al-Jaderia campus using data collected from an airborne metric camera at a flying height of 457 m. The Agisoft results show potential with respect to the research objective and the dataset quality, following statistical and validation shape analysis.
Aim: To evaluate commercially pure titanium disks structured by laser in two designs (dot and groove), each with three different numbers of laser scans (5, 15 and 25), and to compare them with a titanium surface not subjected to any surface structuring (control) by measuring wettability and surface roughness. Materials and methods: Structuring of the surfaces of the commercially pure titanium (CP Ti) disks was performed using a fiber laser CNC machine in two designs (dot and groove) at three different laser scans (5, 15 and 25); the structured disks were then analyzed, together with the control group, by atomic force microscopy and a water contact angle test. Results: The results of this study showed that the surface roughness and the wettability
Recently, microalgae have become a promising source for the production of biofuel. However, the cost of production is still the main obstacle to the development of this type of source. Although there are many extensive studies on the requirements for cultivating microalgae, studying the process through the variables that affect microalgae cultivation is still one of the important tasks for improving biofuel production. The present article is a serious attempt to investigate the use of the commercial fertilizer NPK (20:20:20+TE N:P:K) as a cheap nutrient medium for growing Chlorella vulgaris in comparison with a traditional nutrient medium (Chu.10 medium). In addition, the current study addresses the effect of different spar
Student performance may be influenced by several factors at all study levels, such as primary school, intermediate school and even college; among these are psychological factors, social factors, and factors related to the student's environment.
In this paper we study some of these factors to discover their influence, using canonical correlation analysis to analyze the data. Several conclusions are drawn to help those concerned with student performance improve it in the future.
This paper presents a three-dimensional dynamic analysis of a rockfill dam with different foundation depths, considering the dam's connection with both the reservoir bed and the water. ANSYS was used to develop the three-dimensional finite element (FE) model of the rockfill dam. The essential objective of this study is to discuss the effects of different foundation depths on the dynamic behaviour of an embankment dam. Four cases were investigated: the dam without a foundation (fixed base) and three different foundation depths. Taking into consideration the changing upstream water level (empty, minimum, and maximum water levels), the results of the three-dimensional F
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Models were built for both algorithms and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box nature.
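The accuracy-versus-training-time trade-off described above can be reproduced in miniature with scikit-learn; this sketch uses synthetic data rather than the Car Evaluation dataset, and the MLP architecture is an illustrative assumption, not the paper's configuration:

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a tabular classification dataset
X, y = make_classification(n_samples=1000, n_features=6, n_informative=4,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

results = {}
for name, clf in [("BNN", MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                                         random_state=42)),
                  ("NB", GaussianNB())]:
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)                      # NB fits in one pass; the MLP iterates
    elapsed = time.perf_counter() - t0
    acc = accuracy_score(y_te, clf.predict(X_te))
    results[name] = (acc, elapsed)
    print(f"{name}: accuracy={acc:.3f}, train_time={elapsed:.3f}s")
```

On most datasets the neural network's training time dominates, which is the efficiency cost the abstract refers to.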
In this study, we compared the LASSO and SCAD methods, two penalization methods for models in partial quantile regression. The Nadaraya-Watson kernel estimator was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, with the SCAD method performing best according to the mean squared error (MSE) criterion, after estimating the missing data using the mean imputation method.
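SCAD has no scikit-learn implementation, but the LASSO half of the comparison, together with the mean-imputation step the abstract mentions, can be sketched on synthetic data (the sparse coefficient vector, missingness rate, and penalty strength `alpha` are all illustrative assumptions):

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n, p = 150, 10
X = rng.normal(size=(n, p))
beta = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))   # sparse true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

# Knock out ~10% of entries, then impute with column means (mean imputation)
mask = rng.random(X.shape) < 0.1
X_missing = X.copy()
X_missing[mask] = np.nan
X_imputed = SimpleImputer(strategy="mean").fit_transform(X_missing)

# LASSO shrinks small coefficients exactly to zero, performing variable selection
lasso = Lasso(alpha=0.1).fit(X_imputed, y)
mse = mean_squared_error(y, lasso.predict(X_imputed))
print("nonzero coefficients:", int(np.sum(lasso.coef_ != 0)))
print("MSE:", round(mse, 3))
```

SCAD applies the same idea with a non-convex penalty that shrinks large coefficients less, which is why it often achieves lower MSE, as the study found.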
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plain text (the original message): how the intelligible plain text is transformed into unintelligible text in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, all built on the Pascal matrix. Encryption and decryption were implemented using MATLAB as the programming language and Notepad++ to write the input text.
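The paper's exact scheme (and its pseudo-random key step) is not given in the abstract, but the core idea can be sketched: multiply the character codes by a lower-triangular Pascal matrix, which has determinant 1 and an exact integer inverse, so decryption recovers the message without rounding error. This minimal Python sketch is an illustration of the matrix mechanics, not the paper's MATLAB implementation:

```python
import numpy as np
from math import comb

def pascal_lower(n):
    # Lower-triangular Pascal matrix: L[i, j] = C(i, j)
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)],
                    dtype=np.int64)

def pascal_lower_inv(n):
    # Exact integer inverse: (L^-1)[i, j] = (-1)^(i+j) * C(i, j)
    return np.array([[(-1) ** (i + j) * comb(i, j) for j in range(n)]
                     for i in range(n)], dtype=np.int64)

def encrypt(text):
    codes = np.array([ord(c) for c in text], dtype=np.int64)
    return pascal_lower(len(codes)) @ codes      # cipher vector of integers

def decrypt(cipher):
    codes = pascal_lower_inv(len(cipher)) @ cipher
    return "".join(chr(int(c)) for c in codes)

cipher = encrypt("HELLO")
print(cipher)              # unintelligible integer vector
print(decrypt(cipher))     # prints HELLO
```

Because the inverse is integer-valued, a real scheme would process long texts in fixed-size blocks to keep the binomial coefficients from overflowing.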
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit in the sent information has high priority, especially in information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the field of data transmission.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
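The odd/even limitation of single parity is easy to demonstrate: one flipped bit changes the overall parity and is caught, while two flips cancel out and pass the check. A minimal sketch (illustrative bit pattern, not the paper's test data):

```python
def parity(bits):
    # Even-parity bit: 1 if the count of 1s is odd, so the padded word sums to even
    return sum(bits) % 2

def add_parity(data):
    return data + [parity(data)]

def check(word):
    # True means an error is detected (word no longer has even parity)
    return sum(word) % 2 != 0

data = [1, 0, 1, 1, 0, 1, 0]
word = add_parity(data)

clean_ok = check(word)        # False: clean word passes
word[2] ^= 1
one_flip = check(word)        # True: a single (odd) bit flip is detected
word[5] ^= 1
two_flips = check(word)       # False: a second flip (even total) is masked
print(clean_ok, one_flip, two_flips)   # prints False True False
```

Two-dimensional parity mitigates this by arranging the data in a grid and adding a parity bit per row and per column, so an even number of flips confined to one row still disturbs the column checks.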
Two novel methods are suggested to detect bit-change errors when transmitting data over a noisy medium. These methods are: 2D-Checksum me