Drones are highly autonomous, remotely controlled platforms capable of performing a variety of tasks in diverse environments. A digital twin (DT) is a virtual replica of a physical system. Integrating a DT with a drone offers the opportunity to manipulate the drone during a mission. In this paper, the architecture of a DT is presented in order to explain how the physical environment can be represented. The techniques by which drones collect the information a DT requires are then compared, to introduce the main methods that have been applied in DT development with drones. The findings of this research indicate that incorporating DTs into drones allows sensor readings, control code, and intelligence to be mirrored and executed on the DT, enables remote control for performing complex tasks in a variety of application environments, and supports simulation on the DT without affecting the actual drone. In addition, to develop three-dimensional representations of structures and construction sites, a method known as photogrammetry is used to generate these models, employing drones as aerial scanners. In spite of this, there are a number of technological and socio-political obstacles that should be taken into consideration. These challenges include the interoperability of different sensors, the creation of efficiently optimized data-processing algorithms, and concerns over data privacy and security.
Beef and chicken meat were used to obtain sarcoplasm. The chicken sarcoplasm was injected into a rabbit to raise an antibody against it; the antiserum titre was 1/32, as determined by the immune double diffusion test. A self-test showed that some antisera were able to react with beef sarcoplasm, which means that the same proteins are found in both beef and chicken meat, and that it is therefore difficult to rely on this immune method to detect the adulteration of beef with chicken meat. Accordingly, the antibodies against beef sarcoplasm were removed from the serum by an immune absorption step to produce a serum specific to chicken sarcoplasm, which was used in the immune double diffusion test for the qualitative detection of beef adulterated with at least 5% chicken meat, and the
The researcher studied the transportation problem because of its great importance to the country's economy. This paper examines several methods for finding a solution close to the optimum and applies them to a practical case, taking one oil derivative, the benzene (petrol) product. The first purpose of this study is to reduce the total cost of transporting petrol from warehouses in the province of Baghdad to some stations in the Karkh and Rusafa districts of the same province. The second is to meet the demand of each station with the required quantity, which depends on the absorptive capacity of the warehouses (supply quantities). And through r
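As an illustration of the kind of near-optimal methods the paper refers to, the classical least-cost heuristic for the transportation problem can be sketched as below. This is a minimal sketch: the actual warehouses, stations, and cost figures are not given in the abstract, so any data fed to it are hypothetical.

```python
def least_cost_allocation(supply, demand, cost):
    """Least-cost heuristic for the transportation problem.

    supply: list of warehouse capacities
    demand: list of station requirements (sum(supply) == sum(demand))
    cost:   cost[i][j] = unit cost from warehouse i to station j
    Returns an allocation matrix of the same shape as cost.
    """
    s, d = supply[:], demand[:]
    alloc = [[0] * len(d) for _ in s]
    # visit cells in order of increasing unit cost
    cells = sorted(
        (cost[i][j], i, j) for i in range(len(s)) for j in range(len(d))
    )
    for c, i, j in cells:
        qty = min(s[i], d[j])       # ship as much as both sides allow
        if qty:
            alloc[i][j] = qty
            s[i] -= qty
            d[j] -= qty
    return alloc
```

The heuristic is greedy, so it satisfies all supply and demand constraints but is only close to, not guaranteed to reach, the minimum total cost.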
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data that we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities amongst them. In this paper, several mixture-regression-based methods were conducted under the assumption that the data come from a finite number of components. A comparison of these methods has been made according to their results in estimating the component parameters. Observation membership has also been inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the
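A minimal sketch of how component parameters and observation membership can be inferred in a mixture setting is a two-component 1-D Gaussian mixture fitted by EM; the mixture-regression methods compared in the paper are richer than this, so the code below only illustrates the shared idea of responsibilities as membership probabilities.

```python
import numpy as np

def gmm_em_1d(x, n_iter=100):
    """Minimal EM for a two-component 1-D Gaussian mixture.

    Returns (weights, means, variances, responsibilities); the
    responsibility matrix gives each observation's estimated
    component membership.
    """
    x = np.asarray(x, dtype=float)
    # crude initialisation from the data quantiles
    mu = np.quantile(x, [0.25, 0.75])
    var = np.full(2, x.var())
    w = np.full(2, 0.5)
    for _ in range(n_iter):
        # E-step: posterior probability of each component per point
        dens = (w / np.sqrt(2 * np.pi * var)
                * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from weighted data
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var, resp
```

Each observation is then assigned to the component with the largest responsibility, which is the membership assessment the abstract refers to.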
Evolutionary algorithms (EAs), as global search methods, have proved more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters. These components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust E
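A canonical EA with the traditional components the abstract names (selection, crossover, mutation) can be sketched as below. The fitness function is supplied by the caller and is a placeholder here; in a real complex-detection method it would score a candidate grouping of proteins against the PPI network topology.

```python
import random

def genetic_algorithm(fitness, length=20, pop_size=30,
                      generations=50, p_mut=0.05, seed=1):
    """Canonical GA skeleton: bit-string representation, tournament
    selection, one-point crossover, and bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # tournament selection of two parents (tournament size 3)
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, length)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            # bit-flip mutation with probability p_mut per bit
            child = [b ^ (rng.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

These are exactly the canonical components the paper criticises as insufficiently informed by network structure; a more robust design would replace them with topology-aware variants.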
A comparison of double informative and non-informative priors assumed for the parameter of the Rayleigh distribution is considered. Three different sets of double priors are included for the single unknown parameter of the Rayleigh distribution. We have assumed three double priors: the square root inverted gamma (SRIG) with the natural conjugate family of priors, the square root inverted gamma with the non-informative distribution, and the natural conjugate family of priors with the non-informative distribution. The data are generated in three cases from the Rayleigh distribution for different sample sizes (small, medium, and large), and Bayes estimators for the parameter are derived under a squared error
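Under one common parameterization (an assumption here, since the abstract does not show the paper's exact density and prior), the conjugate analysis behind such a Bayes estimator runs as follows:

```latex
% Rayleigh likelihood in the squared-scale parameter \lambda:
f(x \mid \lambda) = \frac{x}{\lambda}\exp\!\Big(-\frac{x^2}{2\lambda}\Big),
\qquad
L(\lambda) \propto \lambda^{-n}\exp\!\Big(-\frac{T}{\lambda}\Big),
\quad T = \tfrac{1}{2}\sum_{i=1}^{n} x_i^2 .
% Inverted-gamma conjugate prior and the resulting posterior:
\pi(\lambda) \propto \lambda^{-(\alpha+1)} e^{-\beta/\lambda}
\;\Longrightarrow\;
\lambda \mid x \sim \mathrm{IG}(\alpha + n,\; \beta + T).
% Bayes estimator under squared error loss = posterior mean:
\hat{\lambda}_{\mathrm{SEL}} = \frac{\beta + T}{\alpha + n - 1}.
```

The SRIG prior on the scale itself corresponds to placing this inverted-gamma prior on the squared scale; the non-informative cases replace the prior with a flat or Jeffreys-type form.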
Objectives: The purpose of the study is to determine the relationship between the training program and the socio-demographic characteristics of patients with peptic ulcers, in order to assess the program's effectiveness on patients' nutritional habits.
Methodology: A quasi-experimental study was conducted at the Center of Gastrointestinal Medicine and Surgery at Al-Diwanyiah Teaching Hospital between January 17 and October 30, 2022. A non-probability sample of 30 patients for the case group and 30 patients for the control group was selected based on the study's criteria. The study instrument was divided into four sections: the first contained 7 questions about demographic information, the second sect
In this study, we compared the LASSO and SCAD methods, two special penalization methods for partial quantile regression models. The Nadaraya-Watson kernel was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but the SCAD method was the best according to the mean squared error (MSE) criterion, after the missing data were estimated using the mean imputation method.
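The Nadaraya-Watson estimator with a rule-of-thumb bandwidth can be sketched as below. The Silverman-style constant 1.06 is an assumption; the abstract does not state which rule of thumb the paper uses.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=None):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    If no bandwidth is given, a Silverman-style rule of thumb is
    used: h = 1.06 * std(x) * n**(-1/5).
    """
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    if h is None:
        h = 1.06 * x_train.std() * len(x_train) ** (-0.2)
    # Gaussian kernel weight for every (evaluation, training) pair
    u = (np.asarray(x_eval, dtype=float)[:, None] - x_train) / h
    w = np.exp(-0.5 * u ** 2)
    # locally weighted average of the responses
    return (w * y_train).sum(axis=1) / w.sum(axis=1)
```

In the paper's setting this estimator handles the non-parametric part of the model, while LASSO or SCAD penalties shrink the parametric quantile-regression coefficients.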
Image classification is the process of finding common features in images from various classes and applying them to categorize and label the images. The main problems of the image classification process are the abundance of images, the high complexity of the data, and the shortage of labeled data, which present the key obstacles. The cornerstone of image classification is evaluating the convolutional features retrieved from deep learning models and training machine learning classifiers on them. This study proposes a new approach of “hybrid learning”, combining deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class
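The hybrid pattern of freezing a convolutional feature extractor and training a classical classifier on its output can be sketched as below. Since loading VGG-16 requires a deep learning framework, a hypothetical random projection stands in for the frozen convolutional base, and a minimal nearest-centroid model stands in for the machine learning classifiers; only the two-stage pipeline itself reflects the paper's approach.

```python
import numpy as np

# Stand-in for VGG-16 feature extraction: in the hybrid pipeline the
# frozen convolutional base maps each image to a fixed-length vector.
# A hypothetical random projection plays that role here.
rng = np.random.default_rng(0)
proj = rng.normal(size=(32 * 32, 128))        # "conv base" weights

def extract_features(images):
    """Flatten images and project them to 128-D feature vectors."""
    return images.reshape(len(images), -1) @ proj

class NearestCentroid:
    """Minimal classical classifier trained on extracted features."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0)
                                    for c in self.classes_])
        return self
    def predict(self, X):
        # squared distance to each class centroid; pick the nearest
        d = ((X[:, None, :] - self.centroids_) ** 2).sum(axis=2)
        return self.classes_[d.argmin(axis=1)]
```

The design point is that the expensive feature extractor is computed once, after which many cheap classical classifiers can be trained and compared on the same feature matrix, which is how a study can benchmark seven of them.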