In recent years there has been a profound evolution in computer science and technology spanning several fields. Within this evolution, Content-Based Image Retrieval (CBIR) belongs to the image processing field. Advances in image retrieval methods have made feature extraction far easier, and finding effective image retrieval tools has therefore become a broad area of research interest. An image retrieval technique is a system used to search for and retrieve images from a huge database of digital images. In this paper, the author proposes a new method for image retrieval. The Convolutional Neural Network-Slantlet Transform (CNN-SLT) model uses the Slantlet Transform (SLT) to produce multiple representations of an image for the Convolutional Neural Network (CNN). The CBIR system was then evaluated and the outcomes benchmarked. The results clearly show that the proposed technique generally outperformed the others, with an accuracy of 89 percent across the three datasets used in our experiments. This remarkable performance shows that the CNN-SLT method worked well for all three datasets, with the earlier phase (CNN) and the subsequent phase (CNN-SLT) working together harmoniously.
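The multi-representation idea can be sketched with a one-level 2D Haar decomposition standing in as a simplified surrogate for the Slantlet transform (which has no common library implementation); the subband names, image size, and channel stacking below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def haar2d(img):
    # One-level 2D Haar decomposition: average/difference along rows,
    # then along columns, yielding four half-resolution subbands.
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row details
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

img = np.random.rand(64, 64)                  # stand-in grayscale image
subbands = haar2d(img)
# Stack the subbands as channels to form a multi-representation CNN input.
cnn_input = np.stack(subbands, axis=0)        # shape (4, 32, 32)
```

A CNN would then consume `cnn_input` as a four-channel image instead of the raw pixels.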
Prenatal markers are commonly used in practice to screen for some foetal abnormalities. They can be biochemical or ultrasonic markers, in addition to the newly used cell-free Deoxyribonucleic Acid (DNA) estimation. This review aimed to illustrate the applications of prenatal screening and the reliability of these tests in detecting chromosomal abnormalities such as trisomy-21, trisomy-18, and trisomy-13, in addition to neural tube defects. Prenatal markers can also be used to anticipate some obstetric complications, depending on the levels of these markers in the mother's circulation. In developed countries, prenatal screening tests are routinely used during the antenatal care period. Neural tube defects, numer…
Estimating the productivity of a ready-mixed concrete batch plant is an essential tool for the successful completion of the construction process. Productivity is defined as the output of the system per unit of time. Usually, the actual productivity values of construction equipment on site are not consistent with the nominal ones. Therefore, it is necessary to make a comprehensive evaluation of the nominal productivity of equipment with respect to the affecting factors and then re-evaluate it according to the actual values.
In this paper, the forecasting system employed is an Artificial Intelligence (AI) technique, represented by an Artificial Neural Network (ANN), used to establish a predictive model that estimates wet ready-mixe…
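A minimal back-propagation network of the kind described can be sketched in plain NumPy; the input features (truck capacity, haul distance, crew size), their scaling, and the synthetic target are hypothetical stand-ins for the plant data, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, synthetic plant data: three scaled input factors mapped to
# a scaled productivity value (all in [0, 1] for stable training).
X = rng.random((200, 3))
y = (0.4 * X[:, 0] - 0.1 * X[:, 1] + 0.05 * X[:, 2]).reshape(-1, 1)

# One hidden layer, trained with plain back-propagation (full batch).
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros((1, 1))

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2, h

mse_before = float(np.mean((forward(X)[0] - y) ** 2))
lr = 0.1
for _ in range(1000):
    pred, h = forward(X)
    err = pred - y                          # gradient of 0.5 * MSE
    gW2 = h.T @ err / len(X); gb2 = err.mean(0, keepdims=True)
    dh = (err @ W2.T) * (1.0 - h ** 2)      # back-prop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse_after = float(np.mean((forward(X)[0] - y) ** 2))
```

In practice the inputs would be the real effecting factors from site records, normalized before training.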
This study investigated the optimization of the wear behavior of AISI 4340 steel based on the Taguchi method under various testing conditions. In this paper, a neural network and the Taguchi design method were implemented to minimize the wear rate of 4340 steel. A back-propagation neural network (BPNN) was developed to predict the wear rate. In developing the predictive model, wear parameters such as sliding speed, applied load, and sliding distance were considered as the input variables for the AISI 4340 steel model. An analysis of variance (ANOVA) was used to determine the significant parameters affecting the wear rate. Finally, the Taguchi approach was applied to determine…
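The Taguchi criterion for a response to be minimized, such as wear rate, is the "smaller-the-better" signal-to-noise ratio; a minimal sketch follows, with the wear-rate replicates below being invented illustrative numbers, not the study's measurements:

```python
import numpy as np

def sn_smaller_is_better(y):
    # Taguchi "smaller-the-better" S/N ratio: -10 * log10(mean(y^2)).
    # The parameter setting with the HIGHER S/N ratio is preferred.
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical wear-rate replicates for two parameter settings
run_a = [0.12, 0.11, 0.13]  # lower wear
run_b = [0.20, 0.22, 0.19]  # higher wear
```

Comparing `sn_smaller_is_better(run_a)` with `sn_smaller_is_better(run_b)` ranks the settings; the lower-wear run scores higher.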
In this paper, we used four classification methods to classify objects and compared these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classifying and detecting objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, then enhanced using the histogram equalization method and resized to 20 × 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification metho…
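The PCA-plus-four-classifiers pipeline can be sketched with scikit-learn; the small built-in digits images stand in for the abstract's dataset, and the 30-component PCA is an assumed value:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier

# Small grayscale image set as a stand-in for the paper's data.
X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)  # 7:3 split

# PCA feature extraction, fitted on the training set only.
pca = PCA(n_components=30).fit(Xtr)
Xtr_p, Xte_p = pca.transform(Xtr), pca.transform(Xte)

# The four classifiers compared in the abstract.
models = {
    "KNN": KNeighborsClassifier(),
    "SGD": SGDClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
}
scores = {name: m.fit(Xtr_p, ytr).score(Xte_p, yte) for name, m in models.items()}
```

Each entry of `scores` is the test accuracy of one method, which is the comparison the paper performs.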
The aim of this research was to analyze the financial reporting requirements for segmental information stipulated by the Iraqi accounting rules, investigating the extent of their compliance with the requirements of International Financial Reporting Standard No. 8 (IFRS 8) and the Statement of Financial Accounting Standards No. 131 (SFAS 131). The research also aimed to identify segmental disclosure practices in corporations listed on the Iraq Stock Exchange (ISX), based on a hypothesis stating that "the insufficiency of the Iraqi financial reporting requirements for segmental information affect…
Dust is a common cause of health risks and also a contributor to climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using machine learning, within a supervised learning framework built on five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c…
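The regression setup can be illustrated as follows; the synthetic features (standing in for meteorological variables) and their weights are assumptions, since the IMOS data is not reproduced here:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical meteorological features (e.g. wind speed, humidity, temperature)
# driving a noisy synthetic dust-level target.
X = rng.random((500, 3))
y = 30 * X[:, 0] - 15 * X[:, 1] + 5 * X[:, 2] + rng.normal(0, 1, 500)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
gbr = GradientBoostingRegressor(random_state=0).fit(Xtr, ytr)
mse = mean_squared_error(yte, gbr.predict(Xte))
```

The same train/evaluate loop would be repeated for the other four regressors to produce the comparison table of mean square errors.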
Currently, one of the topical areas of application of machine learning methods is the prediction of material characteristics. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions of metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for solving some important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem with CatBoost Regressor has been carried out. The object of…
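To illustrate tree-based regression on a relaxation curve, a scikit-learn decision tree is used below as a stand-in for CatBoost (whose library is not assumed available); the exponential curve, its parameters, and the tree depth are all illustrative assumptions:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical stress-relaxation curve: sigma(t) = sigma0 * exp(-t / tau)
sigma0, tau = 5.0, 2.0
t = np.linspace(0.0, 10.0, 300).reshape(-1, 1)
sigma = sigma0 * np.exp(-t.ravel() / tau)

# A shallow regression tree fits the curve piecewise-constant.
tree = DecisionTreeRegressor(max_depth=5).fit(t, sigma)
mse = float(np.mean((tree.predict(t) - sigma) ** 2))
```

A gradient-boosted ensemble such as CatBoost refines the same idea by summing many shallow trees fitted to residuals.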
In recent years many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM), while others depend on a filtration technique, such as Discrete Variations (DV), Variance versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was carried out through a simulation study to find the most efficient method according to MASE. The results of the simulation experiments showed that the performance of the meth…
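A minimal sketch of the rescaled-range (R/S) estimator, one of the nine methods compared, is shown below; the dyadic block sizes and simple averaging are simplifying assumptions rather than the paper's exact procedure:

```python
import numpy as np

def rs_hurst(x, min_chunk=8):
    # R/S estimate of the Hurst parameter: for each block size, average
    # the rescaled range R/S over blocks, then regress log(R/S) on
    # log(size); the slope is the Hurst estimate.
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation
            r = dev.max() - dev.min()           # range R
            s = seg.std()                       # scale S
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
h = rs_hurst(rng.normal(size=4096))  # white noise: H should be near 0.5
```

For uncorrelated noise the estimate lands near 0.5, while long-memory series push it toward 1.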
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that first, the crow optimization algorithm detects noise pixels, and then replaces them with an optimum median value according to a criterion that maximizes a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the resul…
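The detect-then-replace idea can be sketched without the crow-search optimization step: this plain version flags pixels at the intensity extremes as probable salt/pepper noise and replaces only those with the 3×3 neighborhood median, then scores the result with PSNR. The noise rate and image values are illustrative assumptions:

```python
import numpy as np

def psnr(a, b):
    # Peak signal-to-noise ratio for 8-bit images, in dB.
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)

def detect_and_median(img):
    # Replace only extreme-valued pixels (likely salt or pepper) with the
    # median of their 3x3 neighborhood; the optimization step of the
    # paper's OMF filter is omitted in this simplified sketch.
    out = img.copy()
    pad = np.pad(img, 1, mode="edge")
    noisy = (img == 0) | (img == 255)
    for i, j in zip(*np.nonzero(noisy)):
        out[i, j] = np.median(pad[i:i + 3, j:j + 3])
    return out

rng = np.random.default_rng(0)
clean = rng.integers(40, 200, (64, 64)).astype(np.uint8)   # noise-free image
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.1                       # 10% noise density
noisy[mask] = rng.choice([0, 255], size=int(mask.sum()))
restored = detect_and_median(noisy)
```

Restricting replacement to detected pixels preserves clean detail, which is why detection-based median filters outperform filtering every pixel.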