An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. The objective of Content-Based Image Retrieval (CBIR) techniques is essentially to extract, from large image databases, a specified number of images similar in visual and semantic content to a so-called query image. The researchers developed a new retrieval mechanism based on two procedures. The first procedure extracts statistical features of the original image using the histogram and its statistical characteristics (mean, standard deviation). The second procedure applies the t-test to measure the independence between two or more images (correlation coefficient, t-statistic, significance level, decision). Experimental tests showed that the proposed retrieval technique is more powerful than the classical retrieval system.
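The two procedures can be illustrated with a minimal sketch, assuming toy pixel data and a 16-bin histogram (the bin count, sample images, and significance threshold are illustrative assumptions, not values from the paper):

```python
import math

def hist_features(pixels, bins=16):
    """Grey-level histogram plus mean and standard deviation of an image."""
    hist = [0] * bins
    for v in pixels:                       # v assumed in 0..255
        hist[min(v * bins // 256, bins - 1)] += 1
    n = len(pixels)
    mean = sum(pixels) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in pixels) / n)
    return hist, mean, std

def correlation_t(x, y):
    """Pearson r between two feature vectors and its t statistic,
    t = r * sqrt(n-2) / sqrt(1-r^2); a large |t| rejects independence."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r = sxy / math.sqrt(sxx * syy)
    t = r * math.sqrt(n - 2) / math.sqrt(max(1 - r * r, 1e-12))
    return r, t

# toy 'images' as flat pixel lists; the second is a slightly shifted copy
img_a = [10, 40, 80, 120, 160, 200, 240, 90, 60, 30] * 10
img_b = [min(v + 3, 255) for v in img_a]
ha, _, _ = hist_features(img_a)
hb, _, _ = hist_features(img_b)
r, t = correlation_t(ha, hb)
print(round(r, 2), round(t, 2))
```

Here |t| is compared against the critical value of Student's t with n − 2 degrees of freedom; if |t| exceeds it at the chosen significance level, the candidate image is judged related to the query and is returned.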
Segmentation is the process of partitioning a digital image into different parts depending on texture, color, or intensity, and it is used in many fields to isolate the region to be analysed. In this work, images of the Moon were obtained through observations at the Department of Astronomy and Space, College of Science, University of Baghdad (using a Toward space telescope and a widely used CCD camera). Different segmentation methods were used to segment lunar craters. Craters are formed when celestial objects such as asteroids and meteorites crash into the surface of the Moon. Thousands of craters appear on the Moon's surface, ranging in size from a meter to many kilometers; they provide insights into the age and ge…
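The abstract does not name the specific segmentation methods used, so as one representative intensity-based step, here is a minimal sketch of Otsu thresholding on a toy bimodal image (the pixel values and the crater/mare interpretation are illustrative assumptions):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: pick the grey level that maximises between-class
    variance. One common intensity-based segmentation step; the paper does
    not name its exact methods, so this is an illustrative choice."""
    hist = [0] * levels
    for v in pixels:
        hist[v] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b, w_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]                 # background pixel count
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground pixel count
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# bimodal toy image: dark crater floor (~30) against bright mare (~200)
pixels = [30] * 50 + [35] * 30 + [200] * 60 + [205] * 40
t = otsu_threshold(pixels)
mask = [1 if v > t else 0 for v in pixels]   # 1 = bright terrain
print(t, sum(mask))
```

The binary mask separates the dark crater interior from the brighter surroundings; on real lunar images this is typically followed by morphological cleanup and connected-component labelling of individual craters.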
The research presents a comparative study of semi-parametric estimation methods for the partial linear single-index model using simulation. Two approaches to model estimation are considered: the two-stage procedure and MADE. Simulation was used to study the finite-sample performance of the estimation methods under different single-index models, error variances, and sample sizes, with the mean average squared error used as the comparison criterion. The results showed a preference for the two-stage procedure in all cases considered.
CuO nanoparticles were synthesized in two different ways: first by a precipitation method using copper acetate monohydrate Cu(CO2CH3)2·H2O, glacial acetic acid (CH3COOH), and sodium hydroxide (NaOH), and second by a sol-gel method using copper chloride (CuCl2), sodium hydroxide (NaOH), and ethanol (C2H6O). Scanning electron microscopy (SEM) showed that different CuO nanostructures (spherical and reef-like) can be formed by the precipitation and sol-gel processes, respectively, with particle sizes found to be less than 2 µm. X-ray diffraction (XRD) showed that the pure synthesized powder has no inclusions that may arise during preparation. XRD result…
Colloidal crystals (opals) made of close-packed polymethyl methacrylate (PMMA) spheres were fabricated and grown by template-directed methods to obtain porous materials with well-ordered periodicity and interconnected pore systems for manufacturing photonic crystals. Opals were made from aqueous suspensions of monodisperse PMMA spheres with diameters between 280 and 415 nm. SEM confirmed that the PMMA spheres crystallized uniformly in a face-centered cubic (FCC) array. Optical properties of the synthesized porous PMMA were characterized by UV-Visible spectroscopy, which shows that the colloidal crystals possess pseudo photonic band gaps in the visible region. A combination of Bragg's law of diffraction and Snell's law of refraction was used to calculate t…
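The Bragg-Snell combination mentioned above has a standard closed form for the (111) stop band of an FCC opal: λ_max = 2·d111·√(n_eff² − sin²θ), with d111 = D·√(2/3). A minimal sketch under assumed values (n_PMMA ≈ 1.49 and FCC filling fraction 0.74; neither is stated in the abstract):

```python
import math

def bragg_snell_peak(D_nm, n_sphere=1.49, n_void=1.0, f=0.74, theta_deg=0.0):
    """Estimate the (111) stop-band wavelength of an FCC opal (in nm).

    D_nm: sphere diameter; f: FCC filling fraction (0.74);
    n_sphere ~1.49 for PMMA is an assumed value, not from the abstract.
    """
    d111 = D_nm * math.sqrt(2.0 / 3.0)          # (111) interplanar spacing
    n_eff = math.sqrt(f * n_sphere**2 + (1 - f) * n_void**2)
    theta = math.radians(theta_deg)
    return 2.0 * d111 * math.sqrt(n_eff**2 - math.sin(theta)**2)

# the two sphere diameters quoted in the abstract, at normal incidence
for D in (280, 415):
    print(D, round(bragg_snell_peak(D), 1))
```

Under these assumed indices, 280 nm spheres place the (111) stop band near 630 nm, consistent with the visible-region band gaps reported above; larger spheres shift the band to longer wavelengths.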
Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task-scheduling and resource-allocation algorithms in some depth. The work evaluates five algorithms: Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging an algorithm's efficacy, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. Results from the experiment…
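As an illustration of one of the five metaheuristics, here is a minimal simulated-annealing scheduler that assigns tasks to nodes so as to minimise finish time (makespan). The task costs, cooling schedule, and node count are illustrative assumptions, not the paper's experimental setup:

```python
import math
import random

def makespan(assign, task_cost, n_nodes):
    """Finish time of the busiest node under a task->node assignment."""
    load = [0.0] * n_nodes
    for t, node in enumerate(assign):
        load[node] += task_cost[t]
    return max(load)

def sa_schedule(task_cost, n_nodes, iters=5000, t0=10.0, seed=0):
    """Simulated annealing over task->node assignments, minimising makespan."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_nodes) for _ in task_cost]
    cur_cost = makespan(assign, task_cost, n_nodes)
    best, best_cost = assign[:], cur_cost
    for k in range(iters):
        temp = t0 * (1 - k / iters) + 1e-9       # linear cooling
        i = rng.randrange(len(task_cost))        # move one task to a new node
        old = assign[i]
        assign[i] = rng.randrange(n_nodes)
        new_cost = makespan(assign, task_cost, n_nodes)
        # accept improvements always, worse moves with Boltzmann probability
        if new_cost <= cur_cost or rng.random() < math.exp((cur_cost - new_cost) / temp):
            cur_cost = new_cost
            if new_cost < best_cost:
                best, best_cost = assign[:], new_cost
        else:
            assign[i] = old                      # reject the move
    return best, best_cost

tasks = [4, 7, 2, 5, 9, 3, 6, 1]     # task execution costs (arbitrary units)
plan, finish = sa_schedule(tasks, n_nodes=3)
print(finish)                        # near the lower bound sum(tasks)/3
```

The same assignment encoding and makespan objective serve as the fitness function for the population-based methods (GA, PSO, ACO, FA); only the search strategy changes.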
Mixture experiments suffer from high correlation and linear multicollinearity among the explanatory variables because of the unit-sum constraint and the interactions between components in the model, which strengthens the links between the explanatory variables; this is indicated by the variance inflation factor (VIF). The L-pseudo-component transformation was used to reduce the dependence between the components of the mixture.
To estimate the parameters of the mixture model, we used methods that introduce bias in order to reduce variance, such as the ridge regression method and the Least Absolute Shrinkage and Selection Operator (LASSO) method a…
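The diagnosis-and-shrinkage idea can be sketched with NumPy: compute VIFs for simulated three-component mixture proportions (whose unit-sum constraint makes the VIFs blow up), then fit a closed-form ridge estimate. The simulated data, true coefficients, and penalty λ = 0.1 are illustrative assumptions, not the paper's data:

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column: VIF_j = 1 / (1 - R_j^2),
    where R_j^2 comes from regressing column j on the other columns."""
    Xc = X - X.mean(axis=0)
    out = []
    for j in range(Xc.shape[1]):
        y = Xc[:, j]
        Z = np.delete(Xc, j, axis=1)
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid @ resid / (y @ y)
        out.append(1.0 / max(1.0 - r2, 1e-12))   # clip to avoid divide-by-zero
    return np.array(out)

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam*I)^-1 X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# three-component mixture proportions: rows sum to 1, so the columns are
# linearly dependent and ordinary least squares is ill-conditioned
rng = np.random.default_rng(0)
a = rng.dirichlet([2.0, 2.0, 2.0], size=40)
y = a @ np.array([1.0, 2.0, 3.0]) + 0.05 * rng.standard_normal(40)

print(vif(a).round(1))               # huge VIFs from the unit-sum constraint
print(ridge(a, y, lam=0.1).round(2))
```

The ridge penalty stabilises the estimate at the cost of a small bias toward zero; LASSO replaces the squared penalty with an absolute-value one and additionally zeroes out weak coefficients.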
The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs. Sensed data without an accurate location are worthless, especially in critical applications. The pioneering technique among range-free localization schemes is the sequential Monte Carlo (SMC) method, which uses network connectivity to estimate sensor locations without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of localization operation in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages…
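The core SMC idea, sampling candidate positions and filtering them against connectivity constraints, can be sketched as follows. The anchor layout and radio range are illustrative assumptions, and this toy shows a single sampling/filtering step rather than a full mobile-node tracker:

```python
import math
import random

def smc_estimate(anchors, radius, n_samples=2000, seed=1):
    """Range-free SMC step: sample candidate positions, keep those consistent
    with hearing every one-hop anchor (distance <= radius), then average the
    survivors. Assumes the anchors' coverage disks actually overlap.

    anchors: [(x, y)] positions of anchors the node can hear directly.
    """
    rng = random.Random(seed)
    # bounding box of the feasible region: within `radius` of every anchor
    xs = [a[0] for a in anchors]
    ys = [a[1] for a in anchors]
    lo_x, hi_x = max(xs) - radius, min(xs) + radius
    lo_y, hi_y = max(ys) - radius, min(ys) + radius
    kept = []
    while len(kept) < n_samples:
        p = (rng.uniform(lo_x, hi_x), rng.uniform(lo_y, hi_y))
        if all(math.dist(p, a) <= radius for a in anchors):  # filtering step
            kept.append(p)
    x = sum(p[0] for p in kept) / len(kept)
    y = sum(p[1] for p in kept) / len(kept)
    return x, y

# a node hears three anchors with radio range 10; true position near (5, 5)
est = smc_estimate([(0, 0), (10, 0), (5, 9)], radius=10)
print(tuple(round(c, 1) for c in est))
```

Full SMC localization schemes repeat this prediction/filtering cycle over time, carrying samples forward with a mobility model; the surveyed schemes differ mainly in how they draw and weight these samples.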