Facial recognition has long been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially law enforcement and security. The human face is a viable biometric that can be used effectively for both identification and verification. Thus far, regardless of the facial model and metrics employed, the main shortcoming is that a facial image is required against which the comparison is made. Therefore, closed-circuit televisions and a facial database are always needed in an operational system. Over the last few decades, unfortunately, we have seen the emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no cameras installed, possibly by persons whose photos have never been kept in any official database prior to the event. During subsequent investigations, the authorities thus have to rely on traumatized and frustrated witnesses, whose testimonial accounts of a suspect's appearance are dubious and often misleading. To address this issue, this paper presents an application of a statistical appearance model of the human face for assisting suspect identification based on a witness's visual recollection. An online prototype system was implemented to demonstrate its core functionalities. Both the visual and the numerical assessments reported herein indicate the potential benefits of the system for the intended purpose.
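A statistical appearance model of the kind the abstract describes typically represents faces as vectors, extracts the main modes of variation with principal component analysis, and synthesizes new candidate faces as the mean face plus weighted modes that a witness can steer. A minimal sketch of that generative step, using tiny hypothetical "face" vectors and power iteration for the leading mode (the data, dimensions, and names are illustrative, not the paper's):

```python
import math

# Hypothetical "face" vectors (e.g., flattened landmark coordinates).
faces = [
    [2.0, 1.0, 0.5, 1.5],
    [2.2, 1.1, 0.4, 1.6],
    [1.8, 0.9, 0.6, 1.4],
    [2.4, 1.2, 0.3, 1.7],
    [1.6, 0.8, 0.7, 1.3],
]

dim = len(faces[0])
mean = [sum(f[j] for f in faces) / len(faces) for j in range(dim)]
centered = [[f[j] - mean[j] for j in range(dim)] for f in faces]

# Sample covariance matrix, then its leading eigenvector by power iteration:
# that eigenvector is the first "mode of appearance variation".
cov = [[sum(c[i] * c[j] for c in centered) / (len(faces) - 1)
        for j in range(dim)] for i in range(dim)]
v = [1.0] * dim
for _ in range(200):
    w = [sum(cov[i][j] * v[j] for j in range(dim)) for i in range(dim)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

def synthesize(b):
    # New face = mean face + b * first mode; sweeping b generates plausible
    # faces, which is how a witness could steer the reconstruction.
    return [mean[j] + b * v[j] for j in range(dim)]

print(synthesize(0.5))
```

In a real system the vectors would be high-dimensional shape and texture samples and several modes would be retained, but the generative equation is the same.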
Researchers have shown increasing interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy in estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated from the sample size given by each method using an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data.
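The Bennett-inequality route to a sample size can be sketched as follows: for bounded observations, the inequality gives an explicit tail bound on the sample mean, and the smallest n for which that bound drops below the desired risk level is the required sample size. A minimal sketch with assumed variance, bound, precision, and confidence values (the paper's exact formulation may differ):

```python
import math

def bennett_sample_size(sigma2, b, eps, delta):
    """Smallest n such that Bennett's two-sided bound
    2 * exp(-(n * sigma2 / b**2) * h(b * eps / sigma2)) <= delta,
    where h(u) = (1 + u) * ln(1 + u) - u, for i.i.d. observations with
    variance sigma2 and |X - mu| <= b, at precision eps."""
    u = b * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u
    return math.ceil((b ** 2 / (sigma2 * h)) * math.log(2 / delta))

# Assumed inputs: variance 0.25, range bound 1, precision 0.05, risk 5%.
n = bennett_sample_size(sigma2=0.25, b=1.0, eps=0.05, delta=0.05)
print(n)
```

The returned n is the first sample size at which the Bennett tail bound guarantees the stated precision with the stated confidence.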
The aim of this research is to measure the effect of the Adey-Shire model on the achievement and critical thinking of first-intermediate female students in mathematics. The researcher adopted the experimental method with a post-test. The research sample consisted of 60 female students divided into two groups: 30 students in the experimental group, who studied with the Adey-Shire model, and 30 students in the control group, who studied in the usual way. The two groups were equivalent on several variables. The researcher constructed two multiple-choice tests: an achievement test consisting of 30 items and a critical thinking test of 25 items. The statistical analysis of both tests was carried out with s…
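A post-test comparison of two independent groups like the one described is conventionally done with a pooled-variance t statistic. A minimal sketch with hypothetical scores (not the study's data):

```python
import math

def pooled_t(group_a, group_b):
    # Independent-samples t statistic with pooled variance, the usual test for
    # comparing post-test means of an experimental and a control group.
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

# Hypothetical post-test scores, for illustration only.
experimental = [24, 26, 22, 27, 25, 23, 28, 26]
control = [20, 21, 19, 23, 22, 18, 21, 20]
t, df = pooled_t(experimental, control)
print(round(t, 3), df)
```

The computed t is then compared against the critical value for the given degrees of freedom at the chosen significance level.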
In this work, the electron number density was calculated using a MATLAB program, together with the algorithm on which the program is based. The electron density was calculated using the Anisimov model in a vacuum environment, and the effect of the spatial coordinates on the electron density was investigated. It was found that distance along the Z axis affects the electron number density (ne). Many processes within the plasma, such as excitation, ionization, and recombination, can affect the electron density. The results show that as the Z-axis distance increases, the electron number density decreases because of the recombination of electrons and ions at large distances from the target and the loss of thermal energy of the electrons in …
In this research, we study the inverse Gompertz distribution (IG) and estimate its survival function. The survival function was evaluated using three methods (the maximum likelihood, least squares, and percentile estimators), and the best estimation method was chosen. It was found that the best method for estimating the survival function is the least-squares method, because it has the lowest IMSE for all sample sizes.
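Under one common parameterization of the inverse Gompertz distribution (an assumption; the paper's exact form may differ), the survival function and the IMSE criterion used to rank the estimators can be sketched as:

```python
import math

def surv_inv_gompertz(x, theta, lam):
    # Survival function under one common parameterization (an assumption):
    # S(x) = 1 - exp(-(theta/lam) * (exp(lam/x) - 1)), x > 0.
    return 1.0 - math.exp(-(theta / lam) * (math.exp(lam / x) - 1.0))

def imse(est_params, true_params, grid):
    # Integrated mean squared error between an estimated and the true survival
    # curve, approximated on an equally spaced grid: the comparison criterion.
    step = grid[1] - grid[0]
    return step * sum(
        (surv_inv_gompertz(x, *est_params) - surv_inv_gompertz(x, *true_params)) ** 2
        for x in grid
    )

grid = [0.1 + 0.01 * i for i in range(500)]
# Hypothetical estimated vs. true parameters, for illustration only.
print(imse((1.1, 0.9), (1.0, 1.0), grid))
```

In the study, each estimator's fitted parameters would replace the hypothetical pair, and the method with the smallest IMSE wins.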
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems whose functions must adapt to changing needs to gain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives the emergence of new challenges in the discipline of requirements engineering. The problem in this study was data discrepancies, which hampered the needs-elicitation process, so that in the end the developed software exhibited discrepancies and could not meet the need…
Compressing speech reduces data storage requirements and shortens the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2-D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp…
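The wavelet-compression idea can be illustrated with the simplest orthogonal wavelet, the Haar: transform a frame, zero the small detail coefficients, and reconstruct from what survives. This is a generic single-level sketch, not the paper's MCT/GHM construction:

```python
import math

def haar_dwt(signal):
    # One level of the Haar DWT: approximation and detail coefficients.
    a = [(signal[2*i] + signal[2*i + 1]) / math.sqrt(2) for i in range(len(signal) // 2)]
    d = [(signal[2*i] - signal[2*i + 1]) / math.sqrt(2) for i in range(len(signal) // 2)]
    return a, d

def haar_idwt(a, d):
    # Inverse Haar DWT (perfect reconstruction when nothing is discarded).
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / math.sqrt(2))
        out.append((ai - di) / math.sqrt(2))
    return out

def compress(signal, threshold):
    # Zero out small detail coefficients; only surviving ones need storing.
    a, d = haar_dwt(signal)
    d_t = [x if abs(x) > threshold else 0.0 for x in d]
    kept = len(a) + sum(1 for x in d_t if x != 0.0)
    return haar_idwt(a, d_t), kept / len(signal)  # reconstruction, kept fraction

# A toy "speech" frame: a slowly varying tone, so neighbouring samples are close.
frame = [math.sin(2 * math.pi * 5 * t / 256) for t in range(256)]
recon, ratio = compress(frame, threshold=0.05)
err = max(abs(x - y) for x, y in zip(frame, recon))
print(ratio, err)
```

Because neighbouring speech samples are correlated, most detail coefficients are small, so thresholding discards many coefficients while keeping the reconstruction error bounded by the threshold.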
Estimating the unknown parameters of a 2-D sinusoidal signal model is an important and difficult problem. Because it is difficult to estimate all the parameters of this type of model at the same time, we propose a sequential nonlinear least-squares method and a sequential robust M method, developed by applying the sequential approach of the estimator suggested by Prasad et al. to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components, but relying on the Downhill Simplex algorithm to solve the nonlinear equations for the nonlinear parameters, which represent the frequencies, and then using the least-squares formula to estimate …
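The split the abstract describes can be sketched for a single noiseless 2-D component: for fixed frequencies the model is linear in the amplitudes, so they have a closed-form least-squares solution, leaving only the two frequencies to a nonlinear search. A coarse grid search stands in here for the Downhill Simplex refinement used in the paper, and the data are synthetic:

```python
import math

# Single-component 2-D sinusoid: y(m, n) = A*cos(l*m + u*n) + B*sin(l*m + u*n).
M = N = 16
A_true, B_true, l_true, u_true = 2.0, 1.0, 0.9, 1.4
y = [[A_true * math.cos(l_true*m + u_true*n) + B_true * math.sin(l_true*m + u_true*n)
      for n in range(N)] for m in range(M)]

def amplitudes(l, u):
    # For fixed frequencies the model is linear in (A, B): closed-form 2x2 LS.
    scc = sss = scs = scy = ssy = 0.0
    for m in range(M):
        for n in range(N):
            c, s = math.cos(l*m + u*n), math.sin(l*m + u*n)
            scc += c*c; sss += s*s; scs += c*s
            scy += c * y[m][n]; ssy += s * y[m][n]
    det = scc * sss - scs * scs
    return (sss*scy - scs*ssy) / det, (scc*ssy - scs*scy) / det

def rss(l, u):
    # Residual sum of squares at candidate frequencies (A, B profiled out).
    A, B = amplitudes(l, u)
    return sum((y[m][n] - A*math.cos(l*m + u*n) - B*math.sin(l*m + u*n)) ** 2
               for m in range(M) for n in range(N))

# Coarse grid search over (l, u); a Downhill Simplex step would refine this.
grid = [i * 0.1 for i in range(1, 31)]
l_hat, u_hat = min(((l, u) for l in grid for u in grid), key=lambda p: rss(*p))
A_hat, B_hat = amplitudes(l_hat, u_hat)
print(l_hat, u_hat, round(A_hat, 3), round(B_hat, 3))
```

The sequential scheme repeats this for each component, subtracting the fitted component from the data before estimating the next.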
A developed financial system is essential for increasing economic growth and reducing poverty in the world. Financial development helps reduce poverty indirectly via an intermediate channel, namely economic growth: it enhances economic development by mobilizing savings and channeling them to the most efficient uses with the highest economic and social returns. Economic growth, in turn, reduces poverty through two channels. The first is direct, by increasing the production factors held by the poor and improving conditions in the sectors and areas where the poor live. The second is indirect, through redistribution of the incomes realized from economic growth as well as the realiz…
In this work, functionally graded materials were synthesized by a centrifugal technique at different volume fractions (0.5, 1, 1.5, and 2% Vf) with a rotation speed of 1200 rpm and a constant rotation time, T = 6 min. The mechanical properties of the graded and non-graded nanocomposites and the pure epoxy material were characterized. The mechanical tests showed that adding alumina (Al2O3) nanoparticles, whether graded or non-graded, enhanced the properties relative to pure epoxy. The maximum difference in impact strength occurred in the FGM loaded from the nano-alumina-rich side, where the maximum value, at 1% Vf, was 133.33% of that on the epoxy side of the sample. The flexural strength and Young's modulus of the fu…