Eye detection is used in many applications, such as pattern recognition, biometrics, and surveillance systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, an observed geometric shape is perceptually "meaningful" if its number of occurrences in an image with a random distribution is very small. Complementing this, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns; in general, humans see things through a general description of those things. This paper utilizes these two principles to recognize and extract the eye region from an image. The Java programming language and the OpenCV image-processing library are used for this purpose. The proposed method obtains good results: a detection rate of 88.89%, with an average execution time of about 0.23 seconds.
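For orientation, the sketch below shows eye detection and extraction with OpenCV's Java bindings. It uses OpenCV's stock Haar-cascade eye detector rather than the paper's Helmholtz/Gestalt method, and the file paths are assumptions; it only illustrates the detect-then-extract pipeline the abstract describes.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.objdetect.CascadeClassifier;

public class EyeDetect {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // load the native OpenCV library
        // Input image path is an assumption for this sketch.
        Mat img = Imgcodecs.imread("face.jpg", Imgcodecs.IMREAD_GRAYSCALE);
        // haarcascade_eye.xml ships with OpenCV; its location is an assumption.
        CascadeClassifier eyes = new CascadeClassifier("haarcascade_eye.xml");
        MatOfRect found = new MatOfRect();
        eyes.detectMultiScale(img, found);          // detect candidate eye regions
        for (Rect r : found.toArray()) {
            Mat eye = img.submat(r);                // extract the eye region
            Imgcodecs.imwrite("eye_" + r.x + "_" + r.y + ".png", eye);
        }
    }
}
```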
In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). The reaction conditions were an acid concentration of 2% by weight and an acid-volume-to-ore-weight ratio of 5 ml/g. The reaction time was varied from 2 to 30 minutes (in steps of 2 minutes) to determine the reaction rate constant k based on the change in calcite concentration. A further investigation was carried out, varying the reaction temperature from 25 to 65 °C, to determine the value of the activation energy. The kinetic data showed that the selective leaching was controlled by the surface chemical reaction.
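As a hedged illustration of this kinetic treatment: when leaching is controlled by the surface chemical reaction, the shrinking-core model is the standard rate form, and the activation energy follows from an Arrhenius plot of the fitted rate constants. The equations below are these textbook forms, not expressions taken from the paper.

```latex
% Surface-chemical-reaction-controlled shrinking-core model (assumed standard form):
% x = fraction of calcite dissolved at time t, k = apparent rate constant.
1 - (1 - x)^{1/3} = k\,t
% Arrhenius relation: fitting \ln k against 1/T over 25--65 °C yields E_a.
k = A\,e^{-E_a/(RT)}
\qquad\Longrightarrow\qquad
\ln k = \ln A - \frac{E_a}{R}\cdot\frac{1}{T}
```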
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms.
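As a point of reference for the entropy-discretization component, the minimal Java sketch below finds the binary cut point on a sorted numeric feature that minimizes the weighted label entropy. It operates on a plain array rather than the paper's multi-resolution summarization structure, so treat it as an illustration of the criterion only.

```java
public class EntropySplit {
    // Shannon entropy (base 2) of binary labels in labels[lo..hi)
    static double entropy(int[] labels, int lo, int hi) {
        int pos = 0;
        for (int i = lo; i < hi; i++) if (labels[i] == 1) pos++;
        int n = hi - lo;
        double p = (double) pos / n;
        if (p == 0 || p == 1) return 0.0;
        return -p * Math.log(p) / Math.log(2) - (1 - p) * Math.log(1 - p) / Math.log(2);
    }

    // Returns the index of the cut that minimizes weighted entropy;
    // assumes the (value, label) pairs are already sorted by value.
    static int bestCut(double[] values, int[] labels) {
        int n = values.length, best = -1;
        double bestH = Double.MAX_VALUE;
        for (int c = 1; c < n; c++) {
            double h = (c * entropy(labels, 0, c) + (n - c) * entropy(labels, c, n)) / n;
            if (h < bestH) { bestH = h; best = c; }
        }
        return best;
    }

    public static void main(String[] args) {
        double[] v = {1.0, 2.0, 3.0, 10.0, 11.0, 12.0};
        int[] y =   {0,   0,   0,   1,    1,    1};
        int cut = bestCut(v, y);
        System.out.println("cut between " + v[cut - 1] + " and " + v[cut]);
    }
}
```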
Placing large-scale geometric roughness elements on the bed of an open channel has previously been shown to change the hydraulic conditions of the flow, since these elements impose additional resistance on the flow. The geometry of the roughness elements, the number used, and their configuration are parameters that can affect the hydraulic flow characteristics. The aim here is to use inclined block elements to control the salt-wedge propagation observed in most estuaries and so prevent its negative effects. Computational fluid dynamics (CFD) software was used to simulate the two-phase flow in an estuary model. In this model, the block elements used have a 2 cm by 3 cm cross-section with a face inclined in the flow direction, with a length
Data-Driven Requirements Engineering (DDRE) represents a vision of a shift from the static, traditional methods of requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain the trust of their users, an approach embedded in a continuous software engineering process is needed. This need drives the emergence of new challenges in the discipline of requirements engineering. The problem addressed in this study is data discrepancies that hamper the needs-elicitation process, so that the developed software ultimately contains discrepancies and cannot meet the needs of its users.
The transportation model is a well-recognized algorithm applied to the distribution of products in enterprise logistics operations. Multiple algorithmic and technological forms of solution are applied to determine the optimal allocation of a single type of product. In this research, the transportation model is formulated in general terms by means of linear programming, so that the optimal solution integrates different types of related products; a dynamic, easy-to-follow digital illustration in the Excel QM program develops understanding of the computed solution and supports implementing the model in the organization.
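For reference, the classical single-product transportation model that this work generalizes is the linear program below; the notation is the textbook one and is not taken from the paper.

```latex
% Classical transportation model (standard form; notation assumed):
% x_{ij} = quantity shipped from source i to destination j,
% c_{ij} = unit shipping cost, a_i = supply at i, b_j = demand at j.
\min \sum_{i=1}^{m}\sum_{j=1}^{n} c_{ij} x_{ij}
\quad \text{s.t.} \quad
\sum_{j=1}^{n} x_{ij} \le a_i \;\; (i = 1,\dots,m), \qquad
\sum_{i=1}^{m} x_{ij} \ge b_j \;\; (j = 1,\dots,n), \qquad
x_{ij} \ge 0 .
```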
In this paper, a comparison is made between the tree regression model and negative binomial regression. These models represent two types of statistical methods: the first, tree regression, is a nonparametric method that aims to divide the data set into subgroups; the second, negative binomial regression, is a parametric method that is usually used with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using simulation experiments with different sample sizes.
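As a hedged reminder of the quantities involved, the comparison criterion and the standard negative binomial regression specification (with a log link, assumed here) are:

```latex
% Comparison criterion (standard definition):
\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2
% Negative binomial regression with log link (standard form, assumed here),
% where \theta is the dispersion parameter:
\mathbb{E}[y_i] = \mu_i, \qquad
\operatorname{Var}(y_i) = \mu_i + \frac{\mu_i^2}{\theta}, \qquad
\log \mu_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} .
```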
Image compression is a suitable technique to reduce the storage space of an image, increase the available storage on a device, and speed up the transmission process. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law as a condition to distinguish uniform blocks (i.e., blocks with low and constant detail) from non-uniform blocks in the original image. All elements in the bitmap of each uniform block are then set to zero. After that, a lossless method, run-length encoding, is used to further compress the bits representing the bitmaps of these uniform blocks. With this simple idea, the compression performance is improved.
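To make the block-level idea concrete, the Java sketch below encodes one 4x4 AMBTC block and applies a Weber's-law uniformity test before emitting the bitmap. The Weber fraction used as the threshold, and the exact form of the test, are assumptions for illustration; the paper's precise condition may differ.

```java
public class AmbtcUniform {
    static final double WEBER = 0.03; // Weber fraction; this threshold value is an assumption

    // Encode one flattened 4x4 block: returns {low mean, high mean, 16-bit bitmap}.
    // A block is treated as uniform when its contrast relative to its mean
    // falls below the Weber threshold; its bitmap is then forced to zero so
    // that run-length coding can compress it further.
    static int[] encodeBlock(int[] px) {
        double mean = 0;
        for (int p : px) mean += p;
        mean /= px.length;

        int hiSum = 0, hiN = 0, loSum = 0, loN = 0, bitmap = 0;
        for (int i = 0; i < px.length; i++) {
            if (px[i] >= mean) { hiSum += px[i]; hiN++; bitmap |= 1 << i; }
            else               { loSum += px[i]; loN++; }
        }
        int hi = hiN > 0 ? hiSum / hiN : (int) mean; // high mean (pixels >= mean)
        int lo = loN > 0 ? loSum / loN : (int) mean; // low mean  (pixels <  mean)

        // Weber's-law uniformity test: contrast below the just-noticeable
        // difference -> treat the block as uniform and zero the bitmap.
        if (mean > 0 && (hi - lo) / mean < WEBER) {
            bitmap = 0;
            hi = lo = (int) Math.round(mean);
        }
        return new int[] { lo, hi, bitmap };
    }

    public static void main(String[] args) {
        int[] flat = new int[16];
        java.util.Arrays.fill(flat, 120);   // near-constant (uniform) block
        flat[3] = 122;
        int[] coded = encodeBlock(flat);
        System.out.printf("lo=%d hi=%d bitmap=%d%n", coded[0], coded[1], coded[2]);
    }
}
```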