Iris research focuses on developing techniques for identifying and locating relevant biometric features, achieving accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods rely on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. In tests, the method achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 s, and high byte-saving performance for 2 non-square iris images of sizes 640x480 and 2048x1536, reaching 76 KB / 2.2 s and 1630 KB / 4.71 s respectively. Finally, the proposed technique outperforms the standard lossless JPEG2000 compression technique, with a reduction of about 1.2 KB and more in saved bytes, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
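The bit-plane decomposition step described above can be sketched as follows. This is a minimal illustration assuming a standard 8-bit grayscale input; the paper's exact plane-selection criterion and the Hexadata packing scheme are not reproduced here.

```python
import numpy as np

def bit_planes(gray: np.ndarray) -> list:
    """Decompose an 8-bit grayscale image into its 8 binary bit planes
    (index 0 = least significant bit, index 7 = most significant)."""
    return [(gray >> b) & 1 for b in range(8)]

# Toy 2x2 "image" standing in for a 256x256 eye image.
img = np.array([[200, 13],
                [255, 0]], dtype=np.uint8)

planes = bit_planes(img)
# The most significant planes carry most of the coarse structure,
# so a plane-selection step would typically keep planes[7], planes[6], ...
msb = planes[7]
```

Selecting only the most significant planes discards low-order noise bits, which is what makes the subsequent segmentation and lossless encoding tractable.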
In our research, we address one of the most important issues in linguistic studies of the Holy Qur'an: words that are close in meaning, which some believe to be synonyms but which are not considered synonyms in the Arabic language, because there are subtle differences between them. Synonyms in the Arabic language are very few, indeed rare, and in the Holy Qur'an they are entirely absent. We examine how these words that are close in meaning were rendered in Almir Kuliev's translation of the Holy Qur'an into the Russian language.
A space X is called πp-normal if for each closed set F and each π-closed set F′ in X with F ∩ F′ = ∅, there exist p-open sets U and V of X with U ∩ V = ∅ such that F ⊆ U and F′ ⊆ V. Our work studies and discusses a new kind of normality in generalized topological spaces. We define ϑπp-normal, ϑ-mildly normal, ϑ-almost normal, ϑp-normal, ϑ-mildly p-normal, ϑ-almost p-normal, and ϑπ-normal spaces, and we discuss some of their properties.
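The separation condition above can be stated compactly in symbols (a restatement of the definition already given, not new material):

```latex
% X is \pi p-normal iff, for every closed set F and every
% \pi-closed set F' in X:
F \cap F' = \emptyset
\;\Longrightarrow\;
\exists\, U, V \ \text{$p$-open in } X:\;
F \subseteq U,\quad F' \subseteq V,\quad U \cap V = \emptyset .
```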
Long before the pandemic, the labour force all over the world was facing the question of uncertainty, which is normal and inherent to the market, but the extent of this uncertainty was shaped by the pace of acceleration of technological progress, which became exponential in the last ten years, from 2010 to 2020. Robotic process automation, remote work, computer science, electronics and communications, mechanical engineering, information technology, digitalisation of public administration and so on are among the pillars of the future of work. Some authors have even stated that without robotic process automation (RPA) included in their technological processes, companies will not be able to sustain a competitive level on the market (Madakan et al., 2018).
Throughout this paper R represents a commutative ring with identity and M is a unitary left R-module. The purpose of this paper is to investigate some new results (to our knowledge) on the concept of weak essential submodules, which was introduced by Muna A. Ahmed, where a submodule N of an R-module M is called weak essential if N ∩ P ≠ (0) for each nonzero semiprime submodule P of M. In this paper we restate this definition in another form. Some new definitions are introduced and various properties of weak essential submodules are considered.
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations.
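The dependency problem mentioned above is easy to reproduce with a naive interval type. This is a minimal sketch; the `Interval` class below is illustrative and not taken from any verified-integration library.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Interval subtraction cannot "see" that both operands may be
        # the same variable; this is the dependency problem.
        return Interval(self.lo - other.hi, self.hi - other.lo)

x = Interval(0.0, 1.0)
d = x - x
# The true range of x - x is [0, 0], but interval arithmetic
# widens it to [-1, 1]:
print(d)  # Interval(lo=-1.0, hi=1.0)
```

Taylor model methods reduce exactly this kind of overestimation by carrying symbolic polynomial parts alongside a small interval remainder, so correlated occurrences of the same variable can cancel.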
The aim of this paper is to introduce the concept of hyper AT-algebras as a generalization of AT-algebras, to study this hyper structure, and to investigate some of its properties. Hyper AT-subalgebras and hyper AT-ideals of hyper AT-algebras are also studied. We study the fuzzy theory of hyper AT-ideals of hyper AT-algebras, and we study homomorphisms of hyper AT-algebras, which are a common generalization of AT-algebras.
After studying how occupational safety is actually applied in new Iraqi building projects and comparing the situation with that in developed and neighbouring countries, the researcher found a big gap in the level of safety application; this indicates the need for a quick and clear reference that local engineers can use on site for safety conditions in their projects. As a case study, the researcher monitored a huge project in the United Arab Emirates, a project whose safety requirements are considered of the highest grade. This case study may be far removed from projects in Iraq, but we hope to raise the level of Iraqi work in the near future.
We studied the nature of periodic points under chaotic functions, the functions associated with chaotic functions, and sufficient conditions for a function to be strongly chaotic.