Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used in real time. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. Test results show acceptable byte-saving performance for the 21 square iris images of 256x256 pixels, about 22.4 KB saved on average with an average decompression time of 0.79 s, and high byte-saving performance for two non-square iris images of 640x480 and 2048x1536 pixels, reaching 76 KB / 2.2 s and 1630 KB / 4.71 s respectively. Finally, the proposed technique outperforms the standard lossless JPEG2000 compression technique, with a reduction of about 1.2 KB or more in saved size, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
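The exact Hexadata coding rule is not spelled out beyond "six data items to one encoded value", so the Python sketch below only illustrates the stated pipeline: bit-plane decomposition of the grayscale eye image, selection of the most significant plane, and a hypothetical packing of each run of six binary items into a single value.

```python
# Minimal sketch: bit-plane decomposition plus a simplified "six items -> one
# value" step. The real Hexadata encoding and the parameterized iris
# localisation are not specified in the abstract; the packing of six bits into
# one byte below is only an illustrative assumption.
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into 8 binary bit planes (MSB first)."""
    return [(gray >> b) & 1 for b in range(7, -1, -1)]

def pack_six(bits):
    """Illustrative packing of each run of six binary values into one integer (0..63)."""
    pad = (-len(bits)) % 6
    bits = np.concatenate([bits, np.zeros(pad, dtype=np.uint8)])
    groups = bits.reshape(-1, 6)
    weights = 1 << np.arange(5, -1, -1)
    return (groups * weights).sum(axis=1).astype(np.uint8)

eye = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # stand-in image
planes = bit_planes(eye)
msb_plane = planes[0]                        # most significant bit plane
encoded = pack_six(msb_plane.ravel())        # 6 binary items -> 1 encoded value
print(len(msb_plane.ravel()), "->", len(encoded), "values")
```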
Physical substances at high energy levels and under specific circumstances tend to behave in harsh and complicated ways while sustaining the equilibrium or non-equilibrium thermodynamics of the system. Measuring the temperature by ordinary techniques is not applicable at all in these cases. There is thus a need to apply mathematical models in numerous critical applications to measure the temperature accurately at the atomic level of the matter. Those mathematical models follow statistical rules with different distributions of the energy quantities of the system, and these approaches have functional effects at the microscopic and macroscopic levels of that system. Therefore, this research study represents an innovative of a wi
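The abstract does not name the specific energy distribution its model uses; as one standard example of such a statistical rule, the Boltzmann (canonical) distribution links energy-level populations to temperature, so temperature can be recovered from the measured populations of two levels:

```latex
p_i = \frac{e^{-E_i / k_B T}}{\sum_j e^{-E_j / k_B T}},
\qquad
T = \frac{E_2 - E_1}{k_B \,\ln\!\left(p_1 / p_2\right)} .
```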
The paper proposes the Teaching-Learning-Based Optimization (TLBO) algorithm to solve the 3-D packing problem in containers. The objective, which can be presented in a mathematical model, is to optimize the space usage in a container. Besides the interaction between students and teacher, the algorithm also models the learning process among students in the classroom, and it needs no algorithm-specific control parameters. Thus, TLBO uses a teacher phase and a student (learner) phase as its main updating processes to find the best solution. More precisely, to validate the algorithm's effectiveness, it was implemented on three sample cases: small data with 5 size-types of items and 12 units, and medium data with 10 size-types of items w
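The paper's 3-D packing encoding and space-usage objective are not reproduced here; the Python sketch below only illustrates the generic, parameter-free TLBO update (teacher phase followed by learner phase) on a toy continuous objective, with the objective function and bounds as placeholder assumptions.

```python
# Hedged sketch of the generic TLBO loop: teacher phase, then learner phase.
import numpy as np

def tlbo(objective, bounds, pop_size=20, iters=100, rng=None):
    rng = rng or np.random.default_rng(0)
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.apply_along_axis(objective, 1, pop)
    for _ in range(iters):
        # Teacher phase: shift the class toward the best (teacher) solution.
        teacher = pop[fit.argmin()]
        tf = rng.integers(1, 3)                          # teaching factor in {1, 2}
        cand = pop + rng.random((pop_size, dim)) * (teacher - tf * pop.mean(axis=0))
        cand = np.clip(cand, lo, hi)
        cfit = np.apply_along_axis(objective, 1, cand)
        better = cfit < fit
        pop[better], fit[better] = cand[better], cfit[better]
        # Learner phase: each student learns from a randomly chosen peer.
        for i in range(pop_size):
            j = rng.integers(pop_size)
            if j == i:
                continue
            step = (pop[i] - pop[j]) if fit[i] < fit[j] else (pop[j] - pop[i])
            cand_i = np.clip(pop[i] + rng.random(dim) * step, lo, hi)
            cfit_i = objective(cand_i)
            if cfit_i < fit[i]:
                pop[i], fit[i] = cand_i, cfit_i
    return pop[fit.argmin()], fit.min()

# Toy usage on a sphere function (not the paper's packing objective).
best, val = tlbo(lambda x: np.sum(x**2), (np.array([-5.0]*3), np.array([5.0]*3)))
print(best, val)
```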
The researchers of the present study conducted a genre analysis of two political debates between American presidential nominees in the 2016 and 2020 elections. The study analyzes the cognitive construction of political debates to evaluate the typical moves and strategies politicians use to express their communicative intentions and to reveal the language manifestations of those moves and strategies. To achieve the study's aims, the researchers adopt Bhatia's (1993) framework of cognitive construction, supported by van Eemeren's (2010) pragma-dialectical framework. The study demonstrates that both presidents adhere to this genre structuring to further their political agendas. For a positive and promising image
Routing protocols are responsible for providing reliable communication between source and destination nodes. The performance of these protocols in the ad hoc network family is influenced by several factors, such as the mobility model, traffic load, transmission range, and the number of mobile nodes, which poses a significant challenge. Several simulation studies have explored routing protocols with performance parameters, but few compare multiple protocols with respect to both routing and Quality of Service (QoS) metrics. This paper presents a simulation-based comparison of proactive, reactive, and multipath routing protocols in mobile ad hoc networks (MANETs). Specifically, the performance of the AODV, DSDV, and AOMDV protocols is evaluated and analyzed.
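The abstract does not list the exact QoS metrics or trace format; the Python sketch below shows, under a hypothetical packet-record format (not the ns-2/ns-3 trace format or the paper's setup), how metrics commonly reported in such comparisons (packet delivery ratio, average end-to-end delay, throughput) could be computed for each protocol run.

```python
# Hedged sketch: per-run QoS summary from hypothetical send/receive records.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    sent_at: float                  # seconds
    received_at: Optional[float]    # seconds; None if the packet was dropped
    size_bits: int

def qos_metrics(packets, sim_time):
    delivered = [p for p in packets if p.received_at is not None]
    pdr = len(delivered) / len(packets)                                # packet delivery ratio
    avg_delay = sum(p.received_at - p.sent_at for p in delivered) / len(delivered)
    throughput = sum(p.size_bits for p in delivered) / sim_time        # bits per second
    return pdr, avg_delay, throughput

packets = [Packet(0.10, 0.15, 4096), Packet(0.20, 0.32, 4096), Packet(0.30, None, 4096)]
print(qos_metrics(packets, sim_time=1.0))   # one such summary per protocol run
```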
The development of Web 2.0 has improved people's ability to share their opinions. These opinions serve as an important source of knowledge for other reviewers. To figure out what the opinions are about, an automatic analysis system is needed. Aspect-based sentiment analysis is the most important research topic conducted to extract reviewers' opinions about a certain attribute, i.e., the opinion target (aspect). In aspect-based tasks, identifying implicit aspects, that is, aspects only implied in a review, is the most challenging task to accomplish. This paper strives to identify implicit aspects using a hierarchical algorithm incorporated with common-sense knowledge by means of dimensionality reduction.
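The paper's hierarchical algorithm and its common-sense knowledge resource are not detailed in the abstract; the Python sketch below shows one plausible reading, dimensionality reduction of review vectors followed by hierarchical (agglomerative) clustering, to group reviews that imply the same aspect without naming it.

```python
# Hedged sketch: implicit-aspect grouping via dimensionality reduction plus
# hierarchical clustering. The reviews and the two-cluster setup are toy
# assumptions, not the paper's data or method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import AgglomerativeClustering

reviews = [
    "it died after two hours away from the charger",     # implies battery
    "barely lasts a morning without plugging in",         # implies battery
    "you can read the screen even in direct sunlight",    # implies display
    "colours look washed out at an angle",                # implies display
]

vectors = TfidfVectorizer().fit_transform(reviews)
reduced = TruncatedSVD(n_components=2, random_state=0).fit_transform(vectors)
labels = AgglomerativeClustering(n_clusters=2).fit_predict(reduced)
for text, label in zip(reviews, labels):
    print(label, text)
```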
A major goal of next-generation wireless communication systems is the development of a reliable high-speed wireless communication system that supports high user mobility. Such systems must focus on increasing the link throughput and the network capacity. In this paper, a novel spectrally efficient system is proposed for generating and transmitting two-dimensional (2-D) orthogonal frequency division multiplexing (OFDM) symbols through a 2-D inter-symbol interference (ISI) channel. Instead of conventional data mapping techniques, the discrete finite Radon transform (FRAT) is used as a data mapping technique due to the increased orthogonality it offers. As a result, the proposed structure gives a significant improvement in bit error rate (BER) performance. Th
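The FRAT mapping and the 2-D symbol/channel structure are specific to the paper and are not reproduced here; the Python sketch below only shows the baseline OFDM transmit/receive step (subcarrier mapping, IFFT, cyclic prefix) that such a system builds on, with a placeholder QPSK mapper where the proposed FRAT mapping would go.

```python
# Hedged sketch of a basic OFDM transmit/receive chain over an ideal channel.
import numpy as np

rng = np.random.default_rng(0)
n_sub, cp_len = 64, 16

bits = rng.integers(0, 2, 2 * n_sub)
# Placeholder QPSK mapping (the proposed system would use FRAT mapping instead).
symbols = (1 - 2 * bits[0::2] + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

time_signal = np.fft.ifft(symbols, n_sub)                    # OFDM modulation via IFFT
tx = np.concatenate([time_signal[-cp_len:], time_signal])    # prepend cyclic prefix

# Receiver side (ideal channel): strip the prefix and demodulate with an FFT.
rx_symbols = np.fft.fft(tx[cp_len:], n_sub)
print(np.allclose(rx_symbols, symbols))
```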
Copula modeling is widely used in modern statistics. The boundary bias problem is one of the difficulties faced in nonparametric estimation, and kernel estimators are the most common nonparametric estimators. In this paper, the copula density function was estimated using the probit-transformation nonparametric method in order to remove the boundary bias that kernel estimators suffer from. A simulation study compared three nonparametric methods for estimating the copula density function, and we proposed a new method that outperforms the others, using five types of copulas with different sample sizes, different levels of correlation between the copula variables, and different parameters of the function. The
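The probit-transformation idea can be sketched concretely: map the pseudo-observations through the standard normal quantile function, estimate a kernel density on the now-unbounded scale, and transform back by dividing out the normal densities. The Python sketch below shows this baseline estimator only; the paper's three compared methods and its proposed new method are not reproduced.

```python
# Hedged sketch of the probit-transformation kernel estimator of a copula density.
import numpy as np
from scipy.stats import norm, gaussian_kde

def probit_copula_density(u_obs, v_obs):
    """Return c_hat(u, v) built from pseudo-observations in (0, 1)^2."""
    s, t = norm.ppf(u_obs), norm.ppf(v_obs)          # probit transform to the real line
    kde = gaussian_kde(np.vstack([s, t]))            # KDE on the unbounded scale

    def c_hat(u, v):
        x, y = norm.ppf(u), norm.ppf(v)
        return kde(np.vstack([x, y])) / (norm.pdf(x) * norm.pdf(y))

    return c_hat

# Toy example: pseudo-observations from a Gaussian copula with rho = 0.6.
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
u, v = norm.cdf(z[:, 0]), norm.cdf(z[:, 1])
c_hat = probit_copula_density(u, v)
print(c_hat(np.array([0.5]), np.array([0.5])))       # density estimate at (0.5, 0.5)
```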