Merging biometrics with cryptography has become increasingly common, and a rich scientific field has grown up around it for researchers. Biometrics adds a distinctive property to security systems, since biometric features are unique to each person. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is embedded, at the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message was placed inside the random text directly at the minutiae positions; in the second scenario, the message was encrypted with a chosen word before ciphering
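A minimal sketch of the position-based embedding idea described above; the cover-text generator, the minutia-to-index folding, and the coordinate values are all illustrative assumptions, not the paper's specification:

```python
import random
import string

def embed_message(message, minutiae_positions, cover_len=500, seed=None):
    """Hide each message character inside random cover text at an index
    derived from a fingerprint minutia's (x, y) coordinates.

    The fold (x * 31 + y) % cover_len is an invented mapping; colliding
    indices would overwrite earlier characters in this simple sketch.
    """
    rng = random.Random(seed)
    cover = [rng.choice(string.ascii_letters + string.digits)
             for _ in range(cover_len)]
    indices = []
    for ch, (x, y) in zip(message, minutiae_positions):
        idx = (x * 31 + y) % cover_len   # fold (x, y) into one offset
        cover[idx] = ch
        indices.append(idx)
    return "".join(cover), indices

def extract_message(cover, indices):
    """Recover the message by reading the cover text at the same indices."""
    return "".join(cover[i] for i in indices)
```

A receiver who can extract the same minutiae from the enrolled fingerprint can regenerate the indices and read the message back out of the cover text.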
The research aims to explain the role of big data analytics in measuring quality costs in the Iraqi Company for Seed Production. The research problem was diagnosed as the weakness of the approved method for measuring quality costs and the weakness of traditional data-analysis systems. On the theoretical side, the researcher relied on collecting sources and previous studies, while the applied analytical approach was adopted on the practical side, where a set of financial analyses was applied to measure quality costs and to show the role of data analytics in practice. The research concluded with a set of conclusions
The main objective of this paper is to develop and validate a flow injection method: a precise, accurate, simple, economic, low-cost, and specific turbidimetric method for the quantitative determination of mebeverine hydrochloride (MbH) in pharmaceutical preparations. A homemade NAG Dual & Solo (0-180º) analyser, which contains two identical detection units (cells 1 and 2), was applied for the turbidity measurements. The developed method was optimized for different chemical and physical parameters, such as precipitating-reagent concentration, aqueous salt solutions, flow rate, light-source intensity, sample volume, mixing coil, and purge time. The correlation coefficients (r) of the developed method were 0.9980 and 0.9986 for cells 1 and 2, respectively.
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal moveout to flatten the primaries is the way multiples are eliminated after transforming the data to the frequency-wavenumber (f-k) domain. The flattened primaries align with the zero axis of the f-k domain, while any other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned data and rejects everything else will separate primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. For that reason, a suggested name for this technique is the normal moveout-frequency-wavenumber domain
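The workflow described above can be illustrated with a toy NumPy sketch: after NMO flattening, primaries become constant along the trace axis and so map to k ≈ 0 in the f-k domain, where a simple wavenumber mask can pass them and reject dipping energy. The `keep_frac` band width is an assumed, tunable parameter, not a value from the study:

```python
import numpy as np

def fk_dip_filter(gather, keep_frac=0.1):
    """Pass events aligned with k ~ 0 (NMO-flattened primaries) and reject
    dipping energy (multiples, random noise).

    gather: 2-D array of shape (time samples, traces).
    """
    nt, nx = gather.shape
    fk = np.fft.fft2(gather)                # to frequency-wavenumber domain
    k = np.fft.fftfreq(nx)                  # normalized wavenumber per trace
    mask = np.abs(k) <= keep_frac * 0.5     # pass band centered on k = 0
    fk *= mask[np.newaxis, :]               # zero out off-axis (dipping) energy
    return np.real(np.fft.ifft2(fk))        # back to time-distance domain
```

A perfectly flat event passes through unchanged, while a steeply dipping event (energy spread across many wavenumbers) is strongly attenuated.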
Human skin detection, usually performed before other image processing, is the task of discovering skin-colored pixels and regions that may belong to human faces or limbs in videos or photos. Many computer vision approaches have been developed for skin detection. A skin detector typically transforms a given pixel into a suitable color space and then uses a skin classifier to label the pixel as skin or non-skin. A skin classifier defines the decision boundary of the skin-color class in that color space based on skin-colored pixels. The purpose of this research is to build a skin detection system that distinguishes between skin and non-skin pixels in colored still pictures. This is performed by introducing a metric that measures
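As a stand-in for the pixel-level classifier described above, the well-known explicit RGB thresholding rule of Peer et al. can serve as a minimal example; the research's own metric and decision boundary are not reproduced here:

```python
def is_skin_pixel(r, g, b):
    """Classic explicit RGB rule (Peer et al.) for skin under daylight:
    a pixel is skin if it is bright, reddish, and not gray."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_mask(pixels):
    """Classify an iterable of (r, g, b) pixels into a boolean skin mask."""
    return [is_skin_pixel(r, g, b) for r, g, b in pixels]
```

Rule-based classifiers like this are fast but brittle under unusual lighting, which is one motivation for the learned decision boundaries the abstract refers to.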
This article presents a comprehensive study of edge detection methods and algorithms for digital images, a basic process in the field of image processing and analysis. The purpose of edge detection is to discover the borders that separate distinct areas of an image, which contributes to refining the understanding of image content and to extracting structural information. The article starts by clarifying the notion of an edge and its importance in image analysis, then surveys the most notable edge detection methods used in this field (e.g., the Sobel, Prewitt, and Canny filters), besides other schemes based on detecting abrupt changes in light intensity and color gradient. The research also discusses
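For instance, the Sobel operator mentioned above estimates horizontal and vertical intensity gradients with two 3x3 kernels and combines them into an edge-strength map. A minimal NumPy sketch (naive valid-mode correlation, for clarity rather than speed):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve2d(img, kernel):
    """Naive valid-mode 2-D correlation, sufficient for this sketch."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def sobel_magnitude(img):
    """Edge strength as the Euclidean norm of the two gradient responses."""
    gx = convolve2d(img, SOBEL_X)
    gy = convolve2d(img, SOBEL_Y)
    return np.hypot(gx, gy)
```

On a vertical step edge the response is zero in the flat regions and peaks along the boundary, which is exactly the "abrupt change in light intensity" the survey describes.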
This study was done to evaluate a new technique for determining the presence of methamphetamine in hair, using a nano-bentonite-based adsorbent as the filler of an extraction column. The state of the art of this study was based on the presence of silica in the nano bentonite, which was assumed to interact with methamphetamine. The hair was treated with methanol to extract any methamphetamine present, followed by sonication of the hair sample. Qualitative analysis using the Marquis reagent was performed to confirm the presence of methamphetamine in the isolate. Hair samples taken at different periods confirmed that this newly developed method can be used to analyze methamphetamine. This method
Beyond the immediate content of speech, the voice can provide rich information about a speaker's demographics, including age and gender. Estimating a speaker's age and gender offers a wide range of applications, spanning from voice forensics to personalized advertising, healthcare monitoring, and human-computer interaction. However, pinpointing precise age remains intricate due to age ambiguity: utterances from individuals of adjacent ages are frequently indistinguishable. Addressing this, we propose a novel end-to-end approach that uses Mozilla's Common Voice dataset and transforms raw audio into high-quality feature representations using Wav2Vec2.0 embeddings. These are then channeled into our self-attention
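The attention-based pooling hinted at above can be illustrated with a toy NumPy sketch: a (T, D) matrix of frame embeddings (standing in for Wav2Vec2.0 outputs) is collapsed into a single utterance vector by a learned relevance score per frame. The scoring vector `w` is an invented placeholder for the model's learned parameters, not the paper's architecture:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_pool(frames, w):
    """Collapse (T, D) frame embeddings into one (D,) utterance embedding.

    frames: per-frame features, e.g. Wav2Vec2.0 hidden states.
    w:      (D,) scoring vector (illustrative stand-in for learned weights).
    """
    scores = frames @ w        # one relevance score per frame
    alpha = softmax(scores)    # attention weights over time, sum to 1
    return alpha @ frames      # attention-weighted average of the frames
```

With a zero scoring vector this reduces to plain mean pooling; a trained scorer instead lets age- and gender-informative frames dominate the utterance embedding.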