The demand for single-photon sources in quantum key distribution (QKD) systems has led to the widespread use of weak coherent pulses (WCPs), whose photon number follows a Poissonian distribution. Security against eavesdropping attacks requires that the mean photon number (µ) be kept small and known to the legitimate parties. However, accurately determining µ is challenging because of discrepancies between theoretical calculations and practical implementation. This paper presents two experiments. In the first, µ is calculated theoretically for WCPs generated with a series of optical filters. In the second, the WCPs are generated with a variable attenuator and µ is estimated from the photons detected by a BB84 detection setup. The second experiment provides the more accurate estimate of µ because it uses single-photon detectors with high timing resolution and low dark counts, together with a time-to-digital converter with a bin size of 81 ps.
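As a rough illustration of the statistics involved (a minimal sketch, not the paper's procedure), the code below uses the Poissonian model of a WCP: the probability that a pulse carries two or more photons is 1 − e^(−µ)(1 + µ), and for an ideal threshold detector with efficiency η and negligible dark counts, µ can be recovered from the measured click probability. The efficiency and count values are hypothetical.

```python
import numpy as np

def multi_photon_prob(mu):
    """P(n >= 2) for a Poissonian source: 1 - e^-mu * (1 + mu)."""
    return 1.0 - np.exp(-mu) * (1.0 + mu)

def estimate_mu(click_prob, eta):
    """Invert p_click = 1 - exp(-eta * mu) for an ideal threshold
    detector with efficiency eta and negligible dark counts."""
    return -np.log(1.0 - click_prob) / eta

# Hypothetical values, for illustration only
clicks, n_pulses, eta = 4_800, 1_000_000, 0.10
p_click = clicks / n_pulses
mu = estimate_mu(p_click, eta)
print(f"estimated mu = {mu:.4f}, P(n>=2) = {multi_photon_prob(mu):.2e}")
```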
Incremental sheet forming (ISF) is a metal forming technology in which small incremental deformations determine the final shape. The sheet is deformed by a hemispherical tool that follows the required shape contour to form the sheet into the desired geometry. In this study, single point incremental sheet forming (SPIF) was applied in dentistry to manufacture a denture plate from two types of stainless steel, 304 and 316L, with initial thicknesses of 0.5 mm and 0.8 mm, respectively. Stainless steel was selected for its biocompatibility and reasonable cost. A three-dimensional (3D) analysis procedure was conducted to evaluate the geometrical accuracy and thickness distribution of the manufactured part. The obtained results confirm
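For context, a first-order estimate widely quoted in the ISF literature (the so-called sine law, stated here as background rather than as this study's result) relates the formed wall thickness t_f to the initial thickness t_0 and the wall angle α:

\[ t_f = t_0 \sin(90^\circ - \alpha) = t_0 \cos\alpha . \]

For example, a 0.5 mm sheet formed at a 45° wall angle thins to roughly 0.35 mm, which is why thickness distribution is a central accuracy measure in SPIF.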
Background: Patients requiring renal biopsy present with a range of glomerular diseases whose distribution varies with demographic characteristics.
Objective: To study the types of glomerular disease among adult Iraqi patients in a single center in Baghdad, Iraq.
Material and Methods: A total of 120 native kidney biopsies were studied. All biopsies were adequate and were processed for light microscopy.
The age range of the study patients was 17-67 years, with a mean of 38.5 years. The mean follow-up period was 28 weeks (range 4-52 weeks).
Indications for biopsy included nephrotic syndrome (n=72; 60%), asymptomatic proteinuria (n=21; 17.5%), acute nephritic presentation (n=17; 14.17%), and asymptomatic haematuria (n=10; 8.33%).
Results: Primary glomerulonephritis
With the growing use of social media, many researchers have become interested in extracting topics from Twitter. Tweets are short, unstructured, and noisy, which makes discovering topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter has many features that capture the interaction between users; in particular, tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
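One common way to exploit hashtags with a standard topic model (a minimal sketch, not necessarily the paper's exact scheme) is hashtag pooling: tweets sharing a hashtag are merged into one pseudo-document so that LDA operates on longer texts. The tweets, tags, and topic count below are hypothetical.

```python
import re
from collections import defaultdict
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical tweets standing in for a real Twitter corpus
tweets = [
    "new #QKD setup with weak coherent pulses",
    "reading about photon statistics #QKD #physics",
    "#worldcup final tonight!",
    "what a match #worldcup",
]

# Pool tweets by hashtag into pseudo-documents
pools = defaultdict(list)
for t in tweets:
    for tag in re.findall(r"#\w+", t.lower()):
        pools[tag].append(t)
pseudo_docs = [" ".join(v) for v in pools.values()]

# Fit plain LDA on the pooled documents
X = CountVectorizer(stop_words="english").fit_transform(pseudo_docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.transform(X))  # topic mixture of each hashtag pool
```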
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication increases. A user's claimed identity can be verified by one of several methods, one of the most popular being something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authenticating individuals by their physiological or behavioral attributes. Keystroke authentication is a behavioral access control approach that identifies legitimate users by their typing behavior. The objective of this paper is to provide user
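For illustration, typing behavior is commonly summarized by dwell times (how long a key is held) and flight times (the gap between releasing one key and pressing the next). The sketch below, with hypothetical timestamps and threshold, shows one simple template-matching verifier; the paper's actual features and classifier may differ.

```python
import numpy as np

# (key, press_time_ms, release_time_ms) for one typing sample (hypothetical)
sample = [("p", 0, 95), ("a", 140, 230), ("s", 300, 385), ("s", 430, 520)]

def features(keystrokes):
    """Dwell times (release - press) followed by flight times
    (next press - previous release)."""
    dwell = [r - p for _, p, r in keystrokes]
    flight = [keystrokes[i + 1][1] - keystrokes[i][2]
              for i in range(len(keystrokes) - 1)]
    return np.array(dwell + flight, dtype=float)

def verify(template, attempt, threshold=30.0):
    """Accept if the mean absolute deviation from the enrolled
    template is below a tuned threshold (hypothetical value)."""
    return np.mean(np.abs(template - features(attempt))) < threshold

template = features(sample)      # enrolled typing profile
print(verify(template, sample))  # True: same typing pattern
```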
Iris research focuses on developing techniques for identifying and locating relevant biometric features with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods rely on complex modelling of traits and characteristics, which in turn reduces their effectiveness in real-time systems. This paper introduces a novel parameterized technique for iris segmentation. The method comprises several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in an accurate segmentation of the iris from the origin
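A minimal sketch of the bit-plane step (the random image, mask rule, and centroid heuristic are illustrative assumptions, not the paper's exact parameterization): each 8-bit pixel is split into eight binary planes, and the most significant planes retain the coarse structure needed to locate the dark pupil/iris region.

```python
import numpy as np

# Random noise standing in for a real 8-bit grayscale eye capture
img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)

# Plane k holds bit k of every pixel; planes 7 and 6 (most significant)
# carry most of the coarse intensity structure
planes = [(img >> k) & 1 for k in range(8)]
significant = planes[7] | planes[6]   # crude binary mask

# Simple location parameter: centroid of the dark pixels (both top bits 0)
ys, xs = np.nonzero(significant == 0)
if xs.size:                            # guard against an empty mask
    print("candidate iris centre:", xs.mean(), ys.mean())
```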
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of their number and distribution, to improve the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
The proposed model, FOD for optimum precision (FOD-p), is based on the A-optimality and E-optimality design criteria. These criteria were selected as objective functions of precision, which
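For reference, both criteria are functions of the cofactor matrix Q = (AᵀPA)⁻¹ of the estimated coordinates: A-optimality minimizes trace(Q), while E-optimality minimizes its largest eigenvalue. The toy network below is a hypothetical illustration, not the study's data.

```python
import numpy as np

# Hypothetical 1-D network: 4 observed baselines connecting 3 points,
# with the first row acting as an absolute observation fixing the datum
A = np.array([[ 1,  0,  0],
              [-1,  1,  0],
              [ 0, -1,  1],
              [ 1,  0, -1]], dtype=float)   # design matrix
P = np.eye(4)                                # equal weights (assumed)

Q = np.linalg.inv(A.T @ P @ A)               # cofactor matrix of coordinates
a_opt = np.trace(Q)                  # A-optimality: minimise trace(Q)
e_opt = np.linalg.eigvalsh(Q).max()  # E-optimality: minimise lambda_max(Q)
print(f"trace(Q) = {a_opt:.3f}, lambda_max(Q) = {e_opt:.3f}")
```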
Speech is the essential means of interaction between humans or between a human and a machine. However, it is always contaminated with different types of environmental noise, so speech enhancement algorithms (SEAs) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian modeling of speech and noise based on the distribution of orthogonal transform (Discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Kra
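For reference, the minimum mean square error estimate of a clean transform coefficient s from its noisy observation y is the conditional mean (a general definition; the paper's closed form under the Laplacian priors is not reproduced here):

\[ \hat{s} = E[s \mid y] = \arg\min_{\tilde{s}} \, E\big[(s - \tilde{s})^2 \,\big|\, y\big]. \]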