Abstract: Background: Optical biosensors offer excellent properties and methods for detecting bacteria compared to traditional analytical techniques, allowing direct detection of many biological and chemical materials. Bacteria occur in the human body, as in other living organisms, in both non-pathogenic (natural) and pathogenic forms. One such bacterium is Escherichia coli (E. coli), which is found in the human body in both its natural and pathogenic forms. Pathogenic E. coli causes many diseases, including infections of the stomach, intestines, and urinary system. The aim of this study is the sensing of, and differentiation between, normal-flora and pathogenic E. coli. Material and method: The optical biosensor was constructed from a multi-mode – no-core – multi-mode (MM-NOC-MM) optical fibre that differentiates between pathogenic and non-pathogenic E. coli by measuring the change in light intensity, using a 410 nm laser diode as the light source. The MM-NOC-MM fibre was connected to the optical spectrum analyser (HR2000) by means of an adapter and finally to a computer to display the results. Results: The transmitted light intensity recorded for pathogenic bacteria is lower than that recorded for non-pathogenic bacteria. Conclusion: these results are attributed to the choice of laser wavelength, which coincides with an absorption band of E. coli.
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. …
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature-extraction mechanism. As such, a fast and robust numeral recognition method is essential: one that meets the desired accuracy by extracting the features efficiently while maintaining fast implementation time. Furthermore, to date most of the existing studies have evaluated their methods only in clean environments, thus limiting understanding of their potential …
On the Internet nothing is secure by default, and since we need means of protecting our data, the use of passwords has become essential in the electronic world. To prevent hacking and to protect databases that contain important information such as ID cards and banking details, the proposed system stores the username after hashing it with the SHA-256 algorithm, and strong passwords are saved so as to repel attackers using one of two methods. The first method is to add a random salt to the password using a CSPRNG, then hash it with SHA-256 and store it on the website. The second method is to use the PBKDF2 algorithm, which salts the passwords and stretches them (deriving a key from the password) before they are hashed.
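The two storage schemes described above can be sketched with Python's standard library. The function names, the 16-byte salt, and the 100,000-iteration count are illustrative assumptions, not the paper's exact settings:

```python
import hashlib
import secrets

def hash_salted_sha256(password):
    # Method 1 (sketch): random CSPRNG salt + SHA-256 over salt||password.
    salt = secrets.token_bytes(16)          # 16-byte salt is an assumed size
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    return salt, digest

def verify_salted_sha256(password, salt, digest):
    # Recompute the salted hash and compare with the stored digest.
    return hashlib.sha256(salt + password.encode()).hexdigest() == digest

def hash_pbkdf2(password, salt=None, iterations=100_000):
    # Method 2 (sketch): PBKDF2 salts and stretches (key-derives) the password.
    salt = salt if salt is not None else secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, key.hex()
```

Because each call draws a fresh random salt, the same password yields a different stored digest each time, which is what defeats precomputed rainbow-table attacks.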
Regression analysis is the foundation stone of statistics, and it mostly depends on the ordinary least squares (OLS) method. As is well known, this method requires several conditions in order to operate accurately, and when they fail its results can be unreliable; indeed, the absence of certain conditions can make completing the analysis with this method impossible. Among those conditions is the multicollinearity problem, and we detect that problem among the independent variables using the Farrar–Glauber test. In addition, the data are required to be linear, and the failure of this last condition has led to resorting to …
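The Farrar–Glauber detection step mentioned above can be sketched with NumPy. The statistic below is the standard Farrar–Glauber chi-square for the overall presence of multicollinearity; the function is an illustrative sketch, not the study's own code:

```python
import numpy as np

def farrar_glauber_chi2(X):
    """Farrar-Glauber chi-square test for multicollinearity.

    X : (n, p) matrix of independent variables.
    Returns the statistic chi2 = -[n - 1 - (2p + 5)/6] * ln|R|,
    where R is the correlation matrix, and its degrees of freedom.
    """
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)                  # p x p correlation matrix
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return chi2, df
```

A value far above the chi-square critical point with p(p − 1)/2 degrees of freedom indicates significant multicollinearity among the regressors; since |R| ≤ 1 for a correlation matrix, the statistic is non-negative and grows as the regressors become more correlated.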
The present work examined oxidative desulfurization in a batch system for model fuels with 2250 ppm sulfur content, using air as the oxidant and a ZnO/AC composite prepared by a thermal co-precipitation method. Different factors were studied: composite loading (1, 1.5, and 2.5 g), temperature (25 °C, 30 °C, and 40 °C), and reaction time (30, 45, and 60 minutes). The optimum conditions were obtained using a Taguchi experimental design for the oxidative desulfurization of the model fuel; the highest sulfur removal was about 33% at the optimum conditions. The kinetics and the effect of internal mass transfer were studied, and an empirical kinetic model was derived for the model fuels.
This review investigates the practice and influence of chatbots and ChatGPT as tools for scientific academic writing. A primary collection of 150 articles was gathered from academic databases and then systematically screened and refined to 30 studies that focused on the use of ChatGPT and chatbot technology in academic writing contexts. The reviewed literature covers chatbots and ChatGPT in writing enhancement, support for student learning at higher education institutions, scientific and medical writing, and the evolution of research and academic publishing. The review finds these tools helpful, with their greatest advantages lying in areas such as structuring writing, grammar …
The denoising of a natural image corrupted by Gaussian noise is a classic problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for the suppression of noise in images that fuses the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied using …
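The fusion idea can be sketched with NumPy. For brevity this sketch uses a one-level orthonormal Haar transform in place of the paper's stationary wavelet transform, a universal threshold for the detail bands, and a 3×3 local adaptive Wiener filter for the approximation band; all of these are assumptions, not the authors' exact choices:

```python
import numpy as np

def haar2_fwd(x):
    # One-level orthonormal 2D Haar transform (x must have even sides).
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return (a + b + c + d) / 2, ((a + b - c - d) / 2,
                                 (a - b + c - d) / 2,
                                 (a - b - c + d) / 2)

def haar2_inv(ll, details):
    # Exact inverse of haar2_fwd.
    lh, hl, hh = details
    x = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    x[0::2, 0::2] = (ll + lh + hl + hh) / 2
    x[0::2, 1::2] = (ll + lh - hl - hh) / 2
    x[1::2, 0::2] = (ll - lh + hl - hh) / 2
    x[1::2, 1::2] = (ll - lh - hl + hh) / 2
    return x

def soft(w, t):
    # Soft thresholding of wavelet coefficients.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def box_mean(x, k):
    # Local mean over a k-by-k window (reflect padding at the borders).
    pad = k // 2
    xp = np.pad(x, pad, mode="reflect")
    out = np.zeros_like(x, dtype=float)
    for di in range(k):
        for dj in range(k):
            out += xp[di:di + x.shape[0], dj:dj + x.shape[1]]
    return out / (k * k)

def local_wiener(x, noise_var, k=3):
    # Pixel-wise adaptive Wiener filter: shrink toward the local mean
    # according to the ratio of local signal variance to noise variance.
    m = box_mean(x, k)
    v = np.maximum(box_mean(x * x, k) - m * m, 0.0)
    gain = np.maximum(v - noise_var, 0.0) / np.maximum(v, 1e-12)
    return m + gain * (x - m)

def denoise(noisy, sigma):
    # Fusion: Wiener filter on the approximation band, soft thresholding
    # on the detail bands, then inverse transform.
    ll, (lh, hl, hh) = haar2_fwd(noisy)
    t = sigma * np.sqrt(2.0 * np.log(noisy.size))   # universal threshold
    return haar2_inv(local_wiener(ll, sigma ** 2),
                     (soft(lh, t), soft(hl, t), soft(hh, t)))
```

Because the Haar transform here is orthonormal, the noise variance is the same in every subband, which is what lets a single sigma drive both the threshold and the Wiener gain.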
Audio Compression Using Transform Coding with LZW and Double Shift Coding (Zainab J. Ahmed & Loay E. George). Conference paper, first online 11 January 2022, in New Trends in Information and Communications Technology Applications; part of the Communications in Computer and Information Science book series (CCIS, volume 1511). Abstract: The need for audio compression is still a vital issue because of its significance in reducing the data size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated; the first module is based on the discrete cosine transform and the second module is based on the discrete wavelet transform …
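The LZW stage of such a pipeline can be sketched in Python. This is the generic textbook LZW with an unbounded dictionary, shown only to illustrate the coding step; it is not the authors' coder, which pairs LZW with double shift coding:

```python
def lzw_encode(data):
    # Dictionary starts with all 256 single-byte strings.
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                       # keep growing the current phrase
        else:
            out.append(table[w])         # emit code for the known prefix
            table[wc] = len(table)       # register the new phrase
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

def lzw_decode(codes):
    table = {i: bytes([i]) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for code in codes[1:]:
        # A code one step ahead of the table is the special cScSc case.
        entry = table[code] if code in table else w + w[:1]
        out.append(entry)
        table[len(table)] = w + entry[:1]
        w = entry
    return b"".join(out)
```

On repetitive input the encoder emits fewer codes than there are input bytes, which is where the compression comes from; quantized transform coefficients tend to be exactly this kind of repetitive stream.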
Diamond-like carbon (DLC) films, an amorphous hydrogenated form of carbon, were prepared from liquid cyclohexane (C6H12) using a plasma jet operating at an alternating voltage of 7.5 kV and a frequency of 28 kHz. The plasma dissociates the cyclohexane molecules and transforms them into carbon nanoparticles. The effect of the argon flow rate (0.5, 1, and 1.5 L/min) on the optical and chemical-bonding properties of the films was investigated. The films were characterized by UV-visible spectrophotometry, X-ray diffraction (XRD), Raman spectroscopy, and scanning electron microscopy (SEM). The main absorption appears around 296, 299, and 309 nm at the three argon flow rates. The optical energy gap is 3.37, 3.55, and 3.68 eV at the different flow rates …