Continuous turbidimetric analysis (CTA) for a distinctive analytical application, employing a homemade analyser (NAG Dual & Solo 0-180°) containing two consecutive detection zones (measuring cells 1 & 2), is described. The analyser uses light-emitting diodes as the light source and a set of solar cells as the detector, so turbidity can be measured without additional fibres or lenses. The developed method is based on the formation of a yellow turbid precipitate from the reaction between warfarin and the precipitating reagent, potassium dichromate. The CTA method was applied to determine warfarin in pure form and in pharmaceutical formulations over the concentration ranges 2.0-16 and 0.7-16 mmol/L, with limits of detection of 0.58 and 0.55 mmol/L, respectively. The correlation coefficients (r) of the developed method were 0.9977 and 0.9981 for cells 1 and 2, respectively. The proposed method was validated following the ICH guidelines and was successfully applied to the determination of warfarin in pure form and in pharmaceutical preparations. In addition, since it permits the quantitative determination of 60 samples/h, the method can serve as a quality control method and is convenient for routine laboratory analysis.
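The calibration figures of merit reported above (slope-based linearity, r, and a limit of detection) follow the usual linear-regression workflow. A minimal sketch of that workflow in Python, using purely illustrative concentration/response pairs (not the paper's measurements) and the common 3.3·σ/slope convention for the LOD:

```python
import numpy as np

# Hypothetical calibration data for a turbidimetric method:
# concentrations in mmol/L vs. detector response (arbitrary units).
conc = np.array([2.0, 4.0, 8.0, 12.0, 16.0])
response = np.array([0.11, 0.21, 0.42, 0.61, 0.83])

# Least-squares calibration line and correlation coefficient.
slope, intercept = np.polyfit(conc, response, 1)
r = np.corrcoef(conc, response)[0, 1]

# LOD via 3.3 * sigma / slope, with sigma taken as the residual
# standard deviation of the calibration fit (2 parameters estimated).
residuals = response - (slope * conc + intercept)
sigma = residuals.std(ddof=2)
lod = 3.3 * sigma / slope
```

The same arrays would simply be replaced by the measured signals from each measuring cell to reproduce per-cell r and LOD values.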
Medical images play a crucial role in the classification of various diseases and conditions. One of these imaging modalities is X-ray imaging, which provides valuable visual information that helps in the identification and characterization of many medical conditions. Chest radiograph (CXR) images have long been used to examine and monitor numerous lung disorders, such as tuberculosis, pneumonia, atelectasis, and hernia. COVID-19 detection can be accomplished using CXR images as well. COVID-19, a disease caused by a virus that infects the lungs and the airways of the upper respiratory tract, was first identified in 2019 in Wuhan Province, China, and has since been found to cause substantial airway damage, severely affecting the lungs of infected persons.
In this research, we examined factorial experiments and studied the significance of the main effects, the interactions of the factors, and their simple effects using the F test (ANOVA) to analyse the data of the factorial experiment. It is also known that the analysis of variance requires several assumptions to hold; therefore, when one of these conditions is violated, the data are transformed in order to satisfy the conditions of the analysis of variance. However, it has been noted that these transformations do not produce accurate results, so we resort to non-parametric tests or methods, which serve as a solution or alternative to the parametric tests. These methods …
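The parametric/non-parametric contrast described above can be sketched with SciPy: a one-way F test, and the Kruskal-Wallis rank test as the usual distribution-free alternative when the ANOVA assumptions fail. The three groups below are synthetic illustrations, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Three hypothetical treatment groups (illustrative only).
g1 = rng.normal(10.0, 1.0, 15)
g2 = rng.normal(10.5, 1.0, 15)
g3 = rng.normal(12.0, 1.0, 15)

# Parametric one-way ANOVA: assumes normal errors and equal variances.
f_stat, p_anova = stats.f_oneway(g1, g2, g3)

# Non-parametric alternative: Kruskal-Wallis test on ranks, used when
# the ANOVA assumptions are violated and transformations do not help.
h_stat, p_kw = stats.kruskal(g1, g2, g3)
```

In practice one would first check the assumptions (e.g. Shapiro-Wilk for normality, Levene for homogeneity of variance) before deciding which of the two tests to report.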
Cryptography is a method used to mask text by means of an encryption method, so that only the authorized user can decrypt and read the message. An intruder may attack in many ways to gain access to the communication channel, such as impersonation, repudiation, denial of service, modification of data, threatening confidentiality, and breaking the availability of services. The high volume of electronic communication between people makes it necessary to ensure that transactions remain confidential, and cryptographic methods give the best solution to this problem. This paper proposes a new cryptography method based on Arabic words; the method consists of two steps, where the first step is binary encoding generation used t…
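The abstract is cut off before the binary-encoding step is specified, so the paper's actual scheme is unknown here; a generic sketch of one way to binary-encode text (including Arabic words, via their UTF-8 bytes) is:

```python
def to_binary(text: str) -> str:
    # Encode each UTF-8 byte of the text as an 8-bit binary group.
    # This is a generic illustration, not the paper's encoding scheme.
    return ' '.join(f'{b:08b}' for b in text.encode('utf-8'))

# ASCII characters map to a single 8-bit group; Arabic letters occupy
# two UTF-8 bytes each, so a two-letter Arabic word yields four groups.
```

For example, `to_binary('A')` yields `'01000001'`, the 8-bit pattern for code point 65.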
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-coloured and grey-scale images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noise pixels, which are then replaced with an optimal median value according to a criterion of maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the resul…
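The baseline pipeline behind the comparison above — corrupt an image with salt-and-pepper noise, apply a median filter, and score the result with PSNR — can be sketched as follows. This is a plain 3×3 median filter on a synthetic gradient image, not the paper's crow-search-optimised variant, which replaces only the pixels detected as noisy:

```python
import numpy as np

def add_salt_pepper(img, amount, rng):
    # Flip roughly `amount` of the pixels to pure black or pure white.
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < amount / 2] = 0
    noisy[mask > 1 - amount / 2] = 255
    return noisy

def median_filter(img, k=3):
    # Plain k x k median filter applied to every pixel (the baseline;
    # OMF would restrict replacement to detected noise pixels).
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

def psnr(ref, test):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(1)
clean = np.tile(np.arange(0, 256, 8, dtype=np.uint8), (32, 1))  # smooth gradient
noisy = add_salt_pepper(clean, 0.10, rng)
restored = median_filter(noisy)
```

On a smooth image the filtered output recovers a higher PSNR than the noisy input, which is the comparison the paper's evaluation metrics quantify.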
Authentication is the process of determining whether someone or something is, in fact, who or what it declares itself to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods, one of the most popular of which is based on something the user knows, such as a password or a Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying an individual's physiological or behavioural attributes. Keystroke authentication is a behavioural access-control technique that identifies legitimate users by their typing behaviour. The objective of this paper is to provide user…
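Keystroke-dynamics systems typically build a typing profile from timing features such as dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). A minimal sketch, using hypothetical press/release timestamps rather than any feature set the paper may define:

```python
# Hypothetical (key, press_time, release_time) events in seconds
# for one typed sample; illustrative values only.
events = [('p', 0.00, 0.09), ('a', 0.18, 0.26), ('s', 0.33, 0.40)]

# Dwell time: hold duration of each key.
dwell = [up - down for _, down, up in events]

# Flight time: gap between one key's release and the next key's press.
flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
```

A verifier would compare such feature vectors from a login attempt against the stored profile of the claimed user, accepting the attempt when the distance falls below a threshold.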
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, that improves the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, whic…
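The two precision criteria named above have standard definitions on the coordinate cofactor matrix Qx = N⁻¹ of the adjusted network: A-optimality minimises the trace of Qx (average variance), and E-optimality minimises its largest eigenvalue (worst-case variance). A minimal sketch with an illustrative design matrix, not a real GNSS network:

```python
import numpy as np

# Illustrative design matrix A (baseline observations vs. 2 unknown
# coordinates) and unit weight matrix P; not a real GNSS configuration.
A = np.array([[1.0, 0.0],
              [-1.0, 1.0],
              [0.0, -1.0],
              [1.0, 1.0]])
P = np.eye(4)

N = A.T @ P @ A              # normal-equation matrix
Qx = np.linalg.inv(N)        # coordinate cofactor matrix

a_opt = np.trace(Qx)                  # A-optimality score: average variance
e_opt = np.linalg.eigvalsh(Qx).max()  # E-optimality score: worst-case variance
```

A candidate baseline configuration that lowers these scores is preferred, which is exactly the role the two criteria play as objective functions in the FOD-p model.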
Today, with the increased use of social media, many researchers have become interested in topic extraction from Twitter. Twitter text is short, unstructured, and messy, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like Twitter. Luckily, Twitter has many features that represent the interaction between users; in particular, tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned…