The development of analytical techniques is required for the accurate and comprehensive detection and measurement of antibiotic contamination in the environment. Metronidazole is a common antibacterial and antiprotozoal antibiotic. Thiamine is a vital biological and medicinal ingredient involved in the energy-producing metabolism of proteins, fats, and carbohydrates. This study aims to identify the drugs in a mixture without separation, providing additional evidence that a drug is present in a combination. Metronidazole and thiamine in pharmaceutical and environmental samples can be determined using spectrophotometric techniques, which are low in cost and simple to use. The working solutions of both drugs were scanned in the UV region between 200 and 400 nm, and the corresponding overlay spectra of the two drugs were recorded. In the proposed method, thiamine exhibits its best absorption peak at 264 nm, which is observed in the presence of metronidazole, whose peak lies at 320 nm. The calibration curves for metronidazole and thiamine obey the Beer-Lambert law in the range of 1-15 μg/mL for each drug, with molar absorptivity values of 12716.89 and 2053.22 L/(mol·cm) for metronidazole and thiamine, respectively. In the recovery study, thiamine and metronidazole gave recoveries ranging from 99.1 to 100% across three concentrations. For thiamine and metronidazole, the precision study was conducted using estimates of
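As a point of reference for the molar absorptivity values quoted above, a minimal sketch of the Beer-Lambert relation and the conversion of a mass-based concentration (μg/mL) to a molar one, assuming a 1 cm path length and with M denoting the molar mass of the analyte:

```latex
A = \varepsilon\,b\,c,
\qquad
c\,[\mathrm{mol\,L^{-1}}] =
\frac{c_{\mathrm{mass}}\,[\mu\mathrm{g\,mL^{-1}}]}{10^{3}\,M\,[\mathrm{g\,mol^{-1}}]},
\qquad
\varepsilon = \frac{A}{b\,c}
```

Here A is the measured absorbance and b the path length in cm, so that ε carries units of L mol⁻¹ cm⁻¹.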
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure are adopted to nominate the best-matching blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belonging to the c
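The abstract does not give implementation details; as a minimal sketch of the descriptor-plus-MAE idea it describes (assuming grayscale blocks as NumPy arrays, with an illustrative threshold for the descriptor filter):

```python
import numpy as np

def block_descriptor(block, orders=(2, 3)):
    """Mean plus centralized low-order moments of a block (illustrative choice of orders)."""
    block = block.astype(float)
    mean = block.mean()
    centered = block - mean
    moments = [np.mean(centered ** k) for k in orders]
    return np.array([mean, *moments])

def mae(block_a, block_b):
    """Mean absolute error between two equally sized blocks."""
    return np.mean(np.abs(block_a.astype(float) - block_b.astype(float)))

def nominate_and_match(target, candidates, desc_threshold=5.0):
    """Filter candidates by descriptor distance, then pick the best block by MAE."""
    target_desc = block_descriptor(target)
    nominated = [c for c in candidates
                 if np.abs(block_descriptor(c) - target_desc).max() < desc_threshold]
    if not nominated:  # fall back to the full pool if the filter rejects everything
        nominated = candidates
    return min(nominated, key=lambda c: mae(target, c))
```

The descriptor comparison is cheap, so the expensive pixel-level MAE is only evaluated for the blocks that survive the filter, which is the kind of reduction in matching instances the abstract refers to.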
The aim of this research is to compare traditional and modern methods for obtaining the optimal solution, using dynamic programming and intelligent algorithms to solve project management problems.
It shows possible ways in which these problems can be addressed, drawing on a schedule of interrelated, sequential activities. It clarifies the relationships between the activities in order to determine the start and finish of each activity, the duration and cost of the total project, and the time consumed by each activity, and to define the objectives sought by the project through planning, implementation, and monitoring so that the assessed budget is maintained.
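The abstract does not specify its algorithms; as an illustration of the kind of scheduling computation it refers to (earliest start and finish times and total project duration from activity durations and precedence relations), a minimal sketch with hypothetical activities:

```python
def forward_pass(durations, predecessors):
    """Earliest start/finish times and total project duration.

    durations:    {activity: duration}
    predecessors: {activity: [activities that must finish first]}
    Activities are assumed to be listed in an order compatible with their dependencies.
    """
    earliest_start, earliest_finish = {}, {}
    for act, dur in durations.items():
        earliest_start[act] = max(
            (earliest_finish[p] for p in predecessors.get(act, [])), default=0)
        earliest_finish[act] = earliest_start[act] + dur
    return earliest_start, earliest_finish, max(earliest_finish.values())

# Hypothetical example: A precedes B and C; both precede D.
durations = {"A": 3, "B": 2, "C": 4, "D": 1}
predecessors = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
print(forward_pass(durations, predecessors))  # total duration: 8
```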
To maintain the security and integrity of data with the growth of the Internet and the increasing prevalence of transmission channels, it is necessary to strengthen security and develop several algorithms. The Playfair cipher is a substitution scheme. The traditional Playfair scheme uses a small 5×5 matrix containing only uppercase letters, making it vulnerable to attackers and cryptanalysis. In this study, a new encryption and decryption approach is proposed to enhance the resistance of the Playfair cipher. For this purpose, the development of symmetric cryptography based on shared secrets is desired. The proposed Playfair method uses a 5×5 keyword matrix for English and a 6×6 keyword matrix for Arabic to encrypt the alphabets of
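The paper's modified cipher and its 6×6 Arabic key matrix are not detailed in the abstract; as a reference point, a minimal sketch of classical Playfair encryption with a 5×5 key square (J folded into I), using a hypothetical keyword:

```python
def build_square(keyword):
    """5x5 Playfair key square over A-Z with J folded into I (classical variant)."""
    seen, square = set(), []
    for ch in keyword.upper() + "ABCDEFGHIKLMNOPQRSTUVWXYZ":
        ch = "I" if ch == "J" else ch
        if ch.isalpha() and ch not in seen:
            seen.add(ch)
            square.append(ch)
    return square  # 25 letters, row-major

def encrypt_digraph(square, a, b):
    """Encrypt one digraph (a != b) with the standard Playfair rules."""
    ra, ca = divmod(square.index(a), 5)
    rb, cb = divmod(square.index(b), 5)
    if ra == rb:                                        # same row: shift right
        return square[ra * 5 + (ca + 1) % 5], square[rb * 5 + (cb + 1) % 5]
    if ca == cb:                                        # same column: shift down
        return square[((ra + 1) % 5) * 5 + ca], square[((rb + 1) % 5) * 5 + cb]
    return square[ra * 5 + cb], square[rb * 5 + ca]     # rectangle: swap columns

square = build_square("MONARCHY")       # keyword chosen only for illustration
print(encrypt_digraph(square, "H", "I"))  # -> ('B', 'F')
```

The small 25-letter key space and rigid digraph rules are exactly what makes the classical scheme weak, which is the motivation the abstract gives for extending the key matrix.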
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed with a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and predictor error. The compressed files contain the LP coefficients and the previous sample. These files are very small in size compared to the original signals. The compression ratio is calculated from the size of th
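A minimal sketch of the two stages described, assuming PyWavelets is available; the paper's exact windowing, quantization, and file layout are not reproduced:

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def wavelet_approximation(signal, wavelet="haar", level=3):
    """Keep only the coarsest approximation coefficients; drop the detail coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return coeffs[0]

def levinson_durbin(signal, order=10):
    """LP coefficients, reflection coefficients, and prediction error via Levinson-Durbin.

    Assumes a non-degenerate signal (autocorrelation at lag zero is positive).
    """
    signal = np.asarray(signal, dtype=float)
    r = np.correlate(signal, signal, mode="full")[len(signal) - 1:][:order + 1]
    a = np.zeros(order + 1)
    a[0] = 1.0
    k = np.zeros(order)            # reflection coefficients
    err = r[0]                     # prediction error energy
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k[i - 1] = -acc / err
        a[1:i + 1] = a[1:i + 1] + k[i - 1] * a[i - 1::-1]
        err *= (1.0 - k[i - 1] ** 2)
    return a, k, err

approx = wavelet_approximation(np.random.randn(1024))   # placeholder speech frame
lp, refl, pred_err = levinson_durbin(approx, order=10)
```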
This work deals with the separation of benzene and toluene from a BTX fraction. The separation was carried out by adsorption on molecular sieve zeolite 13X in a fixed bed. The concentrations of benzene and toluene in the influent streams were measured using gas chromatography. The effect of flow rate in the range 0.77-2.0 cm³/min on the extraction of benzene and toluene from the BTX fraction was studied; increasing the flow rate decreases the breakthrough and saturation times. The effect of bed height in the range 31.6-63.3 cm on benzene and toluene adsorption from the BTX fraction was studied; increasing the bed height increases the breakpoint values. The effect of the concentration of benzene in the range 0.0559-0.2625 g/
The structure of the unstable 21,23,25,26F nuclei has been investigated using Hartree-Fock (HF) and shell model calculations. The ground state proton, neutron, and matter density distributions, root mean square (rms) radii, and neutron skin thickness of these isotopes are studied. Shell model calculations are performed using the SDBA interaction. In the HF method the selected effective nuclear interactions, namely the Skyrme parameterizations SLy4, Skeσ, SkBsk9, and Skxs25, are used. The elastic electron scattering form factors of these isotopes are also studied. The calculated form factors in the HF calculations show many diffraction minima, in contrast to the shell model, which predicts fewer diffraction minima. The long tail
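The abstract does not reproduce the working formulas; as a reference point, the standard plane-wave Born approximation relations connecting the quantities mentioned (a spherically symmetric density ρ(r), its rms radius, the neutron skin thickness, and the elastic form factor) can be sketched as:

```latex
\langle r^{2}\rangle = \frac{\int \rho(r)\, r^{2}\, d^{3}r}{\int \rho(r)\, d^{3}r},
\qquad
t = \langle r_{n}^{2}\rangle^{1/2} - \langle r_{p}^{2}\rangle^{1/2},
\qquad
F(q) = \frac{4\pi}{Z}\int_{0}^{\infty}\rho_{\mathrm{ch}}(r)\, j_{0}(qr)\, r^{2}\, dr
```

Here ρ_ch is the charge density and j₀(x) = sin(x)/x is the zeroth-order spherical Bessel function; the proton, neutron, and matter rms radii follow from the corresponding densities.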