Dapagliflozin is a novel sodium-glucose cotransporter type 2 inhibitor. This work aims to develop a new, validated, sensitive RP-HPLC method coupled with a mass detector for the determination of dapagliflozin, its alpha isomer, and its starting material in the presence of the major degradation products of dapagliflozin and an internal standard (empagliflozin). The separation was achieved on a BDS Hypersil column (250 mm length, 4.6 mm internal diameter, 5 μm particle size) at a temperature of 35 °C. Water and acetonitrile were used as mobile phases A and B in gradient mode at a flow rate of 1 mL/min. A wavelength of 224 nm was selected for detection with a photodiode array detector. The method met the validation requirements of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The molecular weights of the impurities and degradation products were estimated using positive ESI-MS. Fifteen impurities were detected during the analysis of dapagliflozin APIs, the brand product Farxiga®, and some generic products. Three of the fifteen detected impurities (H, J, and K) exceeded the acceptable impurity limit of 0.1%. These impurities were isolated using a new preparative chromatography method and then characterized using elemental analysis, FTIR, and NMR.
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images, and it is a challenging task to capture the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries, and shapes. In this paper, an effective CBIC technique is presented which uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
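A minimal sketch of the color-moment feature extraction step described in this abstract, assuming RGB images supplied as NumPy arrays and using SciPy for skewness and kurtosis; the GA clustering stage that would consume these one-dimensional feature vectors is not shown.

import numpy as np
from scipy.stats import skew, kurtosis

def color_moment_features(image: np.ndarray) -> np.ndarray:
    """Return mean, standard deviation, skewness, kurtosis and variance per color channel."""
    feats = []
    for c in range(image.shape[2]):          # iterate over the R, G, B channels
        channel = image[:, :, c].ravel().astype(float)
        feats.extend([
            channel.mean(),                  # mean
            channel.std(),                   # standard deviation
            skew(channel),                   # skewness
            kurtosis(channel),               # kurtosis
            channel.var(),                   # variance
        ])
    return np.asarray(feats)                 # one-dimensional feature array

# Example: a random stand-in for a real image
rgb = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(color_moment_features(rgb).shape)      # (15,) = 5 moments x 3 channels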
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. ...
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting the features efficiently while maintaining fast implementation time. Furthermore, to date most of the existing studies have focused on evaluating their methods in clean environments, thus limiting understanding of their potential ...
Regression analysis is a cornerstone of statistics and mostly depends on the ordinary least squares method. As is well known, this method requires several conditions in order to operate accurately, otherwise the results can be unreliable; moreover, the absence of certain conditions can make it impossible to complete the analysis at all. Among those conditions is the multicollinearity problem, and we detect that problem between the independent variables using the Farrar–Glauber test, in addition to the requirement that the data be linear. Due to the lack of the latter condition, we resorted to the ...
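A hedged sketch of the Farrar–Glauber chi-square test for detecting multicollinearity among the independent variables, assuming the regressors are available as the columns of a NumPy matrix; the statistic and degrees of freedom follow the standard textbook form, not necessarily the exact computation used in this study.

import numpy as np
from scipy.stats import chi2

def farrar_glauber_chi2(X: np.ndarray):
    """Farrar-Glauber test: chi2 = -[n - 1 - (2k + 5)/6] * ln|R|, df = k(k-1)/2."""
    n, k = X.shape
    R = np.corrcoef(X, rowvar=False)               # correlation matrix of the regressors
    stat = -(n - 1 - (2 * k + 5) / 6.0) * np.log(np.linalg.det(R))
    df = k * (k - 1) // 2                          # degrees of freedom
    p_value = chi2.sf(stat, df)                    # small p-value suggests multicollinearity
    return stat, df, p_value

# Example with two deliberately correlated regressors
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=200), rng.normal(size=200)])
print(farrar_glauber_chi2(X))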
The present work examines oxidative desulfurization in a batch system for model fuels with 2250 ppm sulfur content, using air as the oxidant and a ZnO/AC composite prepared by a thermal co-precipitation method. Different factors were studied: composite loading (1, 1.5, and 2.5 g), temperature (25 °C, 30 °C, and 40 °C), and reaction time (30, 45, and 60 minutes). The optimum conditions for oxidative desulfurization of the model fuel were obtained using a Taguchi experimental design; the highest sulfur removal was about 33% at the optimum conditions. The kinetics and the effect of internal mass transfer were studied for oxidative desulfurization of the model fuel, and an empirical kinetic model was calculated for the model fuels ...
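The abstract mentions an empirical kinetic model; as a rough illustration only, the sketch below fits a pseudo-first-order rate constant to hypothetical sulfur-content data. Both the rate-law form and the numbers are assumptions for illustration, not the model or data reported in the work.

import numpy as np

t = np.array([0.0, 30.0, 45.0, 60.0])            # reaction time, minutes (hypothetical)
C = np.array([2250.0, 1900.0, 1750.0, 1650.0])   # sulfur content, ppm (hypothetical)

# Pseudo-first-order assumption: ln(C0/C) = k * t, so k is the slope of ln(C0/C) versus t
y = np.log(C[0] / C)
k = np.polyfit(t, y, 1)[0]                       # least-squares slope, 1/min
removal = 100.0 * (1.0 - C[-1] / C[0])           # percent sulfur removal at 60 min
print(f"k = {k:.4f} 1/min, removal = {removal:.1f}%")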
This review investigates the practice and influence of chatbots and ChatGPT as employable tools in writing for scientific academic purposes. A primary collection of 150 articles was gathered from academic databases and then systematically screened and refined to 30 studies that focused on the use of ChatGPT and chatbot technology in academic writing contexts. Chatbots and ChatGPT in writing enhancement, support for student learning at higher education institutions, scientific and medical writing, and the evolution of research and academic publishing are some of the topics covered in the reviewed literature. The review finds these tools helpful, with their greatest advantages being in areas such as structuring writings, gram...
The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for the suppression of noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is then obtained by combining the two results. The proposed method was applied by using ...
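A minimal sketch of the fusion idea, assuming PyWavelets and SciPy are available: a one-level stationary wavelet transform, an adaptive Wiener filter on the approximation band, soft thresholding of the detail bands, and an inverse transform to combine the results. This is a simplified reading of the scheme (the filter acts on the approximation coefficients directly rather than on a separately reconstructed image), not the authors' exact pipeline.

import numpy as np
import pywt
from scipy.signal import wiener

def fused_denoise(noisy: np.ndarray, wavelet: str = "db4") -> np.ndarray:
    # One-level 2-D stationary wavelet transform (image sides must be even)
    (cA, (cH, cV, cD)), = pywt.swt2(noisy, wavelet, level=1)

    # Universal threshold estimated from the diagonal detail band
    sigma = np.median(np.abs(cD)) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(noisy.size))

    cA = wiener(cA, mysize=5)                       # adaptive Wiener filter on the approximation band
    details = tuple(pywt.threshold(c, thr, mode="soft") for c in (cH, cV, cD))

    return pywt.iswt2([(cA, details)], wavelet)     # combine and reconstruct

# Example on a synthetic noisy image
clean = np.zeros((256, 256)); clean[64:192, 64:192] = 1.0
noisy = clean + 0.1 * np.random.randn(*clean.shape)
print(fused_denoise(noisy).shape)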
The need for audio compression is still a vital issue because of its significance in reducing the data size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated; the first module is based on the discrete cosine transform and the second module is based on the discrete wavelet transform ...
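A hedged sketch of the transform-coding front end only: a DCT of one audio frame followed by uniform quantization, using SciPy's DCT. The quantized indices would then go to an entropy coder such as LZW or the double shift coding mentioned in the abstract; those stages, and the wavelet-based module, are not shown, and the frame size and step value are illustrative assumptions.

import numpy as np
from scipy.fft import dct, idct

def encode_block(block: np.ndarray, step: float = 0.02) -> np.ndarray:
    coeffs = dct(block, norm="ortho")                    # decorrelate the samples
    return np.round(coeffs / step).astype(np.int32)      # uniform quantization to integer indices

def decode_block(indices: np.ndarray, step: float = 0.02) -> np.ndarray:
    return idct(indices * step, norm="ortho")            # dequantize and inverse transform

# Example: one 1024-sample frame of a synthetic 440 Hz tone
t = np.arange(1024) / 44100.0
frame = 0.5 * np.sin(2 * np.pi * 440.0 * t)
q = encode_block(frame)
rec = decode_block(q)
print("max reconstruction error:", np.max(np.abs(rec - frame)))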
Diamond-like carbon films, an amorphous hydrogenated form of carbon, were prepared from liquid cyclohexane (C6H12) using a plasma jet operating at an alternating voltage of 7.5 kV and a frequency of 28 kHz. The plasma dissociates the cyclohexane molecules and transforms them into carbon nanoparticles. The effect of the argon flow rate (0.5, 1, and 1.5 L/min) on the optical and chemical bonding properties of the films was investigated. These films were characterized by UV-Visible spectrophotometry, X-ray diffraction (XRD), Raman spectroscopy, and scanning electron microscopy (SEM). The main absorption appears around 296, 299, and 309 nm at the three argon gas flow rates. The value of the optical energy gap is 3.37, 3.55, and 3.68 eV at the different flow rates of ...
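The optical energy gaps quoted here are derived from the UV-Visible absorption data; one common way to make such an estimate is a Tauc plot, sketched below with synthetic numbers. The Tauc method and the direct-allowed exponent are assumptions for illustration, not necessarily the procedure used in this work.

import numpy as np

h_ev = 4.135667696e-15                       # Planck constant, eV*s
c = 2.99792458e8                             # speed of light, m/s

wavelength_nm = np.linspace(250, 450, 200)   # synthetic spectrum
energy_ev = h_ev * c / (wavelength_nm * 1e-9)
alpha = np.clip(energy_ev - 3.5, 0, None) ** 0.5 + 0.01   # fake absorption edge near 3.5 eV

tauc = (alpha * energy_ev) ** 2              # (alpha*h*nu)^2 for a direct-allowed transition
# Fit the steep linear region and extrapolate to the energy axis (tauc = 0)
mask = tauc > 0.5 * tauc.max()
slope, intercept = np.polyfit(energy_ev[mask], tauc[mask], 1)
print("estimated optical gap:", -intercept / slope, "eV")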