In this paper, we present the first experimental electron momentum density of Cu2Sb, measured at an intermediate resolution (0.6 a.u.) using a 59.54 keV 241Am Compton spectrometer. The measurements are compared with theoretical Compton profiles computed using density functional theory (DFT) within the linear combination of atomic orbitals (LCAO) method. In the DFT calculations, the Perdew-Burke-Ernzerhof (PBE) scheme is employed to treat correlation, whereas exchange is included following the Becke scheme. The various approximations within LCAO-DFT show relatively better agreement with the experimental Compton data. Ionic model calculations for a number of configurations (Cu+x/2)2(Sb-x) (0.0 ≤ x ≤ 2.0) are also performed using free-atom profiles; the ionic model suggests a transfer of 2.0 electrons per Cu atom from the Cu 4s state to the Sb 5p state.
In this paper, a new brain tumour detection method is presented in which normal MRI slices are separated from abnormal ones. The method comprises three main phases: extraction of the cerebral tissue; detection of abnormal blocks together with a fine-tuning mechanism; and detection of abnormal slices based on the detected abnormal blocks. The improvement achieved by the proposed method is assessed and verified through experimental tests. In terms of qualitative assessment, the performance of the proposed method is found to be satisfactory and may contribute to the development of reliable MRI brain tumour diagnosis and treatment.
Data mining plays an important role in healthcare for discovering hidden relationships in large datasets, especially in breast cancer diagnostics; breast cancer is among the leading causes of death in the world. In this paper, two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. For the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while with entropy the feature selection gives an accuracy of 86.77%. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimum …
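As a toy illustration of the split criterion mentioned above (our own sketch, not the authors' implementation; the function names and sample data are hypothetical), the Gini index for a candidate split such as Age < 49.5 can be computed as:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class frequencies."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gini(ages, labels, threshold):
    """Weighted Gini impurity after splitting the data on Age < threshold."""
    left = [y for a, y in zip(ages, labels) if a < threshold]
    right = [y for a, y in zip(ages, labels) if a >= threshold]
    n = len(labels)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

# Toy data: a split at 49.5 that separates the two grades perfectly.
ages = [35, 42, 48, 51, 60, 66]
grades = ["low", "low", "low", "high", "high", "high"]
print(split_gini(ages, grades, 49.5))  # 0.0: both child nodes are pure
```

A decision tree chooses, at each node, the threshold that minimizes this weighted impurity; the entropy criterion replaces the Gini formula with -sum(p*log2(p)).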
In digital images, protecting sensitive visual information against unauthorized access is a critical issue, and robust encryption methods are the best way to preserve such information. This paper introduces a model designed to enhance the performance of the Tiny Encryption Algorithm (TEA) in encrypting images. Two approaches are suggested for the image cipher process as a preprocessing step before applying TEA. This step aims to de-correlate adjacent pixel values and weaken their statistical relationships in preparation for encryption. The first approach applies an Affine transformation for image encryption at two layers, utilizing a different key set for each layer. Th…
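A minimal sketch of an affine transformation over pixel intensities, one plausible form of the preprocessing layer described above (assumptions are ours: byte-valued pixels and the map p → (a·p + b) mod 256 with odd a; the paper's exact two-layer construction and key sets are not specified here):

```python
def affine_encrypt(pixels, a, b):
    """Map each byte p to (a*p + b) mod 256; a must be odd to be invertible mod 256."""
    return [(a * p + b) % 256 for p in pixels]

def affine_decrypt(pixels, a, b):
    """Invert the map using the modular inverse of a mod 256."""
    a_inv = pow(a, -1, 256)  # modular inverse; requires Python 3.8+
    return [(a_inv * (p - b)) % 256 for p in pixels]

# Round trip on a few sample pixel values with a hypothetical key pair (a, b).
plain = [0, 100, 255, 17]
cipher = affine_encrypt(plain, a=17, b=89)
assert affine_decrypt(cipher, a=17, b=89) == plain
```

Applying two such layers with different (a, b) pairs mirrors the two-layer, two-key-set idea; note that this value-only map does not by itself de-correlate neighbouring pixels, which would require a position-dependent component.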
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev…
In many industries, especially oil companies in Iraq, large quantities of water are consumed, producing oil-contaminated water that can cause major pollution of agricultural land and rivers. The aim of the present work is to enhance the efficiency of the dispersed air flotation technique by using a highly effective and cost-efficient coagulant to treat a gas oil emulsion. The experimental work was carried out using a bubble column made of Perspex glass (5 cm I.D., 120 cm height) with a liquid depth of 60 cm. Different dosages of sawdust + bentonite at a ratio of 2:1 (0.5 + 0.25, 1 + 0.5, and 2 + 1) g and alum at concentrations of 10, 20, and 30 mg/l, at different pH values (4 and 7), were used to determine the optimum coagulant dosages. Jar test exper…
The need for steganography methods to hide secret messages in images has risen. This study therefore develops a practical steganography procedure for hiding text in an image. The user provides the system with both the text and a cover image, and the system produces a resulting image that contains the hidden text. The suggested technique hides the text inside the header formats of a digital image. The Least Significant Bit (LSB) method is used to hide the message or text in order to preserve the features and characteristics of the original image. A new method is applied that uses the whole image (header formats) to hide the text. The experimental results show that the suggested technique gives a higher embe…
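The general LSB idea referred to above can be sketched as follows (a minimal illustration of standard LSB embedding, not the paper's header-format variant; all names are ours):

```python
def lsb_embed(pixels, message):
    """Write the message bits into the least significant bit of successive pixels."""
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to the message bit
    return out

def lsb_extract(pixels, n_chars):
    """Read back n_chars characters from the pixels' least significant bits."""
    bits = [p & 1 for p in pixels[:n_chars * 8]]
    return "".join(
        chr(int("".join(map(str, bits[i:i + 8])), 2))
        for i in range(0, len(bits), 8)
    )

cover = list(range(64))          # stand-in for greyscale pixel values
stego = lsb_embed(cover, "Hi")
assert lsb_extract(stego, 2) == "Hi"
```

Because each pixel changes by at most 1 in value, the stego image is visually indistinguishable from the cover, which is why LSB embedding preserves the image's features.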
Using remote sensing technology and modelling methodologies to monitor changes in land surface temperature (LST) and urban heat islands (UHI) has become an essential reference for decisions on sustainable land use. This study estimates LST and UHI in Salah al-din Province to contribute to land management, urban planning, and climate resilience in the region. As a result of environmental changes in recent years, LANDSAT satellite imagery from 2014 to 2024 was used to estimate the LST and UHI indexes in Salah al-din Province. ArcGIS 10.7 was used to calculate the indices, and the normalized difference vegetation index (NDVI) was calculated, as it is closely related to extracting LST…
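The NDVI computation mentioned above follows the standard formula NDVI = (NIR − Red) / (NIR + Red); a minimal NumPy sketch (the band arrays and names are hypothetical; for Landsat 8/9, the NIR and red channels are bands 5 and 4):

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red).

    eps guards against division by zero over water or no-data pixels.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance grids: vegetation shows high NIR and low red reflectance,
# so NDVI is high over vegetated pixels and near zero over bare ground.
nir_band = np.array([[0.50, 0.40], [0.30, 0.10]])
red_band = np.array([[0.10, 0.10], [0.10, 0.10]])
print(ndvi(nir_band, red_band).round(2))
```

NDVI feeds into LST retrieval because it is used to estimate per-pixel surface emissivity, which corrects the brightness temperature derived from the thermal bands.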
Numeral recognition is an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed, achieving adequate recognition accuracy and execution time remains challenging. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is needed, one that meets the desired accuracy by extracting features efficiently while maintaining a fast implementation time. Furthermore, most existing studies to date evaluate their methods in clean environments, thus limiting understanding of their potential a…