Computer-aided diagnosis (CAD) has proved to be an effective and accurate method for diagnostic prediction over the years. This article focuses on the development of an automated CAD system intended to perform diagnosis as accurately as possible. Deep learning methods have produced impressive results on medical image datasets. This study employs deep learning methods in conjunction with meta-heuristic algorithms and supervised machine-learning algorithms to perform an accurate diagnosis. Pre-trained convolutional neural networks (CNNs) or an auto-encoder are used for feature extraction, whereas feature selection is performed using an ant colony optimization (ACO) algorithm. Ant colony optimization searches for the optimal features while reducing the amount of data. Lastly, diagnosis prediction (classification) is achieved using learnable classifiers. The novel framework for feature extraction and selection is based on deep learning, an auto-encoder, and ACO. The performance of the proposed approach is evaluated using two medical image datasets, chest X-ray (CXR) and magnetic resonance imaging (MRI), for predicting the presence of COVID-19 and brain tumors. Accuracy is used as the main measure to compare the performance of the proposed approach with existing state-of-the-art methods. The proposed system achieves average accuracies of 99.61% and 99.18% in diagnosing the presence of COVID-19 and brain tumors, respectively, outperforming all other methods. Based on these results, physicians or radiologists can confidently utilize the proposed approach for diagnosing COVID-19 patients and patients with specific brain tumors.
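The abstract above describes a three-stage pipeline (deep feature extraction, ACO-based feature selection, supervised classification) but not its implementation details. The sketch below is a minimal, hypothetical illustration of the ACO feature-selection stage only, assuming features have already been extracted by a pre-trained CNN or auto-encoder; it uses scikit-learn's breast-cancer data as a stand-in feature matrix, mutual information as the heuristic desirability, an SVM's cross-validated accuracy as the fitness, and illustrative hyperparameters that are not taken from the paper.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def aco_feature_selection(X, y, n_ants=10, n_iters=20, subset_size=10,
                          alpha=1.0, beta=1.0, rho=0.2, seed=0):
    """Select a feature subset with a simplified ant colony optimization loop."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    tau = np.ones(n_features)                                   # pheromone per feature
    eta = mutual_info_classif(X, y, random_state=seed) + 1e-6   # heuristic desirability
    best_subset, best_score = None, -np.inf

    for _ in range(n_iters):
        iter_subset, iter_score = None, -np.inf
        for _ in range(n_ants):
            prob = (tau ** alpha) * (eta ** beta)
            prob /= prob.sum()
            subset = rng.choice(n_features, size=subset_size, replace=False, p=prob)
            score = cross_val_score(SVC(kernel="rbf"), X[:, subset], y, cv=3).mean()
            if score > iter_score:
                iter_subset, iter_score = subset, score
        tau *= (1.0 - rho)                       # pheromone evaporation
        tau[iter_subset] += rho * iter_score     # reinforce the iteration's best subset
        if iter_score > best_score:
            best_subset, best_score = iter_subset, iter_score
    return best_subset, best_score

if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)   # stand-in for CNN/auto-encoder features
    subset, score = aco_feature_selection(X, y)
    print(f"selected {len(subset)} features, cross-validated accuracy {score:.3f}")
```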
Evaluation of the Dot-ELISA Test for the Diagnosis of Visceral Leishmaniasis in Infected Children
Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms.
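Since the abstract names entropy discretization as one of the supported operations, the following sketch shows a single-cut, information-gain-based discretization of one continuous attribute. It is a simplified illustration operating on raw values, not on the paper's multi-resolution summarization structure, and the function names and data in it are hypothetical.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_split(values, labels):
    """Return the cut point on a continuous attribute that maximizes information gain."""
    order = np.argsort(values)
    v, y = values[order], labels[order]
    base = entropy(y)
    best_gain, best_cut = 0.0, None
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue                              # only cut between distinct values
        cut = (v[i] + v[i - 1]) / 2.0
        left, right = y[:i], y[i:]
        cond = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        gain = base - cond
        if gain > best_gain:
            best_gain, best_cut = gain, cut
    return best_cut, best_gain

if __name__ == "__main__":
    values = np.array([1.0, 2.0, 2.5, 3.0, 7.0, 8.0, 9.0])
    labels = np.array([0, 0, 0, 0, 1, 1, 1])
    cut, gain = best_entropy_split(values, labels)
    print(f"cut at {cut}, information gain {gain:.3f} bits")  # pure split at 5.0
```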
This research studies fuzzy sets, one of the most modern concepts applied in various practical and theoretical areas and in many fields of life. It addresses the fuzzy random variable, whose values are not real numbers but fuzzy numbers, since it expresses vague or uncertain phenomena whose measurements are not precise. Fuzzy data were presented for a two-way test, and the analysis of variance method for fuzzy random variables was applied; this method depends on a number of assumptions, which is a problem that prevents its use when those assumptions are not satisfied.
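The abstract does not specify how the fuzzy observations are handled numerically. The sketch below shows one common simplification, assumed here for illustration only: representing each observation as a triangular fuzzy number, defuzzifying it by its centroid, and then running a classical one-way analysis of variance on the resulting crisp values with SciPy. The group data are hypothetical and this is not necessarily the procedure used in the study.

```python
import numpy as np
from scipy.stats import f_oneway

def centroid(tfn):
    """Defuzzify a triangular fuzzy number (left, peak, right) by its centroid."""
    left, peak, right = tfn
    return (left + peak + right) / 3.0

# Hypothetical fuzzy observations for three treatment groups.
group_a = [(1.0, 2.0, 3.0), (1.5, 2.5, 3.5), (2.0, 3.0, 4.0)]
group_b = [(3.0, 4.0, 5.0), (3.5, 4.5, 5.5), (4.0, 5.0, 6.0)]
group_c = [(1.0, 1.5, 2.0), (1.2, 1.8, 2.4), (0.8, 1.6, 2.2)]

# Classical one-way ANOVA on the defuzzified (crisp) values.
stat, pvalue = f_oneway(*[[centroid(x) for x in g] for g in (group_a, group_b, group_c)])
print(f"F = {stat:.2f}, p = {pvalue:.4f}")
```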
In this work, medical-grade zinc oxide was produced from zinc scrap instead of by the traditional method; it is used for medical applications such as treating skin diseases, and Iraq imports around 50 ton/year for the Samarra plant. The produced powder has a particle size of less than 5 micron and a purity of more than 99.98%. A pilot plant with a yield capacity of 15 kg/8 hours was also designed and manufactured.
In the present work, strong-lensing observations of several gravitational lenses have been adopted to study the geometry of the universe and to explain the physics and size of quasars. The first procedure was to study the geometry of the lensing system in order to determine the relation between the redshifts of the observed objects and their distances. The second procedure was to compare the angular diameter distances (DA) calculated for the Euclidean case with those from the Friedmann models, and then to evaluate the diameter of the lens in the system. The results indicate that the phenomenon is constrained by the ratio of the distance between the lens and the source to the diameter of the lens.
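To make the comparison of angular diameter distances concrete, the sketch below computes DA in two ways: a simple Euclidean (Hubble-law) approximation cz/H0 and a flat Friedmann (LambdaCDM) model. The interpretation of the "Euclidean case" and the cosmological parameters (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7) are assumptions for illustration, not values taken from the paper.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458   # speed of light [km/s]
H0 = 70.0             # assumed Hubble constant [km/s/Mpc]
OM, OL = 0.3, 0.7     # assumed matter and dark-energy densities (flat model)

def da_euclidean(z):
    """Angular diameter distance in the Euclidean (Hubble-law) approximation [Mpc]."""
    return C_KM_S * z / H0

def da_friedmann(z):
    """Angular diameter distance in a flat Friedmann (LambdaCDM) model [Mpc]."""
    integrand = lambda zp: 1.0 / np.sqrt(OM * (1.0 + zp) ** 3 + OL)
    comoving, _ = quad(integrand, 0.0, z)           # comoving distance in units of c/H0
    return (C_KM_S / H0) * comoving / (1.0 + z)     # D_A = D_C / (1 + z) for a flat universe

for z in (0.5, 1.0, 2.0):
    print(f"z = {z}: Euclidean {da_euclidean(z):7.1f} Mpc, Friedmann {da_friedmann(z):7.1f} Mpc")
```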
Copper telluride thin films with thicknesses of 700 nm and 900 nm were prepared by thermal evaporation onto cleaned Si substrates kept at 300 K under a vacuum of about 4×10⁻⁵ mbar. XRD analysis and atomic force microscopy (AFM) measurements were used to study the structural properties. The sensitivity (S) of the fabricated sensors to NO2 and H2 was measured at room temperature. The relationship between S and the thickness of the sensitive film was investigated, and higher S values were recorded for thicker sensors. The results showed that the best sensitivity was obtained for the Cu2Te film of 900 nm thickness exposed to H2 gas.