The dynamic development of computer and software technology in recent years has been accompanied by the expansion and widespread implementation of artificial intelligence (AI) based methods in many aspects of human life. A prominent field where rapid progress has been observed is high-throughput biology, whose methods generate large amounts of data that need to be processed and analyzed. AI methods are therefore increasingly applied in the biomedical field, among others for RNA-protein binding site prediction, DNA sequence function prediction, protein-protein interaction prediction, and biomedical image classification. Stem cells are widely used in biomedical research, e.g., in leukemia and other disease studies. Our proposed Deep Bayesian Neural Network (DBNN) approach for the personalized treatment of leukemia achieved significant test accuracy: the DBNN used in this study was able to classify images with an accuracy exceeding 98.73%. This study shows that a DBNN can classify cell cultures based only on unstained light-microscope images, which allows their further use. A Bayesian-based model could therefore be of great help during commercial cell culturing, and possibly a first step toward an automated or semi-automated neural-network-based model for classifying good- and bad-quality cultures once images of such cultures become available.
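The abstract does not specify the DBNN architecture; as a rough illustration of the general idea, the sketch below approximates Bayesian inference in a small image classifier with Monte Carlo dropout in PyTorch. The layer sizes, dropout rate, image size, and class count are assumptions, not the authors' model.

```python
# Minimal sketch of a Bayesian-style CNN classifier using Monte Carlo dropout.
# Architecture, image size, and class count are illustrative assumptions.
import torch
import torch.nn as nn

class MCDropoutCNN(nn.Module):
    def __init__(self, num_classes: int = 2, p: float = 0.3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p),                 # kept active at inference for MC sampling
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def predict_with_uncertainty(model, x, n_samples: int = 20):
    """Average softmax outputs over several stochastic forward passes."""
    model.train()  # keep dropout layers active
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=1) for _ in range(n_samples)]
        )
    return probs.mean(0), probs.std(0)  # predictive mean and its spread

# Example: a batch of 4 grayscale 64x64 microscope-image crops (random data here).
model = MCDropoutCNN()
mean_probs, std_probs = predict_with_uncertainty(model, torch.randn(4, 1, 64, 64))
```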
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse-gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution in Bayesian statistics, and the theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 sample sizes, and the procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected …
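As a hedged illustration of this kind of scheme, the sketch below runs a Gibbs sampler for a Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse-gamma prior on the noise variance, alternating draws from the two full conditionals. The simulated data, prior hyperparameters, and burn-in length are illustrative assumptions, not the paper's actual setup.

```python
# Gibbs sampler sketch: beta ~ N(m0, V0), sigma^2 ~ Inverse-Gamma(a0, b0),
# y = X beta + noise. Data and hyperparameters are placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations, p predictors (stand-in for the 100/300/500 designs).
n, p = 300, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

m0, V0 = np.zeros(p), 10.0 * np.eye(p)
a0, b0 = 2.0, 1.0
V0_inv = np.linalg.inv(V0)

n_iter, sigma2 = 5000, 1.0
beta_draws, sigma2_draws = [], []
XtX, Xty = X.T @ X, X.T @ y

for _ in range(n_iter):
    # beta | sigma^2, y  ~  multivariate normal
    Vn = np.linalg.inv(V0_inv + XtX / sigma2)
    mn = Vn @ (V0_inv @ m0 + Xty / sigma2)
    beta = rng.multivariate_normal(mn, Vn)
    # sigma^2 | beta, y  ~  inverse gamma (drawn via the reciprocal of a gamma variate)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
    beta_draws.append(beta)
    sigma2_draws.append(sigma2)

posterior_beta = np.mean(beta_draws[1000:], axis=0)  # posterior mean after burn-in
```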
Objective(s): To determine the impact of chemotherapy on the quality of life of patients with chronic myeloid leukemia in Baghdad city.
Methodology: A descriptive study was carried out from 30 January 2011 to October 2011. A purposive (non-probability) sample consisted of 130 patients with chronic myeloid leukemia who attended the Baghdad Teaching Hospital and the National Center for Research and Treatment of Hematology. The sample criteria were patients aged 18 years and above, excluding patients who suffered from psychological problems or other chronic illnesses. A questionnaire was adopted and developed from the European Organization for Research and Treatment of Cancer …
Acute lymphoblastic leukemia (ALL) is one of the most common diseases, so in this study the serum level of malondialdehyde and its relationship with metanephrine were investigated in acute lymphoblastic leukemia patients over one month of treatment. Several biochemical parameters (serum glucose, total serum protein, malondialdehyde, vitamin C, and metanephrine), as well as white blood cell count and blood hemoglobin levels, were analyzed in sixty patients diagnosed with acute lymphoblastic leukemia over one month of treatment and compared to a healthy control group. Statistically significant increases in white blood cell (WBC) count (p < 0.01) and in the mean concentrations of malondialdehyde (MDA) (p < 0.05) and metanephrine (p < 0.001) were observed in …
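The abstract does not name the statistical test behind these p-values; purely as an illustration of this kind of patient-versus-control comparison, the sketch below applies an independent-samples Welch t-test to synthetic biomarker values. The group means, spreads, and the choice of test are assumptions for demonstration only.

```python
# Hypothetical group comparison of a biomarker (e.g., MDA) between patients
# and healthy controls; the data below are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mda_patients = rng.normal(loc=4.2, scale=0.8, size=60)  # assumed patient values
mda_controls = rng.normal(loc=2.9, scale=0.6, size=30)  # assumed control values

t_stat, p_value = stats.ttest_ind(mda_patients, mda_controls, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```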
Metaheuristics in the swarm intelligence (SI) class have proven to be efficient and have become popular methods for solving different optimization problems. Based on the usage of memory, metaheuristics can be classified into algorithms with memory and memory-less algorithms. The absence of memory in some metaheuristics leads to the loss of the information gained in previous iterations; such metaheuristics tend to divert from promising areas of the solution search space, which leads to non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. An investigation has been performed on SI metaheuristics, memory usage, memory-less metaheuristics, and memory char…
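For a concrete picture of what "memory" means here, the sketch below implements a minimal particle swarm optimization loop whose personal-best and global-best records are exactly the kind of memory such a review examines. The objective function, bounds, and coefficient values are illustrative assumptions, not taken from the paper.

```python
# Minimal PSO sketch: the swarm's memory is the per-particle best positions
# (pbest) and the swarm-wide best position (gbest).
import numpy as np

def sphere(x):
    """Toy objective: minimize the sum of squares."""
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(42)
n_particles, dim, n_iter = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5           # inertia, cognitive, social coefficients

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest_pos = pos.copy()               # memory: each particle's best position so far
pbest_val = sphere(pos)
gbest_pos = pbest_pos[np.argmin(pbest_val)]  # memory: best position found by the swarm

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest_pos - pos) + c2 * r2 * (gbest_pos - pos)
    pos = pos + vel
    val = sphere(pos)
    improved = val < pbest_val       # update the memories where particles improved
    pbest_pos[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest_pos = pbest_pos[np.argmin(pbest_val)]

print("best value found:", sphere(gbest_pos))
```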
Detection of early clinical keratoconus (KCN) is a challenging task, even for expert clinicians. In this study, we propose a deep learning (DL) model to address this challenge. We first used the Xception and InceptionResNetV2 DL architectures to extract features from three different corneal maps collected from 1371 eyes examined in an eye clinic in Egypt. We then fused the Xception and InceptionResNetV2 features to detect subclinical forms of KCN more accurately and robustly. We obtained an area under the receiver operating characteristic curve (AUC) of 0.99 and an accuracy range of 97–100% in distinguishing normal eyes from eyes with subclinical and established KCN. We further validated the model on an independent dataset with …
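The abstract does not give the fusion details; a hedged sketch of feature-level fusion of the two named backbones in Keras is shown below. The input size, pooled-feature concatenation, classification head, and three-class output are assumptions for illustration, not the authors' exact pipeline.

```python
# Feature-level fusion of two pretrained backbones (Xception and
# InceptionResNetV2); head layers and class count are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import Xception, InceptionResNetV2

inputs = layers.Input(shape=(299, 299, 3))           # a corneal-map image

# Two frozen feature extractors applied to the same input.
xcep = Xception(include_top=False, weights="imagenet", pooling="avg")
irv2 = InceptionResNetV2(include_top=False, weights="imagenet", pooling="avg")
xcep.trainable = False
irv2.trainable = False

fused = layers.Concatenate()([xcep(inputs), irv2(inputs)])  # fuse pooled feature vectors
x = layers.Dense(256, activation="relu")(fused)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(3, activation="softmax")(x)   # e.g., normal / subclinical / established KCN

model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```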
The method of predicting the electricity load of a home using deep learning techniques is called intelligent home load prediction based on deep convolutional neural networks. This method uses convolutional neural networks to analyze data from various sources, such as weather, time of day, and other factors, to accurately predict the electricity load of a home, with the purpose of optimizing energy usage and reducing energy costs. The article proposes a deep learning-based approach for non-permanent residential electrical energy load forecasting that employs temporal convolutional networks (TCN) to model historical load collections with time-series traits and to learn the notably dynamic patterns of variation among attribute par…
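As a rough, non-authoritative sketch of the TCN idea, the model below stacks dilated causal Conv1D layers to map a window of past load readings to a next-step forecast. The window length, channel widths, dilation schedule, and single-feature input are assumptions, not the article's architecture.

```python
# TCN-style forecaster sketch: dilated causal convolutions over a window of
# past household-load readings; all sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

window, n_features = 96, 1             # e.g., 96 past readings of household load

inputs = layers.Input(shape=(window, n_features))
x = inputs
for dilation in (1, 2, 4, 8):           # exponentially growing receptive field
    x = layers.Conv1D(32, kernel_size=3, padding="causal",
                      dilation_rate=dilation, activation="relu")(x)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(1)(x)            # next-step load forecast

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
```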