The electrocardiogram (ECG) is the recording of the heart's electrical potential versus time. ECG analysis is widely used in cardiology to detect heart disease. ECGs are non-stationary signals that are often contaminated by several types of noise from different sources. In this study, simulated noise models are proposed for power-line interference (PLI), electromyogram (EMG) noise, baseline wander (BW), white Gaussian noise (WGN), and composite noise. Various processing techniques have recently been proposed to suppress noise and extract the essential morphology of an ECG signal. In this paper, the wavelet transform (WT) is applied to noisy ECG signals. The graphical user interface (GUI)
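The wavelet-denoising step described above can be illustrated with a minimal numpy-only sketch; the Haar wavelet, the universal soft threshold, the synthetic test signal, and all function names here are illustrative assumptions, not the paper's actual implementation or GUI:

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar transform level."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(signal, levels=4):
    """Soft-threshold Haar detail coefficients (universal threshold)."""
    a, details = signal.astype(float), []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    # Noise level estimated from the finest detail band (MAD rule)
    sigma = np.median(np.abs(details[0])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    details = [np.sign(d) * np.maximum(np.abs(d) - thresh, 0) for d in details]
    for d in reversed(details):            # reconstruct from coarsest level
        a = haar_idwt(a, d)
    return a

# Synthetic low-frequency stand-in for an ECG, corrupted with WGN
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 1024, endpoint=False)
clean = np.sin(2 * np.pi * 1.2 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = denoise(noisy)
```

In practice a smoother wavelet (e.g. Daubechies-4 via PyWavelets) would preserve QRS morphology better than Haar; the structure of the algorithm is the same.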
In information security, fingerprint verification is one of the most common approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several studies have applied different techniques to the matching process, such as fuzzy vault and image filtering approaches. Yet these approaches still suffer from imprecise articulation of the biometrics' patterns of interest. Deep learning architectures such as the Convolutional Neural Network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance compared
This research studies the processes of shape creation and encryption in the forms and contents of designs produced with computer technology, treating creative work as definable, renewable, and transformable. The transformation of a theme's shape, together with the encryption of form and content in textile designs, conveys meaning or substance that may otherwise remain invisible; encryption in digital fabric design employs modern, refined ideas, realized through technique, to produce work that strikes audiences as novel and innovative. The research comprises four chapters. Chapter I deals with the research problem and its scope (form and content encryption with digital designs in women's contemporary fabrics)
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Providing labeled data usually requires manual annotation by human annotators with extensive background knowledge, a process that is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically. Ultimately, more data generally yields a better DL model, though performance also depends on the application. This issue is the main barrier for
Introduction: Cutaneous leishmaniasis (CL) is a common protozoan disease in Iraq characterized by localized ulcers, primarily on exposed skin. This study aimed to investigate the hematological parameters of infected patients using a complete blood count (CBC) in the endemic area of Diyala Governorate, northeast of Baghdad. Parameters were studied in newly diagnosed, untreated individuals and in patients receiving sodium antimony gluconate. Methodology: Hematological screening was performed on blood samples from 161 patients with microscopically diagnosed cutaneous leishmaniasis before and after treatment. Anti-Leishmania IgG was also assessed by ELISA in seropositive and seronegative subjects. Results: The newly diagnosed, untreated patients
Modern statistical techniques offer a range of methodologies for modelling time series data, with conditional and unconditional approaches providing complementary insights that enhance overall model accuracy. This article introduces a modified ARIMA model employing conditional and unconditional parameter estimates, and provides the methodology for the new model. The prediction process, one and two steps ahead, is covered in detail, and a novel algorithm is presented. The best model is selected using several criteria: the coefficient of determination (R²), the root mean squared error (RMSE), and the mean absolute scaled error (MASE). The suggested model is applied to a monthly petrol sales dataset (Jan
Transient mixed convection heat transfer in a confined porous medium heated by a periodic sinusoidal heat flux is investigated numerically in the present paper. The Poisson-type pressure equation, which results from substituting the Darcy momentum equation into the continuity equation, was discretized using a finite volume technique. The energy equation was solved by a fully implicit control-volume-based finite difference formulation for the diffusion terms, with the quadratic upstream interpolation for convective kinematics (QUICK) scheme used to discretize the convective terms and the temperature values at the control volume faces. The numerical study covers a range of hydrostatic pressure heads and sinusoidal amplitudes
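The derivation of the Poisson-type pressure equation mentioned above can be sketched as follows; the notation (u the Darcy velocity, p pressure, K permeability, μ dynamic viscosity, ρ density, g gravity) and the assumption of constant K/μ are ours, not taken from the paper:

```latex
% Darcy momentum equation in a porous medium, including buoyancy
\mathbf{u} = -\frac{K}{\mu}\left(\nabla p - \rho\,\mathbf{g}\right)
% Continuity for incompressible flow
\nabla \cdot \mathbf{u} = 0
% Substituting Darcy's law into continuity (constant K/\mu) gives the
% Poisson-type pressure equation, with density variations as the source:
\nabla^{2} p = \nabla \cdot \left( \rho\,\mathbf{g} \right)
```

With uniform density the right-hand side vanishes and the equation reduces to Laplace's equation; the temperature-dependent density from the energy equation is what makes it Poisson-type in mixed convection.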
KE Sharquie, JR Al-Rawi, AA Noaimi, RA Al-Khammasi, Iraqi Journal of Community Medicine, 2018
This research studies the linear regression model when the random errors are autocorrelated and normally distributed. Linear regression analysis describes the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the others. Four estimation methods were compared (the least squares method, the unweighted average method, Theil's method, and Laplace's method) using the mean squared error (MSE) criterion and simulation; the study covered four sample sizes (15, 30, 60, 100). The results showed that the least squares method performs best. The four methods were then applied to data on buckwheat production and cultivated area for the provinces of Iraq for the years 2010, 2011, 2012,
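Two of the compared estimators can be sketched in a few lines; we read "Theil's method" as the median-of-pairwise-slopes (Theil-Sen) estimator, which is an assumption, and the simulated data, sample size, and AR(1) error structure below are illustrative, not the study's design:

```python
import numpy as np

def theil_sen(x, y):
    """Theil slope: median of all pairwise slopes; intercept via medians."""
    i, j = np.triu_indices(len(x), k=1)
    slope = np.median((y[j] - y[i]) / (x[j] - x[i]))
    return np.median(y) - slope * np.median(x), slope

def ols(x, y):
    """Ordinary least squares for a simple linear regression."""
    slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    return y.mean() - slope * x.mean(), slope

# Simulate y = 1 + 2x with AR(1) autocorrelated errors (rho = 0.6)
rng = np.random.default_rng(2)
n, rho = 60, 0.6
x = np.linspace(0, 10, n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + rng.standard_normal()
y = 1.0 + 2.0 * x + e

# Compare in-sample MSE of the fitted lines
results = {}
for name, (b0, b1) in [("OLS", ols(x, y)), ("Theil", theil_sen(x, y))]:
    results[name] = np.mean((y - (b0 + b1 * x)) ** 2)
```

In-sample MSE always favors OLS by construction; a simulation study like the one described would instead compare the estimators' MSE against the true coefficients across repeated samples.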