Support vector machine (SVM) is a popular supervised learning algorithm based on margin maximization. It has a high training cost and does not scale well to a large number of data points. We propose a multiresolution algorithm MRH-SVM that trains SVM on a hierarchical data aggregation structure, which also serves as a common data input to other learning algorithms. The proposed algorithm learns SVM models using high-level data aggregates and only visits data aggregates at more detailed levels where support vectors reside. In addition to performance improvements, the algorithm has advantages such as the ability to handle data streams and datasets with imbalanced classes. Experimental results show significant performance improvements in comparison with existing SVM algorithms.
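As a rough illustration of this strategy, the sketch below summarizes each class by cluster centroids, trains an SVM on those coarse aggregates, and then expands only the aggregates that become support vectors before retraining. It is a minimal two-level sketch of the general idea, not the paper's MRH-SVM algorithm; the clustering method, weighting, and expansion rule are assumptions.

```python
# Minimal two-level sketch: coarse SVM on aggregates, refine near the margin.
# Illustrative only; not the paper's MRH-SVM algorithm.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def aggregate(X, n_clusters):
    """Summarise points by K-means centroids; also return member indices."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    members = [np.where(km.labels_ == k)[0] for k in range(n_clusters)]
    return km.cluster_centers_, members

def two_level_svm(X, y, n_coarse=20):
    centers, members, labels, sizes = [], [], [], []
    for c in np.unique(y):                       # aggregate each class separately
        idx = np.where(y == c)[0]
        cents, membs = aggregate(X[idx], min(n_coarse, len(idx)))
        centers.append(cents)
        members += [idx[m] for m in membs]
        labels += [c] * len(cents)
        sizes += [len(m) for m in membs]
    C, L, W = np.vstack(centers), np.array(labels), np.array(sizes, float)

    # Level 1: SVM on coarse aggregates, weighted by how many points each summarises.
    coarse = SVC(kernel="rbf").fit(C, L, sample_weight=W)

    # Level 2: drill down only where an aggregate ended up as a support vector.
    fine_X, fine_y = [], []
    for i, (pts, lab) in enumerate(zip(members, L)):
        if i in coarse.support_:                 # aggregate lies near the margin
            fine_X.append(X[pts]); fine_y.append(np.full(len(pts), lab))
        else:                                    # keep the coarse summary
            fine_X.append(C[i:i + 1]); fine_y.append(np.array([lab]))
    return SVC(kernel="rbf").fit(np.vstack(fine_X), np.concatenate(fine_y))
```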
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents clarifications of the workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they propose are outdated and poorly suited to the actual permeability computation.
This study proposes a mathematical approach and a numerical experiment for a simple solution of cardiac blood flow in the heart's blood vessels. A mathematical model of human blood flow through arterial branches was studied and computed using the Navier-Stokes partial differential equations with a finite element analysis (FEA) approach. Furthermore, FEA is applied to the steady flow of two-dimensional viscous fluids through different geometries. The validity of the computational method is assessed by comparing the numerical experiments with the results of analyses of different functions. The numerical analysis showed that the highest blood flow velocity of 1.22 cm/s occurred at the center of the vessel, where the flow tends to be laminar.
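For reference, the governing equations for steady, incompressible flow of a viscous fluid are stated below in their standard form; the study's specific vessel geometries and boundary conditions are not reproduced here.

```latex
% Steady, incompressible Navier-Stokes equations (standard form):
% \mathbf{u} = velocity, p = pressure, \rho = density, \mu = dynamic viscosity.
\begin{aligned}
\rho\,(\mathbf{u}\cdot\nabla)\,\mathbf{u} &= -\nabla p + \mu\,\nabla^{2}\mathbf{u},\\
\nabla\cdot\mathbf{u} &= 0 .
\end{aligned}
```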
Through this research, we have tried to evaluate health programs and their effectiveness in improving the health situation through a study of the reality of health institutions in Baghdad, in order to identify the main reasons behind the increase in maternal mortality, using two regression models: the Poisson regression model and the hierarchical Poisson regression model. The study of that indicator (deaths) was carried out through a comparison between the estimation methods of the models used. The maximum likelihood method was used to estimate the Poisson regression model, whereas the full maximum likelihood method was used for the hierarchical Poisson regression model.
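For reference, the two model forms named above can be written as follows; the notation is ours, and the covariates actually used in the study are not listed in the abstract.

```latex
% Poisson regression model (log link):
y_i \mid \mathbf{x}_i \sim \mathrm{Poisson}(\mu_i),
\qquad \log \mu_i = \mathbf{x}_i^{\top}\boldsymbol{\beta}.

% Hierarchical Poisson regression model (random effect u_j for group j,
% e.g. a health institution):
y_{ij} \mid \mathbf{x}_{ij}, u_j \sim \mathrm{Poisson}(\mu_{ij}),
\qquad \log \mu_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j,
\qquad u_j \sim \mathcal{N}(0, \sigma_u^{2}).
```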
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a specific quantile, we develop an expression for the power parameter that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between the power parameter and the quantile level via the hierarchical model. The proposed methodology is illustrated with a real data example.
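For reference, the standard power prior takes the form below; the symbol a0 for the power parameter is ours, since the abstract's own symbol did not survive extraction, and the asymmetric Laplace working likelihood shown is the usual choice in Bayesian QReg rather than a detail stated in the abstract.

```latex
% Power prior built from historical data D_0 (a_0 is the power parameter):
\pi(\boldsymbol{\beta} \mid D_0, a_0) \;\propto\;
L(\boldsymbol{\beta} \mid D_0)^{a_0}\,\pi_0(\boldsymbol{\beta}),
\qquad 0 \le a_0 \le 1 .

% Working likelihood for quantile regression at level \tau,
% based on the asymmetric Laplace distribution with check function \rho_\tau:
L(\boldsymbol{\beta} \mid D) \;\propto\; \prod_{i=1}^{n}
\exp\!\bigl\{-\rho_\tau\bigl(y_i - \mathbf{x}_i^{\top}\boldsymbol{\beta}\bigr)\bigr\},
\qquad \rho_\tau(u) = u\,\bigl(\tau - \mathbb{I}(u<0)\bigr).
```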
Diabetes is one of the most rapidly increasing chronic diseases, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and of its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron (MLP), k-nearest neighbors (KNN), and random forest. We conducted two experiments: the first used all 12 features of the dataset, where the random forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes of the dataset.
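A minimal sketch of the two experiments described above is shown below, assuming the data are available as a CSV file; the file name, label column, and five-attribute subset are hypothetical placeholders, not the study's actual configuration.

```python
# Compare MLP, KNN and random forest on the full feature set and on a subset.
# File name, column names and the five-feature subset are hypothetical.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("iraqi_diabetes.csv")                    # hypothetical file
X_full = df.drop(columns=["CLASS"])                       # 12 features
y = df["CLASS"]
X_subset = X_full[["HbA1c", "BMI", "Chol", "TG", "AGE"]]  # hypothetical 5 attributes

models = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    for label, X in [("all 12 features", X_full), ("5 attributes", X_subset)]:
        acc = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
        print(f"{name:12s} {label:16s} accuracy = {acc:.3f}")
```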
A common approach to color image compression starts by transforming the red, green, and blue (RGB) color model into a desired color model, then applying compression techniques, and finally transforming the results back into the RGB model. In this paper, a new color image compression method based on multilevel block truncation coding (MBTC) and vector quantization (VQ) is presented. By exploiting the human visual system's response to color, a bit allocation process is implemented to distribute the bits for encoding more effectively. Modifications have been implemented to improve the performance efficiency of VQ, combining the computational simplicity and edge-preservation properties of MBTC with high compression.
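To illustrate the building block behind MBTC, the sketch below implements classic two-level block truncation coding on a single channel; the paper's multilevel extension, HVS-driven bit allocation, and VQ stage are not reproduced here.

```python
# Classic two-level block truncation coding (BTC) for one channel block:
# keep a 1-bit map plus two reconstruction levels that preserve mean and variance.
import numpy as np

def btc_encode_block(block):
    """Encode one NxN block as (bitmap, low level a, high level b)."""
    m, s = block.mean(), block.std()
    bitmap = block >= m                       # 1 bit per pixel
    n, q = block.size, int(bitmap.sum())
    if q in (0, n):                           # flat block: one level suffices
        return bitmap, m, m
    a = m - s * np.sqrt(q / (n - q))          # level for pixels below the mean
    b = m + s * np.sqrt((n - q) / q)          # level for pixels at/above the mean
    return bitmap, a, b

def btc_decode_block(bitmap, a, b):
    return np.where(bitmap, b, a)

# Usage on one random 4x4 block
block = np.random.default_rng(0).integers(0, 256, (4, 4)).astype(float)
bitmap, a, b = btc_encode_block(block)
print(np.round(btc_decode_block(bitmap, a, b)))
```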
The study aims to show the active role of internal auditors by explaining what they should be obliged to follow when writing reports and financial and non-financial statements in accordance with international accounting standards, so that these are characterized by transparency and integrity. It also aims to establish the independence that auditors should enjoy by connecting them to audit committees so that they can provide additional services, in addition to assessing the instruments of control to evaluate risks, giving consultations, and providing services related to governance and the independence of the supervising council.
This paper deals with the nonlinear, large-angle bending dynamic analysis of curved beams, investigated by modeling wave transmission along curved members. The approach relies on wave propagation in one-dimensional structural elements using the method of characteristics (MOC), which is found to be a suitable method for idealizing wave propagation inside structural systems. Timoshenko's beam theory, which includes transverse shear deformation and rotary inertia effects, is adopted in the analysis. Only geometrical nonlinearity is considered in this study, and the material is assumed to be linearly elastic. Different boundary conditions and loading cases are examined.
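For reference, the standard linear Timoshenko equations for a straight beam are shown below; the paper's curved-member, geometrically nonlinear formulation solved by the MOC is not reproduced here.

```latex
% Timoshenko beam theory for a straight beam (standard linear form):
% w = transverse deflection, \varphi = cross-section rotation, q = distributed load,
% \rho = density, A = area, I = second moment of area,
% E = Young's modulus, G = shear modulus, \kappa = shear correction factor.
\begin{aligned}
\rho A\,\frac{\partial^{2} w}{\partial t^{2}} &=
\frac{\partial}{\partial x}\!\left[\kappa G A\left(\frac{\partial w}{\partial x}-\varphi\right)\right] + q(x,t),\\[4pt]
\rho I\,\frac{\partial^{2} \varphi}{\partial t^{2}} &=
\frac{\partial}{\partial x}\!\left(E I\,\frac{\partial \varphi}{\partial x}\right)
+ \kappa G A\left(\frac{\partial w}{\partial x}-\varphi\right).
\end{aligned}
```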
The precise classification of DNA sequences is pivotal in genomics, holding significant implications for personalized medicine. The stakes are particularly high when classifying key genetic markers such as BRAC, related to breast cancer susceptibility; BRAF, associated with various malignancies; and KRAS, a recognized oncogene. Conventional machine learning techniques often necessitate intricate feature engineering and may not capture the full spectrum of sequence dependencies. To ameliorate these limitations, this study employs an adapted U-Net architecture, originally designed for biomedical image segmentation, to classify DNA sequences. An attention mechanism was also tested along with the U-Net architecture to precisely classify DNA sequences.
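A minimal sketch of a one-dimensional U-Net-style network adapted for sequence classification is given below (PyTorch); the depth, channel widths, fixed sequence length, and pooling head are illustrative assumptions rather than the paper's exact configuration.

```python
# 1-D U-Net-style encoder/decoder with a classification head.
# Input: one-hot encoded DNA (A/C/G/T -> 4 channels); output: class logits.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv1d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv1d(c_out, c_out, kernel_size=3, padding=1), nn.ReLU(),
    )

class UNet1DClassifier(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.enc1 = conv_block(4, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool1d(2)
        self.up2 = nn.ConvTranspose1d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)           # 64 (skip) + 64 (upsampled)
        self.up1 = nn.ConvTranspose1d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                                  nn.Linear(32, n_classes))

    def forward(self, x):                          # x: (batch, 4, seq_len)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)                       # logits, e.g. BRAC / BRAF / KRAS

# Usage: a batch of 8 one-hot sequences of length 256
logits = UNet1DClassifier()(torch.randn(8, 4, 256))
print(logits.shape)                                # torch.Size([8, 3])
```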