Classification of imbalanced data is an important problem. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and have been applied in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have many more samples than others. Imbalanced data results in poor performance and a bias toward the majority class at the expense of the other classes. In this paper, we propose three techniques based on the Over-Sampling (O.S.) approach for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Borderline-SMOTE + Imbalance Ratio (IR), and Adaptive Synthetic Sampling (ADASYN) + IR algorithms. Each technique generates synthetic samples for the minority class to achieve balance between the minority and majority classes and then calculates the IR between them. Experimental results show that the Improved SMOTE algorithm outperforms the Borderline-SMOTE + IR and ADASYN + IR algorithms because it achieves a higher balance between the minority and majority classes.
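The core over-sampling step shared by the SMOTE family is interpolation between a minority sample and one of its nearest minority-class neighbours. The sketch below shows that step only, in plain Python; it is an illustration of the general SMOTE idea, not the paper's Improved SMOTE, and the function and parameter names are our own.

```python
import random

def smote_like_oversample(minority, n_new, k=3, seed=0):
    """Generate synthetic minority samples by interpolating between a
    sample and one of its k nearest minority-class neighbours
    (the core SMOTE idea; simplified sketch, not the paper's algorithm)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x within the minority class (brute force)
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random point on the segment between x and nb
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.2)]
new_points = smote_like_oversample(minority, n_new=4)
print(len(new_points))
```

After over-sampling, the imbalance ratio is simply the majority-class count divided by the minority-class count; a ratio near 1 indicates the balance the paper evaluates.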
Listening comprehension of Iraqi EFL college students is given neither practice time nor a place in the programme of the Department of English; as a result, students are not well prepared to comprehend spoken language, and Iraqi EFL college students are deficient in comprehending spoken English. Listening strategies therefore require a large amount of consistent practice. The present study aims at finding out the effect of teaching a proposed listening-strategies programme on EFL university students' listening comprehension. The sample consists of 104 first-year college students at the Department of English Language, College of Education Ibn-Rushed for Humanities. The programme deals with the following strategies: summarizing, n
This paper studies a novel technique based on two effective methods: a modified Laplace variational iteration method (MLVIM) and a modified variational iteration method (MVIM), used to solve PDEs with variable coefficients. The current modification, the MLVIM, is based on coupling the variational iteration method (VIM) with the Laplace transform (LT). In our proposal there is no need to calculate the Lagrange multiplier. We apply the Laplace transform to the problem; furthermore, the nonlinear terms are handled using the homotopy perturbation method (HPM). Some examples are given to compare results between the two methods and to verify their reliability.
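For reference, the classical VIM correction functional for an equation of the form $L u + N u = g(x,t)$ has the general shape below (this is the standard textbook form, not necessarily the paper's exact scheme):

```latex
u_{n+1}(x,t) = u_n(x,t)
  + \int_0^t \lambda(s)\,\Big( L\,u_n(x,s) + N\,\tilde{u}_n(x,s) - g(x,s) \Big)\, ds
```

where $\lambda$ is the Lagrange multiplier that classical VIM must determine variationally and $\tilde{u}_n$ denotes a restricted variation. Coupling VIM with the Laplace transform, as the abstract notes, removes the need to derive $\lambda$ explicitly.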
In this research, kernel estimation methods (nonparametric density estimators) were relied upon in estimating the two-response logistic regression. A comparison was made between the Nadaraya-Watson method and the local scoring algorithm, and the optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process. It also plays a key role in smoothing the curve so that it approaches the true curve. The goal of using the kernel estimator is to modify the observations so that we can obtain estimators whose characteristics are close to the properties of the true parameters, based on medical data for patients with chronic
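The Nadaraya-Watson estimator is a locally weighted average of the responses, and cross-validation picks the bandwidth that minimizes the leave-one-out prediction error. The sketch below shows both ideas on toy one-dimensional data; it is an illustration only, with invented data, and does not reproduce the paper's two-response logistic setting or its generalized cross-validation variant.

```python
import math

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression at x0: a weighted average of the
    responses with Gaussian-kernel weights that decay with distance."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def loocv_error(h, xs, ys):
    """Leave-one-out cross-validation score used to choose the bandwidth."""
    err = 0.0
    for i in range(len(xs)):
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        err += (ys[i] - nw_estimate(xs[i], xs_i, ys_i, h)) ** 2
    return err

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [0.0, 0.4, 0.8, 0.9, 0.7, 0.3, 0.1]
best_h = min([0.2, 0.4, 0.8, 1.6], key=lambda h: loocv_error(h, xs, ys))
print(best_h, round(nw_estimate(1.2, xs, ys, best_h), 3))
```

A small bandwidth follows the data closely (low bias, high variance); a large one over-smooths, which is why the choice of λ matters as the abstract stresses.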
In this paper, the effective computational method (ECM) based on standard monomial polynomials has been implemented to solve the nonlinear Jeffery-Hamel flow problem. Moreover, novel effective computational methods have been developed and suggested in this study using suitable basis functions, namely Chebyshev, Bernstein, Legendre, and Hermite polynomials. The use of these basis functions converts the nonlinear problem into a nonlinear algebraic system of equations, which is then solved using the Mathematica®12 program. The developed effective computational methods (D-ECM) have been applied to solve the nonlinear Jeffery-Hamel flow problem, and a comparison between the methods is shown. Furthermore, the maximum
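The reduction the abstract describes, expanding the unknown in a polynomial basis and collocating the residual so the differential equation becomes a nonlinear algebraic system, can be sketched on a much simpler model problem. The example below uses a Chebyshev basis and Newton's method on u′ = −u², u(0) = 1 (exact solution 1/(1+x)); it illustrates the same reduction in principle, not the paper's ECM or the Jeffery-Hamel equations.

```python
import numpy as np

# Model problem: u'(x) = -u(x)**2, u(0) = 1 on [0, 1]; exact u = 1/(1+x).
deg = 5
n_unknowns = deg + 1
nodes = np.linspace(0.05, 1.0, deg)  # collocation points (x=0 kept for the IC)

def residual(c):
    """Residual of the IC plus the ODE at the collocation nodes, for a
    Chebyshev expansion with coefficients c: the nonlinear algebraic system."""
    u = np.polynomial.chebyshev.Chebyshev(c, domain=[0, 1])
    du = u.deriv()
    res = [u(0.0) - 1.0]                                # initial condition
    res += [du(x) + u(x) ** 2 for x in nodes]           # ODE residual
    return np.array(res)

# Newton iteration with a finite-difference Jacobian
c = np.zeros(n_unknowns)
c[0] = 1.0
for _ in range(20):
    r = residual(c)
    J = np.empty((n_unknowns, n_unknowns))
    for j in range(n_unknowns):
        dc = np.zeros(n_unknowns)
        dc[j] = 1e-7
        J[:, j] = (residual(c + dc) - r) / 1e-7
    c -= np.linalg.solve(J, r)

u = np.polynomial.chebyshev.Chebyshev(c, domain=[0, 1])
err = abs(u(1.0) - 0.5)  # compare with exact value 1/(1+1)
print(err)
```

Swapping the Chebyshev basis for Bernstein, Legendre, or Hermite polynomials changes only the expansion, which is the comparison the D-ECM study performs.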
Urbanization leads to significant changes in the properties of the land surface. It adds heat loads to the city, which threaten people's comfort and health. There is no clear understanding of the relationship between climate indicators and the features of early virtual urban design. The research focuses on simulation capability and its effect on the urban microclimate. It is assumed that adopting certain scenarios and strategies to mitigate the intensity of the Urban Heat Island (UHI) improves the local climate and reduces the impact of global warming. The aim is to review UHI simulation methods and the programs that support simulation and mitigation of the UHI effect. The UHI review has been conducted for
In this paper, Volterra Runge-Kutta methods, including methods of order two and four, are applied to general nonlinear Volterra integral equations of the second kind. Moreover, we study the convergence of the Volterra Runge-Kutta algorithms. Finally, programs for each method are written in MATLAB, and a comparison between the two types is made based on the least-squares errors.
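A Volterra integral equation of the second kind has the form u(t) = g(t) + ∫₀ᵗ K(t,s) u(s) ds. The sketch below solves it with the basic trapezoidal quadrature method, a simpler direct scheme than the Volterra Runge-Kutta methods the paper implements, but for the same equation class; the function names and test problem are ours.

```python
import math

def volterra_trapezoid(g, K, t_end, n):
    """Solve u(t) = g(t) + integral_0^t K(t,s) u(s) ds on [0, t_end]
    with n steps of the trapezoidal quadrature method."""
    h = t_end / n
    t = [i * h for i in range(n + 1)]
    u = [g(t[0])]
    for i in range(1, n + 1):
        # trapezoid rule over [0, t_i]; u_i appears on both sides, solve for it
        s = 0.5 * K(t[i], t[0]) * u[0]
        s += sum(K(t[i], t[j]) * u[j] for j in range(1, i))
        u_i = (g(t[i]) + h * s) / (1.0 - 0.5 * h * K(t[i], t[i]))
        u.append(u_i)
    return t, u

# Test problem: u(t) = 1 + integral_0^t u(s) ds has exact solution e^t
t, u = volterra_trapezoid(lambda t: 1.0, lambda t, s: 1.0, 1.0, 100)
print(abs(u[-1] - math.e))  # discretisation error at t = 1
```

Halving h should roughly quarter the error, reflecting the second-order convergence that the order-two Volterra Runge-Kutta method shares.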
In this article, we present a quasi-contraction mapping approach for the D iteration, and we prove that this iteration has the same convergence rate as the modified SP iteration. On the other hand, we prove that the D iteration approach for quasi-contraction maps is faster than certain leading current iteration methods such as Mann and Ishikawa. A numerical example is also given.
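The Mann and Ishikawa schemes the paper compares against are classical fixed-point iterations; the sketch below runs both on the contraction T(x) = cos(x), whose fixed point is x* ≈ 0.739085. The D iteration itself is not reproduced here, and the step sizes are arbitrary illustrative choices.

```python
import math

T = math.cos  # a contraction near its fixed point x* ~ 0.739085

def mann(x, alpha=0.5, steps=50):
    """Mann iteration: x_{n+1} = (1 - alpha) * x_n + alpha * T(x_n)."""
    for _ in range(steps):
        x = (1 - alpha) * x + alpha * T(x)
    return x

def ishikawa(x, alpha=0.5, beta=0.5, steps=50):
    """Ishikawa iteration: y_n = (1 - beta) * x_n + beta * T(x_n),
       then x_{n+1} = (1 - alpha) * x_n + alpha * T(y_n)."""
    for _ in range(steps):
        y = (1 - beta) * x + beta * T(x)
        x = (1 - alpha) * x + alpha * T(y)
    return x

print(round(mann(1.0), 6), round(ishikawa(1.0), 6))
```

Comparing how many steps each scheme needs to reach a given tolerance is exactly the kind of numerical experiment used to rank iteration speeds.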
In this paper, the process of compounding two distributions is discussed using a new compounding procedure that connects a number of lifetime distributions (continuous distributions), where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new lifetime distribution with three parameters. An advantage is that the failure-rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as expectation, variance, cumulative function, reliability function, and fa
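One common way such a compound lifetime arises is T = min(X₁, …, X_N) with the Xᵢ i.i.d. Weibull and N zero-truncated Poisson (e.g. a system failing at its weakest of a random number of components). The sketch below samples from that construction; note this specific min-compounding is our assumption, since the abstract does not spell out the exact scheme, and all names are ours.

```python
import math
import random

rng = random.Random(1)

def zero_truncated_poisson(lam):
    """Sample N >= 1 from Poisson(lam) conditioned on N > 0 (rejection),
    using Knuth's multiplication method for the underlying Poisson draw."""
    while True:
        n, p, limit = 0, 1.0, math.exp(-lam)
        while p > limit:
            n += 1
            p *= rng.random()
        if n - 1 > 0:
            return n - 1

def compound_lifetime(lam, shape, scale):
    """Lifetime T = min of N i.i.d. Weibull(shape, scale) components,
    with N ~ zero-truncated Poisson(lam): three parameters in total."""
    n = zero_truncated_poisson(lam)
    return min(rng.weibullvariate(scale, shape) for _ in range(n))

sample = [compound_lifetime(lam=2.0, shape=1.5, scale=1.0) for _ in range(2000)]
mean = sum(sample) / len(sample)
print(round(mean, 3))  # Monte Carlo estimate of the compound expectation
```

Monte Carlo samples like these are a quick check on closed-form expressions for the expectation, variance, and reliability function of the compounded distribution.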
The experiment aimed to compare different methods of measuring feed pellet durability through the effect of pellet die speed and particle size (mill sieve hole diameter). Feed pellet durability was measured in five different ways: direct measurement (%), pellet lengths (%), pellet water absorption (%), durability by drop-box device (%), and durability by air-pressure device (%). Three pellet die speeds (280, 300, and 320 rpm) and three mill sieve hole diameters (2, 4, and 6 mm) were used. The results showed that increasing the pellet die speed from 280 to 300 and then to 320 rpm led to a significant decrease in feed pellet durability by direct measurement, the drop-box device, and the air-pressure device, while pel