In this paper, a new technique is proposed for solving three types of linear integral equations of the second kind: linear Volterra-Fredholm integral equations (LVFIE) as the general case, and linear Volterra integral equations (LVIE) and linear Fredholm integral equations (LFIE) as special cases. The technique approximates the solution by a polynomial of a given degree, thereby reducing the problem to a linear programming problem (LPP), which is solved to obtain the approximate solution of the LVFIE. Quadrature methods, including the trapezoidal rule (TR), Simpson 1/3 rule (SR), Boole rule (BR), and Romberg integration formula (RI), are used to approximate the integrals that appear in the LVFIE, and a comparison of these methods is presented. Finally, an algorithm is proposed and applied to test examples to illustrate the effectiveness of the new technique.
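As a minimal sketch of two of the quadrature rules named above (the abstract does not reproduce the paper's own formulas, so the kernel, trial solution, and interval below are illustrative assumptions):

```python
import numpy as np

def trapezoidal(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

def simpson13(f, a, b, n):
    """Composite Simpson 1/3 rule; n must be even."""
    assert n % 2 == 0
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h / 3 * (y[0] + 4 * y[1:-1:2].sum() + 2 * y[2:-2:2].sum() + y[-1])

# Illustrative use: approximating a Fredholm-part integral
# ∫_0^1 k(x,t) u(t) dt at fixed x = 0.5, with an assumed kernel
# k(x,t) = x*t and trial solution u(t) = t**2.
approx = simpson13(lambda t: 0.5 * t * t**2, 0.0, 1.0, 10)
print(approx)  # Simpson is exact for cubics: 0.5 * (1/4) = 0.125
```

In the technique described above, such rules would replace the exact integrals when assembling the constraints of the linear programming problem.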
It is well known that the presence of outliers in the data adversely affects the efficiency of estimation and the results of the analysis. In this paper, four methods for detecting outliers in the multiple linear regression model are studied in two cases: first, on the real data; and second, after adding outliers to the data and attempting to detect them. The study is conducted for samples of different sizes and uses three measures to compare the methods: masking, swamping, and the standard error of the estimate.
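The abstract does not name the four detection methods, so as a purely illustrative sketch, the following computes internally studentized residuals, one common residual-based detection device whose failure modes are exactly the masking and swamping effects mentioned above; the data, threshold, and planted outlier are assumptions:

```python
import numpy as np

def studentized_residuals(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept
    H = X1 @ np.linalg.inv(X1.T @ X1) @ X1.T     # hat matrix
    e = y - H @ y                                # OLS residuals
    n, p = X1.shape
    s2 = e @ e / (n - p)                         # residual variance
    h = np.diag(H)                               # leverages
    return e / np.sqrt(s2 * (1 - h))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([1.0, -2.0]) + rng.normal(scale=0.5, size=50)
y[10] += 6.0                                     # plant one outlier
r = studentized_residuals(X, y)
print(np.where(np.abs(r) > 3)[0])                # flag suspects
```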
The support vector machine (SVM) is a supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including poor accuracy and long training times. In this research, SVM is extended by applying several non-linear kernel transformations: linear, polynomial, radial basis, and multilayer kernels. The non-linear SVM classification model is illustrated and summarized in an algorithm using the kernel trick. The proposed method is examined using three simulated datasets with different sample sizes.
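As a hedged sketch of comparing the kernel choices named above on a simulated dataset (the paper's own data and hyperparameters are not shown, so these are illustrative; scikit-learn's "sigmoid" kernel plays the role of the multilayer kernel here):

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Illustrative simulated dataset; size and noise level are assumptions.
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = SVC(kernel=kernel).fit(Xtr, ytr)
    print(kernel, round(clf.score(Xte, yte), 3))  # test accuracy per kernel
```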
Ordinary Least Squares (OLS) is characterized, in contrast to Maximum Likelihood (ML), by the fact that the exact moments of its estimators are known and can be computed, whereas the moments of the ML estimators are unknown; approximations to their biases, correct to O(n^-1), can nevertheless be obtained by standard methods. In this research, expressions approximating the biases of the ML estimators (the regression coefficients and the scale parameter) of the linear type-1 extreme value regression model for largest values are presented, using an approach based on the first, second, and third derivatives.
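A sketch of the model form implied by the abstract, in our own notation (the paper's exact parametrization is not shown): the type-1 extreme value (Gumbel, largest values) regression model and the log-likelihood whose successive derivatives the bias approximations require.

```latex
% Gumbel (largest values) regression model, scale sigma > 0:
y_i = x_i^{\top}\beta + \varepsilon_i, \qquad
f(\varepsilon) = \frac{1}{\sigma}
  \exp\!\left(-\frac{\varepsilon}{\sigma} - e^{-\varepsilon/\sigma}\right).

% Log-likelihood, with standardized residuals z_i:
\ell(\beta,\sigma) = -n\log\sigma - \sum_{i=1}^{n} z_i
  - \sum_{i=1}^{n} e^{-z_i},
\qquad z_i = \frac{y_i - x_i^{\top}\beta}{\sigma}.
```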
In this paper, we estimate the coefficients and the scale parameter of a linear regression model whose residuals follow the type-1 extreme value distribution for largest values; this can be regarded as a complement to existing studies based on smallest values. We study two estimation methods (OLS and MLE), resorting to the Newton-Raphson (NR) and Fisher scoring methods to obtain the ML estimates because of the difficulty of the usual closed-form approach. The relative efficiency criterion is considered alongside statistical inference procedures for the type-1 extreme value regression model for largest values, including confidence intervals and hypothesis tests for both the scale parameter and the regression coefficients.
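A minimal sketch of the MLE step for this model: the paper uses Newton-Raphson and Fisher scoring, while the sketch below hands the same negative log-likelihood to a generic quasi-Newton routine for brevity; the simulated data and true parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r  # Gumbel distribution for largest values

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, sigma_true = np.array([2.0, 0.5]), 1.5
y = X @ beta_true + gumbel_r.rvs(scale=sigma_true, size=n, random_state=1)

def negloglik(theta):
    # log-parametrize the scale so that sigma = exp(log_sigma) > 0
    beta, log_sigma = theta[:-1], theta[-1]
    return -gumbel_r.logpdf(y - X @ beta, scale=np.exp(log_sigma)).sum()

res = minimize(negloglik, x0=np.zeros(3), method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
print(beta_hat, sigma_hat)  # should recover roughly (2.0, 0.5) and 1.5
```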
In this paper, the collocation method is used to solve ordinary differential equations with retarded arguments; examples are presented to illustrate the approach.
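Since the abstract does not reproduce the paper's examples, the following is a minimal collocation sketch for an assumed retarded (delay) equation, y'(t) = -y(t - 1) on [0, 2] with history y(t) = 1 for t <= 0, approximating y by a single polynomial; the equation, degree, and collocation points are illustrative choices:

```python
import numpy as np

N = 8                               # polynomial degree (assumption)
T = 2.0
pts = np.linspace(0.05, T, N)       # collocation points; t = 0 is reserved
                                    # for the history (continuity) condition

A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0] = 1.0                       # continuity with the history: y(0) = 1
b[0] = 1.0

for i, t in enumerate(pts, start=1):
    for k in range(1, N + 1):       # derivative term: sum_k k c_k t^(k-1)
        A[i, k] += k * t ** (k - 1)
    s = t - 1.0                     # retarded argument
    if s <= 0:
        b[i] = -1.0                 # y(t-1) comes from the history function
    else:
        for k in range(N + 1):      # move +y(t-1) to the left-hand side
            A[i, k] += s ** k

c = np.linalg.solve(A, b)           # polynomial coefficients

# Compare with the exact piecewise solution on [1, 2].
t = 1.5
approx = sum(ck * t ** k for k, ck in enumerate(c))
exact = t ** 2 / 2 - 2 * t + 1.5
print(f"y(1.5): collocation {approx:.4f}, exact {exact:.4f}")
```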
This study is dedicated to solving the multicollinearity problem in the general linear model by using the ridge regression method. The basic formulation of this method, together with suggested forms of the ridge parameter, is applied to Gross Domestic Product data for Iraq; these data follow a normal distribution. The best linear regression model is obtained after solving the multicollinearity problem using the suggested values of the ridge parameter k.
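A minimal sketch of the ridge estimator underlying the method above, beta_hat(k) = (X'X + kI)^{-1} X'y; the grid of k values and the nearly collinear simulated data are assumptions, not the paper's suggested forms:

```python
import numpy as np

def ridge(X, y, k):
    """Ridge estimator: solve (X'X + kI) beta = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)     # nearly collinear regressors
X = np.column_stack([x1, x2])
y = x1 - x2 + rng.normal(scale=0.1, size=100)

for k in [0.0, 0.1, 1.0]:
    print(k, ridge(X, y, k))  # k = 0 is OLS; larger k stabilizes the estimates
```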
The purpose of building a linear regression model is to describe the real linear relation between each explanatory variable in the model and the dependent variable, on the basis that the dependent variable is a linear function of the explanatory variables, and to use the model for prediction and control. This purpose cannot come true without obtaining significant, stable, and reasonable estimators for the parameters of the model, specifically the regression coefficients. The researcher found the criterion he had suggested, "RUF", to be accurate and sufficient to accomplish that purpose when multicollinearity exists, provided that an adequate model satisfying the standard assumptions on the error term can be specified.