In this article we study estimation of the variance of a normal distribution when the mean is unknown. The proposed approach combines the unbiased estimator and the Bayes estimator of the variance through a double-stage shrunken estimator, in order to obtain higher efficiency for the variance estimator when two small samples of equal size are used.
The study of economic growth indicators is of fundamental importance in assessing the effectiveness of economic development plans, and it plays a large role in determining appropriate economic policies for the optimal use of the factors driving growth in Iraq, especially over a given period of time. Gross domestic product (GDP) at current prices is part of the national accounts, an integrated body of statistics that allows policy makers to determine whether the economy is expanding and to evaluate economic activity and its efficiency in order to gauge the size of the overall economy. The research aims
In this research we estimate the survival function for data affected by the disturbances of the Iraq Household Socio-Economic Survey (IHSES II 2012), using data for five-year age groups that follow the generalized gamma (GG) distribution. Two methods were used for estimation and fitting: the principle of maximum entropy (POME) and bootstrapping with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function, along with the traditional maximum likelihood (ML) method. The comparison was made on the basis of the method of the Cen
The research applied the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on spatial regression models that incorporate spatial dependence, whose presence can be tested with Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the statistical estimation power, as these models are the link between ordinary regression models and time-series models. Spatial analysis had
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers: the MLE loses its advantages because of the bad influence of the outliers. To address this problem, new statistical methods have been developed that are not affected by outliers; these methods possess robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimates and the strength of the estimation using the maximum weighted trimmed likelihood (MWTL). In order to perform t
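The idea behind trimmed likelihood can be sketched for the simplest case, a normal location parameter, where maximizing the trimmed likelihood reduces to minimizing the sum of squared residuals over the best subset of h observations. This is a minimal illustrative sketch on synthetic contaminated data, not the estimator of the paper:

```python
import numpy as np

def trimmed_mean_mtl(x, h):
    """Maximum trimmed likelihood estimate of a normal location:
    pick the h observations whose fit has the largest likelihood,
    i.e. the smallest sum of squared residuals. For a location
    parameter the optimal h-subset is contiguous in the sorted sample."""
    best_mu, best_ss = None, np.inf
    xs = np.sort(x)
    for i in range(len(xs) - h + 1):
        window = xs[i:i + h]
        mu = window.mean()
        ss = np.sum((window - mu) ** 2)
        if ss < best_ss:
            best_mu, best_ss = float(mu), float(ss)
    return best_mu

rng = np.random.default_rng(6)
clean = rng.normal(5.0, 1.0, 90)
outliers = rng.normal(30.0, 1.0, 10)   # 10% contamination
x = np.concatenate([clean, outliers])

print(round(float(np.mean(x)), 1))              # pulled toward the outliers
print(round(trimmed_mean_mtl(x, h=75), 1))      # stays close to 5
```

The MWTL variant discussed in the abstract would additionally weight the retained observations; the subset-selection mechanism is the same.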
There is a need to detect and investigate the causes of pollution of the marshes and to submit an accurately evaluated statistical study to the competent authorities. To achieve this goal, factor analysis was applied to a sample of marsh-water pollutants: electrical conductivity (EC), power of hydrogen (PH), temperature (T), turbidity (TU), total dissolved solids (TDS), and dissolved oxygen (DO). A sample of 44 sites was drawn and examined in the laboratories of the Iraqi Ministry of Environment, and the results were obtained using the SPSS program. The most important recommendation was to increase the pumping of addit
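The factor-extraction step described above can be sketched with the principal-component method of factor analysis. The data below are synthetic (44 sites by 6 indicators, mirroring the abstract's dimensions), not the survey measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_vars = 44, 6
# Synthetic data: two latent factors drive the six water-quality indicators.
latent = rng.normal(size=(n_sites, 2))
loadings_true = rng.normal(size=(2, n_vars))
X = latent @ loadings_true + 0.1 * rng.normal(size=(n_sites, n_vars))

# Standardize, then extract factors from the correlation matrix
# (principal-component method of factor extraction).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = np.corrcoef(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(R)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

k = 2  # number of retained factors
loadings = eigvec[:, :k] * np.sqrt(eigval[:k])
communalities = (loadings ** 2).sum(axis=1)
print(loadings.shape)   # one loading per (variable, factor) pair
```

A rotation step (e.g. varimax) would normally follow before interpreting which pollutants load on which factor.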
Examination of skewness makes academics more aware of the importance of accurate statistical analysis. Most phenomena contain a certain percentage of skewness, which gives rise to what is called asymmetry and, consequently, to the importance of the skew-normal family. The epsilon skew normal distribution ESN(μ, σ, ε) is one of the probability distributions that provides a more flexible model, because the skewness parameter makes it possible to move from a normal to a skewed distribution. Theoretically, estimating the parameters of a linear regression model whose error has a nonzero mean is a major challenge, as there is no explicit formula to calcula
This research provides insight into the spatial autoregressive quantile regression model (SARQR), which is more general than the spatial autoregressive model (SAR) and the quantile regression model (QR) because it integrates aspects of both. Since Bayesian approaches can produce reliable parameter estimates and overcome the problems of standard estimation techniques, they were used to estimate the parameters of the SARQR model, with Bayesian inference carried out via Markov chain Monte Carlo (MCMC) techniques. Several criteria were used in the comparison, such as the root mean squared error (RMSE), the mean absolute percentage error (MAPE), and the coefficient of determination (R²). The application was devoted to a dataset of poverty rates acro
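The three comparison criteria named above have standard definitions, sketched here on a small illustrative vector (the numbers are made up, not the poverty data):

```python
import numpy as np

def rmse(y, yhat):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mape(y, yhat):
    """Mean absolute percentage error (y must be nonzero)."""
    return float(np.mean(np.abs((y - yhat) / y)) * 100)

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1 - ss_res / ss_tot)

y    = np.array([10.0, 12.0, 15.0, 20.0])
yhat = np.array([11.0, 12.0, 14.0, 19.0])
print(rmse(y, yhat))   # about 0.866
```

Lower RMSE and MAPE, and higher R², indicate the better-fitting model in such a comparison.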
Time series have gained great importance and are widely applied in the economic, financial, health, and social fields, where they are used to study changes and forecast the future of a phenomenon. One of the most important black-box models is the ARMAX model, a mixed model consisting of autoregression and moving averages with exogenous inputs. Its use consists of several stages: determining the order of the model, estimating the model parameters, and then forecasting, here to determine the amount of compensation granted to workers in the future so that the Fund can meet its future obligations, using the ordinary least squares method and the frequ
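The estimation stage can be sketched for an ARX(1) model, a special case of ARMAX with no moving-average part, where ordinary least squares applies directly. The data are simulated; the parameter values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
a_true, b_true = 0.7, 2.0

# Simulate y_t = a*y_{t-1} + b*x_t + e_t  (ARX(1), exogenous input x).
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = a_true * y[t - 1] + b_true * x[t] + 0.5 * rng.standard_normal()

# Estimate (a, b) by ordinary least squares on the lagged regression.
A = np.column_stack([y[:-1], x[1:]])
coef, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
a_hat, b_hat = float(coef[0]), float(coef[1])
print(round(a_hat, 2), round(b_hat, 2))
```

A full ARMAX fit would add the moving-average terms, which require iterative (e.g. maximum likelihood) estimation rather than plain OLS.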
This book is intended as a textbook for the undergraduate course in financial statistics in the Department of Financial Sciences and Banking, designed for use in the semester system. To achieve its goals, the book is divided into the following chapters. Chapter One introduces basic concepts. Chapter Two is devoted to frequency distributions and data representation. Chapter Three discusses measures of central tendency (all types of means, the mode, and the median). Chapter Four deals with measures of dispersion (standard deviation, variance, and coefficient of variation). Chapter Five is concerned with correlation and regression analysis, while Chapter Six is concerned with testing hypotheses (one-population mean test, two "independent" populati
This book is the second edition of a textbook intended for undergraduate/postgraduate courses in mathematical statistics. To achieve its goals, the book is divided into the following chapters. Chapter One reviews events and probability. Chapter Two is devoted to random variables of both types, discrete and continuous, with definitions of the probability mass function, the probability density function, and the cumulative distribution function. Chapter Three discusses mathematical expectation and its special types, such as moments, the moment generating function, and other related topics. Chapter Four deals with some special discrete distributions: (discrete uniform, Bernoulli, binomial, Poisson, geometric, neg
This research aims to overcome the problem of dimensionality through nonparametric regression methods that reduce the root mean squared error (RMSE). The projection pursuit regression (PPR) method was used; it is one of the dimension-reduction methods that overcome the curse of dimensionality. PPR is a statistical technique concerned with finding the most important projections in multidimensional data, and with finding each projection
A binary stream cipher cryptosystem can be used to encrypt/decrypt many types of digital files, especially files that can be considered huge, such as images. To guarantee that the encryption or decryption process takes a reasonable time, the stream cipher key generator must act quickly without affecting the complexity or randomness of the output key binary sequences. In this paper, we increase the size of the output sequence from a binary to a digital sequence in the field to obtain a byte sequence; we then test this new sequence not only as a binary sequence but also as a -sequence, so the new output sequence must be tested in the new mathematical field. This is done by changing the base of the
Software-defined networking (commonly referred to as SDN) is a newer paradigm that develops the concept of a software-driven network by separating the data and control planes, and it can handle traditional network problems. However, this architecture is subject to various security threats. One of these is the distributed denial of service (DDoS) attack, which is difficult to contain in this kind of software-based network. Several security solutions have been proposed recently to secure SDN against DDoS attacks. This paper analyzes and discusses machine-learning-based systems for protecting SDN networks from DDoS attacks. The results indicate that machine learning algorithms can be used to detect DDoS
The challenge of incorporating usability evaluation values and practices into the agile development process is not only persistent but also systemic. Notable contributions by researchers have attempted to isolate and close the gaps between the two fields with the aim of developing usable software, but there is currently no reference model that specifies where and how usability activities should be considered in the agile development process. This paper proposes a model for identifying appropriate usability evaluation methods alongside the agile development process. By using this model, the development team can apply usability evaluations at the right time and in the right place to get the necessary feedback from the end user. Verificatio
Usability evaluation is a core usability activity that minimizes risks and improves product quality, and the returns from usability evaluation are undeniable. Neglecting such evaluation at the development stage negatively affects software usability. In this paper, the authors develop a software management tool used to incorporate usability evaluation activities into the agile environment. Using this tool, agile development teams can manage a continuous evaluation process, tightly coupled with the development process, allowing them to develop high-quality software products with an adequate level of usability. The tool was evaluated through verification, followed by validation of satisfaction. The evaluation resu
A stochastic process {X_k, k = 1, 2, ...} is a doubly geometric stochastic process if there exist a ratio a > 0 and a positive function h(k) > 0 such that {a^k h(k) X_k, k = 1, 2, ...} forms a renewal process; this is a generalization of the geometric stochastic process. The process is stochastically monotone and can be used to model a point process with multiple trends. In this paper, we use nonparametric methods to investigate statistical inference for doubly geometric stochastic processes. A graphical technique for determining whether a process agrees with a doubly geometric stochastic process is proposed. Further, the parameters a, b, μ, and σ² of the doubly geometric stochastic process can be estimated by using the least squares estimate for X_k a
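The least-squares idea can be sketched under the common assumption h(k) = k^b: taking logs of a^k k^b X_k gives a linear model in k and log k, so (a, b) can be recovered by regression. The data below are simulated and the parameter values illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
a_true, b_true = 1.05, 0.3
n = 200
k = np.arange(1, n + 1)

# Construct X_k so that Y_k = a^k * k^b * X_k is an i.i.d. sequence.
Y = np.exp(rng.normal(loc=1.0, scale=0.1, size=n))
X = Y / (a_true ** k * k ** b_true)

# log X_k = c - k*log(a) - b*log(k) + e_k  -> ordinary least squares.
A = np.column_stack([np.ones(n), k, np.log(k)])
coef, *_ = np.linalg.lstsq(A, np.log(X), rcond=None)
a_hat = float(np.exp(-coef[1]))
b_hat = float(-coef[2])
print(round(a_hat, 3), round(b_hat, 2))
```

The remaining parameters μ and σ² are then estimated from the sample mean and variance of the back-transformed sequence a_hat^k k^b_hat X_k.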
In this paper new methods are presented based on the differencing technique, namely the difference-based modified jackknifed generalized ridge regression estimator (DMJGR) and the difference-based generalized jackknifed ridge regression estimator (DGJR), for estimating the parameters of the linear part of a partially linear model. The nonlinear part, represented by the nonparametric function, is estimated using the Nadaraya-Watson smoother. The partially linear model estimated with these proposed methods was compared with other difference-based estimators through the MSE criterion in a simulation study.
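The differencing idea common to these estimators can be sketched in its simplest first-order form: differencing observations ordered by the nonparametric covariate nearly cancels the smooth function, leaving an approximate linear model on which a ridge-type estimator can be applied. This is a minimal sketch on simulated data, not the DMJGR/DGJR estimators themselves:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300
t = np.sort(rng.uniform(0, 1, n))          # nonparametric covariate (sorted)
x = rng.normal(size=n)                     # linear-part covariate
beta_true = 1.5
g = np.sin(2 * np.pi * t)                  # smooth nonparametric component
y = beta_true * x + g + 0.2 * rng.normal(size=n)

# First differences over sorted t: g(t_{i+1}) - g(t_i) is nearly zero,
# so dy ~ beta * dx + noise.
dy, dx = np.diff(y), np.diff(x)

# Ridge-type estimate on the differenced data (k is an illustrative constant).
k = 0.1
beta_hat = float(np.sum(dx * dy) / (np.sum(dx ** 2) + k))
print(round(beta_hat, 2))
```

After beta is estimated, the nonparametric part can be recovered by smoothing the partial residuals y - beta_hat * x against t, e.g. with the Nadaraya-Watson smoother mentioned in the abstract.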
The purpose of this paper is to model and forecast white oil over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil have significant long memory in volatility, fractional GARCH models are estimated for the return series, and the mean and volatility are forecast by quasi-maximum likelihood (QML) as the traditional method, while the competing approach is machine learning using support vector regression (SVR). Results showed that the most appropriate model among many others for forecasting volatility is chosen according to the lowest values of the Akaike and Schwarz information criteria, together with the significance of the parameters. In addition, the residuals
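The volatility-filtering and forecasting mechanics can be sketched for a plain GARCH(1,1), the simplest member of the GARCH class (the fractional models of the paper add long memory on top of this recursion). Parameters and data are simulated, not the white-oil series:

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Filter conditional variances of a GARCH(1,1):
    sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)      # a common initialization choice
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(2)
omega, alpha, beta = 0.05, 0.1, 0.85   # illustrative parameter values
n = 500
r = np.empty(n)
s2 = omega / (1 - alpha - beta)        # unconditional variance
for t in range(n):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega + alpha * r[t] ** 2 + beta * s2

sigma2_hat = garch11_variance(r, omega, alpha, beta)
# One-step-ahead variance forecast from the filtered state:
forecast = float(omega + alpha * r[-1] ** 2 + beta * sigma2_hat[-1])
print(round(forecast, 3))
```

In QML estimation the parameters (omega, alpha, beta) would be chosen to maximize the Gaussian likelihood built from these filtered variances; the SVR competitor instead learns the variance mapping directly from lagged squared returns.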
This paper studies two stratified quantile regression models, of the marginal and conditional varieties. We estimate the quantile functions of these models using two nonparametric methods: smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective functions and using the varying-coefficient model approach. The main goal is to compare the estimators of the two nonparametric methods and to adopt the better of the two.
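The Nadaraya-Watson estimator used above has a simple closed form: a kernel-weighted average of the responses. A minimal sketch on simulated data (Gaussian kernel; the bandwidth value is illustrative):

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Kernel regression estimate of m(x0) with a Gaussian kernel
    and bandwidth h: a weighted average of y with weights K((x0-x)/h)."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return float(np.sum(w * y) / np.sum(w))

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 2 * np.pi, 300))
y = np.sin(x) + 0.1 * rng.normal(size=300)

m_hat = nadaraya_watson(np.pi / 2, x, y, h=0.3)
print(round(m_hat, 2))   # close to sin(pi/2) = 1
```

For quantile regression, the same kernel weights are used but the weighted mean is replaced by a weighted check-function (pinball-loss) minimization at each point.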
In this paper, we build a fuzzy classification system for classifying the nutritional status of children under five years old in Iraq using the Mamdani method, with input variables such as weight and height determining the nutritional status of the child. Classifying nutritional status is a difficult challenge in the medical field because of uncertainty and ambiguity in the variables and attributes that define the nutritional status categories for children. These categories are relied upon in medical diagnosis to determine the types of malnutrition problems, to identify the groups suffering from malnutrition, and to determine the risks faced by each group of children. Malnutrition in children is one of the most
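A Mamdani system combines fuzzification of the inputs, rule firing by min, aggregation by max, and centroid defuzzification. The sketch below uses a single hypothetical z-score-like input and two made-up rules purely to show the mechanics; the membership functions, rules, and scales are not those of the paper:

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Output universe: a hypothetical 0-10 "malnutrition risk" scale.
out = np.linspace(0, 10, 1001)

def classify(score):
    # Fuzzify the input (hypothetical sets over a z-score-like index).
    low    = float(trimf(np.array([score]), -4.0, -3.0, -2.0)[0])  # malnourished
    normal = float(trimf(np.array([score]), -2.0,  0.0,  2.0)[0])
    # Mamdani rule firing: clip each output set at its rule strength.
    high_risk = np.minimum(trimf(out, 6.0, 8.0, 10.0), low)
    low_risk  = np.minimum(trimf(out, 0.0, 2.0, 4.0), normal)
    # Aggregate by max, then defuzzify by centroid.
    agg = np.maximum(high_risk, low_risk)
    return float(np.sum(agg * out) / np.sum(agg))

print(round(classify(-3.0), 1))  # falls in the high-risk region
print(round(classify(0.0), 1))   # falls in the low-risk region
```

A real system would use weight-for-height reference tables and several overlapping input sets per variable, but the inference pipeline is the same.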
Segmented regression consists of several segments separated by different change points, so heterogeneity arises from the separation of the segments within the research sample. This research is concerned with estimating the location of the change point between segments and estimating the model parameters, proposing a robust estimation method and comparing it with some of the methods used in segmented linear regression. One of the traditional methods (Muggeo's method) was used to find the maximum likelihood estimators in the
Multilocus haplotype analysis of candidate variants with genome-wide association study (GWAS) data may provide evidence of association with disease even when the individual loci themselves do not. Unfortunately, when a large number of candidate variants are investigated, identifying risk haplotypes can be very difficult. To meet this challenge, a number of approaches have been put forward in recent years; however, most of them are not directly linked to the disease penetrances of the haplotypes and thus may not be efficient. To fill this gap, we propose a mixture-model-based approach for detecting risk haplotypes, under which haplotypes are clustered directly according to their estimated d
Use of least squares and restricted least squares in estimating the first-order autoregressive parameter AR(1) (a simulation study)
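The estimation problem in this title can be sketched as follows: simulate an AR(1) series and estimate its parameter by ordinary least squares, with a simple restricted variant that constrains the estimate to the stationary region. The restriction used here is only an illustrative stand-in for the restricted estimator of the study:

```python
import numpy as np

rng = np.random.default_rng(4)
phi_true, n = 0.6, 1000

# Simulate AR(1): x_t = phi * x_{t-1} + e_t
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.standard_normal()

# OLS estimator of phi: regress x_t on x_{t-1}.
phi_ols = float(np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2))

# Restricted variant: clip the estimate to the stationary region |phi| < 1.
phi_restricted = float(np.clip(phi_ols, -0.99, 0.99))
print(round(phi_ols, 2))
```

Repeating this over many simulated series and comparing mean squared errors is the usual design of such a simulation study.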
In this research, a multilevel model (the partial pooling model) was studied; it is one of the most widely used models for analyzing data in which the observations take a hierarchical or structural form. Partial pooling models were used, and their fixed and random parameters were estimated using the full maximum likelihood (FML) method. A comparison of the models was carried out in the applied part, which included the
Estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts joined by threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both the model and the threshold point estimation; however, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
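Threshold point estimation in a two-phase model can be sketched by profiling: for each candidate threshold on a grid, fit the segmented model by least squares (equivalent to the Gaussian MLE) and keep the threshold with the smallest residual sum of squares. Simulated data, illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200
x = np.sort(rng.uniform(0, 10, n))
tau_true = 6.0
# Two-phase regression: the slope changes by 2.0 at the threshold tau.
y = 1.0 + 0.5 * x + 2.0 * np.maximum(x - tau_true, 0) + 0.3 * rng.normal(size=n)

def sse_at(tau):
    """Profile SSE: fit the segmented model with the threshold fixed at tau."""
    A = np.column_stack([np.ones(n), x, np.maximum(x - tau, 0)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return float(r @ r)

grid = np.linspace(1.0, 9.0, 161)
tau_hat = float(grid[np.argmin([sse_at(t) for t in grid])])
print(round(tau_hat, 1))
```

A robust alternative of the kind the abstract motivates would replace the squared-error criterion inside `sse_at` with a bounded or trimmed loss, making the profiled threshold resistant to outliers.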
The repeated measurement design is called a complete randomized block design for repeated measurement when the subject is given all the different treatments; in this case the subject is considered a block. Many nonparametric methods have been considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner & Robinson test (1988), when the assumption of normally distributed data is not satisfied, as well as the F test when the assumptions of the analysis of variance are satisfied and the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods, as well as to present the suggested
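The Friedman test discussed in the last abstract ranks observations within each block and compares the treatment rank sums. A minimal sketch on simulated blocked data (continuous responses, so ties are ignored; the effect sizes are made up):

```python
import numpy as np

def friedman_statistic(data):
    """Friedman chi-square for an (n blocks) x (k treatments) matrix:
    chi2 = 12/(n*k*(k+1)) * sum(R_j^2) - 3*n*(k+1),
    where R_j is the rank sum of treatment j over blocks."""
    n, k = data.shape
    ranks = data.argsort(axis=1).argsort(axis=1) + 1   # within-block ranks
    Rj = ranks.sum(axis=0)
    return 12.0 / (n * k * (k + 1)) * float(np.sum(Rj ** 2)) - 3.0 * n * (k + 1)

rng = np.random.default_rng(5)
n_blocks, k_treat = 15, 3
base = rng.normal(size=(n_blocks, 1))        # block (subject) effect
effect = np.array([0.0, 0.0, 1.5])           # treatment 3 is shifted up
data = base + effect + 0.5 * rng.normal(size=(n_blocks, k_treat))

stat = friedman_statistic(data)
print(round(stat, 2))   # compare with the chi2(k-1=2) critical value 5.99
```

Under the null hypothesis the statistic is approximately chi-square with k-1 degrees of freedom; the strong shift of treatment 3 here pushes it well past the 5% critical value.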