In this research, fuzzy nonparametric methods based on several smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Soft Drinks Company for the period 1/1/2016-31/12/2016. A sample of 148 observations was obtained in order to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. By comparing the goodness-of-fit (G.O.F.) criterion across the three techniques, we find that the lowest value of this criterion was obtained by the K-Nearest Neighbor smoother with a Gaussian kernel function.
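As an illustration of the smoothing step, the sketch below implements a Gaussian-weighted k-nearest-neighbour smoother in Python. The bandwidth rule, the toy data, and the use of the residual sum of squares as a simple goodness-of-fit measure are assumptions for illustration; the sketch does not reproduce the paper's fuzzy estimator.

```python
import numpy as np

def knn_gaussian_smoother(x, y, x0, k=10):
    """Estimate E[y | x = x0] with Gaussian weights restricted
    to the k nearest neighbours of x0 (illustrative sketch)."""
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]                 # k nearest neighbours
    h = d[idx].max() or 1e-12               # adaptive bandwidth = k-th distance
    w = np.exp(-0.5 * (d[idx] / h) ** 2)    # Gaussian kernel weights
    return np.sum(w * y[idx]) / np.sum(w)

# toy example standing in for a price/traded-value relationship
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 148)
y = 2.0 * np.sin(x) + rng.normal(0, 0.3, x.size)
fitted = np.array([knn_gaussian_smoother(x, y, xi, k=10) for xi in x])
gof = np.sum((y - fitted) ** 2)             # simple goodness-of-fit criterion
print(f"residual sum of squares: {gof:.3f}")
```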
It is well known that the existence of outliers in the data adversely affects the efficiency of estimation and the results of the study. In this paper, four methods for detecting outliers in the multiple linear regression model are studied in two cases: first, on the real data; and second, after adding outliers to the data and attempting to detect them. The study is conducted for samples of different sizes and uses three measures for comparing these methods: masking, swamping, and the standard error of the estimate.
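The paper's four detection methods are not named in the abstract, so the sketch below only illustrates one common diagnostic: flagging observations whose externally studentized residuals exceed a cutoff in a multiple linear regression. The simulated data, the contamination scheme, and the cutoff of 2.5 are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(0, 1, n)
y[:3] += 8.0                        # contaminate the first three observations

model = sm.OLS(y, sm.add_constant(X)).fit()
diag = model.outlier_test()         # externally studentized residuals + p-values
flagged = np.where(np.abs(diag["student_resid"]) > 2.5)[0]
print("flagged observations:", flagged)
print("residual standard error:", np.sqrt(model.scale))
```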
In this paper, two ranking functions are employed to treat the fuzzy multiple objective (FMO) programming model, together with two kinds of membership function: the first is the ordinary trapezoidal fuzzy (TF) membership function, and the second is the weighted trapezoidal fuzzy membership function. When the objective function is fuzzy, the fuzzy model must be transformed and reduced to a conventional crisp model; finally, these models are solved to determine which one performs better.
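As a small illustration of the ingredients named above, the sketch below defines a trapezoidal membership function and one common ranking (defuzzification) function, a weighted average of the four defining points. The paper's two specific ranking functions and its weighting scheme are not given in the abstract, so these particular formulas are assumptions.

```python
def trapezoidal_membership(x, a, b, c, d):
    """Membership degree of x in the trapezoidal fuzzy number (a, b, c, d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if a < x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)            # c < x < d

def rank_trapezoidal(a, b, c, d, weights=(1, 1, 1, 1)):
    """Ranking function: weighted average of the four defining points.
    With equal weights this reduces to the ordinary (a + b + c + d) / 4."""
    w = weights
    return (w[0] * a + w[1] * b + w[2] * c + w[3] * d) / sum(w)

# compare two fuzzy objective coefficients through their crisp ranks
print(rank_trapezoidal(1, 2, 3, 4))                 # ordinary ranking
print(rank_trapezoidal(1, 2, 3, 4, (1, 2, 2, 1)))   # weighted ranking
```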
The purpose of this paper is to define fuzzy subspaces of a fuzzy space of orderings, and we prove some results about this definition which lead to a number of new results on fuzzy spaces of orderings. We also define the sum and product over such spaces such that: if f = < a1, …, an > and g = < b1, …, bm >, their sum and product are f + g = < a1, …, an, b1, …, bm > and f × g = < a1b1, …, a1bm, …, anb1, …, anbm >.
In this paper, we derive an estimator of the reliability function of the two-parameter Laplace distribution using the Bayes method with a squared error loss function, Jeffreys' formula, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator compared with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
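The derived Bayes estimator itself is not reproduced in the abstract, so the sketch below only illustrates the Monte Carlo comparison framework: the reliability function R(t) = P(X > t) of a Laplace(μ, b) distribution is estimated by the maximum likelihood and moment plug-in estimators and their mean squared errors are compared. The parameter values, sample size, and evaluation point t are assumptions; the paper's Bayes estimator would be inserted as a third candidate in the same loop.

```python
import numpy as np

def laplace_reliability(t, mu, b):
    """R(t) = P(X > t) for a Laplace(mu, b) distribution."""
    if t >= mu:
        return 0.5 * np.exp(-(t - mu) / b)
    return 1.0 - 0.5 * np.exp(-(mu - t) / b)

def mle_estimates(x):
    mu_hat = np.median(x)                    # MLE of the location parameter
    b_hat = np.mean(np.abs(x - mu_hat))      # MLE of the scale parameter
    return mu_hat, b_hat

def moment_estimates(x):
    mu_hat = np.mean(x)                      # moment estimator of location
    b_hat = np.sqrt(np.var(x) / 2.0)         # Var(X) = 2 b^2
    return mu_hat, b_hat

def monte_carlo_mse(n=30, mu=0.0, b=1.0, t=1.0, reps=5000, seed=2):
    rng = np.random.default_rng(seed)
    true_r = laplace_reliability(t, mu, b)
    err_mle, err_mom = [], []
    for _ in range(reps):
        x = rng.laplace(mu, b, n)
        err_mle.append(laplace_reliability(t, *mle_estimates(x)) - true_r)
        err_mom.append(laplace_reliability(t, *moment_estimates(x)) - true_r)
    return np.mean(np.square(err_mle)), np.mean(np.square(err_mom))

print("MSE (MLE, moments):", monte_carlo_mse(n=30))
```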
Most statistical research relies on studying the behaviour of different phenomena during specific time periods and on using the results of these studies to develop appropriate recommendations and support decision-making, in particular for statistical inference on the parameters of the statistical distribution of lifetimes. The technical staff in the research units of most manufacturing companies deal with censored data. The main objective of survival analysis is to provide information that forms the basis for decision-making; the problem must first be clarified, followed by the goals and limitations of the study, and it may have different possibilities to perform the
The behavior and shear strength of full-scale T-section reinforced concrete deep beams with various large web openings, designed according to the strut-and-tie approach of the ACI 318-19 code specifications, were investigated in this paper. A total of 7 deep beam specimens with identical shear span-to-depth ratios were tested under a mid-span concentrated load applied monotonically until beam failure. The main variables studied were the effects of the width and depth of the web openings on deep beam performance. The experimental results were calibrated against the strut-and-tie approach adopted by the ACI 318-19 code for the design of deep beams. The strut-and-tie design model provided in the ACI 318-19 code provisions was assessed and found to be u
This research presents a study, with an application, of Principal Component Regression, in which principal components obtained from some of the explanatory variables are used to mitigate the multicollinearity problem among these variables and to obtain more stable estimates than those yielded by Ordinary Least Squares. The cost paid, on the other hand, is a small loss in the predictive power of the estimated regression function in explaining the essential variation. A numerical formula has been proposed and applied by the researchers as an optimal solution, and its efficiency was verified by a program written by the researchers themselves for this purpose, using several criteria: Cumulative Percentage of Variance, Coefficient of Determination, Variance
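As an illustration of the idea, the sketch below fits a principal component regression with scikit-learn, keeping the components that explain a chosen cumulative percentage of variance and comparing the fit with ordinary least squares. The simulated collinear data, the 95% threshold, and the use of scikit-learn are assumptions, not the researchers' own program or criteria.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 100
z = rng.normal(size=n)
# three highly collinear explanatory variables driven by a common factor
X = np.column_stack([z + 0.05 * rng.normal(size=n) for _ in range(3)])
y = 2.0 * z + rng.normal(0, 1, n)

# principal component regression: keep components explaining 95% of the variance
pcr = make_pipeline(StandardScaler(), PCA(n_components=0.95), LinearRegression())
pcr.fit(X, y)

ols = LinearRegression().fit(X, y)
print("components kept:", pcr.named_steps["pca"].n_components_)
print("PCR R^2:", pcr.score(X, y), " OLS R^2:", ols.score(X, y))
```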
In the current worldwide health crisis produced by coronavirus disease (COVID-19), researchers and medical specialists began looking for new ways to tackle the epidemic. According to recent studies, Machine Learning (ML) has been effectively deployed in the health sector. Medical imaging sources (radiography and computed tomography) have aided the development of artificial intelligence (AI) strategies to tackle the coronavirus outbreak. As a result, a classical machine learning approach for coronavirus detection from Computerized Tomography (CT) images was developed. In this study, a convolutional neural network (CNN) model is used for feature extraction and a support vector machine (SVM) for the classification of axial
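The study's exact CNN architecture and training details are not given in the abstract; the sketch below only illustrates the general pipeline, using a torchvision ResNet-18 as a fixed feature extractor and scikit-learn's SVM as the classifier on the extracted features. The model choice, image size, and random placeholder data are assumptions.

```python
import numpy as np
import torch
from torchvision.models import resnet18
from sklearn.svm import SVC

# CNN used as a fixed feature extractor (final classification layer removed);
# in practice, pretrained or fine-tuned weights would be loaded here
cnn = resnet18(weights=None)
cnn.fc = torch.nn.Identity()
cnn.eval()

def extract_features(images):
    """images: float tensor of shape (N, 3, 224, 224), already preprocessed."""
    with torch.no_grad():
        return cnn(images).numpy()

# placeholder data standing in for preprocessed axial CT slices and labels
images = torch.randn(16, 3, 224, 224)
labels = np.random.randint(0, 2, 16)          # 0 = non-COVID, 1 = COVID

features = extract_features(images)
svm = SVC(kernel="rbf").fit(features, labels)
print("training accuracy:", svm.score(features, labels))
```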
Symmetric cryptography forms the backbone of secure data communication and storage by relying on the strength and randomness of cryptographic keys; this increases complexity, enhances the overall robustness of cryptographic systems, and provides immunity to various attacks. The present work proposes a hybrid model based on the Latin square matrix (LSM) and subtractive random number generator (SRNG) algorithms for producing random keys. The hybrid model enhances the security of the cipher key against different attacks and increases the degree of diffusion. Different key lengths can also be generated by the algorithm without compromising security. It comprises two phases. The first phase generates a seed value that depends on producing a rand
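The paper's two-phase construction is not fully specified in the abstract, so the sketch below only illustrates the two named ingredients: a cyclic Latin square matrix and a Knuth-style subtractive random number generator, combined to produce a key of arbitrary length. The state initialization and the way the two parts are mixed here are assumptions, not the authors' scheme.

```python
def latin_square(n):
    """Cyclic n x n Latin square: row i is a rotation of 0..n-1 by i."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

def subtractive_rng(seed, count, modulus=256):
    """Knuth-style subtractive generator: lagged differences mod `modulus`.
    The simple state fill below is an illustrative assumption."""
    state = [(seed * (i + 1) + i * i) % modulus for i in range(55)]
    out = []
    i, j = 0, 31
    for _ in range(count):
        val = (state[i] - state[j]) % modulus
        state[i] = val
        out.append(val)
        i, j = (i + 1) % 55, (j + 1) % 55
    return out

def generate_key(seed, length, n=16):
    """Mix subtractive-generator output with a Latin square to build a key."""
    square = latin_square(n)
    stream = subtractive_rng(seed, length)
    key = [(stream[k] + square[k % n][stream[k] % n]) % 256 for k in range(length)]
    return bytes(key)

print(generate_key(seed=12345, length=32).hex())
```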