This paper estimates the two scale parameters of the Exponential-Rayleigh (ER) distribution for singly type-I censored data, one of the most important forms of right-censored data, using the maximum likelihood estimation method (MLE), one of the most popular and widely used classical methods, with an iterative procedure such as Newton-Raphson to find the estimated values of the two scale parameters. Real COVID-19 data were used, obtained from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The study covered the period from 4/5/2020 until 31/8/2020, equivalent to 120 days. The number of patients who entered the hospital (the study sample) was n = 785; of these, m = 88 patients died during the study period and n - m = 697 survived. The Chi-square test, one of the most important non-parametric tests, was then used to determine whether the data conform to the ER distribution. Finally, after estimating the ER parameters for the singly type-I censored data, the survival function, hazard function, and probability density function were computed.
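The censored MLE procedure above can be sketched in a few lines. This is a minimal illustration only: it assumes the common linear-hazard ("exponential plus Rayleigh") parameterization h(x) = λ + θx with cumulative hazard H(x) = λx + θx²/2, which may differ from the paper's exact ER form, and it runs on simulated data, not the hospital records. Starting values and tolerances are likewise illustrative.

```python
import numpy as np

def loglik(p, x_obs, T, n_cens):
    """Type-I censored log-likelihood, assuming hazard h(x) = lam + theta*x."""
    lam, theta = p
    if lam <= 0 or theta <= 0:
        return -np.inf                      # keep the search in the valid region
    h = lam + theta * x_obs
    Hx = lam * x_obs + 0.5 * theta * x_obs**2
    return np.sum(np.log(h)) - np.sum(Hx) - n_cens * (lam * T + 0.5 * theta * T**2)

def grad_hess(p, x_obs, T, n_cens):
    """Analytic gradient and Hessian of the censored log-likelihood."""
    lam, theta = p
    h = lam + theta * x_obs
    g = np.array([np.sum(1.0 / h) - np.sum(x_obs) - n_cens * T,
                  np.sum(x_obs / h) - 0.5 * np.sum(x_obs**2) - 0.5 * n_cens * T**2])
    H = -np.array([[np.sum(1.0 / h**2),   np.sum(x_obs / h**2)],
                   [np.sum(x_obs / h**2), np.sum(x_obs**2 / h**2)]])
    return g, H

def newton_raphson_mle(x_obs, T, n_cens, init=(0.1, 0.1), tol=1e-10, max_iter=200):
    """Damped Newton-Raphson: halve the step until the likelihood improves."""
    p = np.array(init, dtype=float)
    for _ in range(max_iter):
        g, H = grad_hess(p, x_obs, T, n_cens)
        step = np.linalg.solve(H, g)
        t = 1.0
        while loglik(p - t * step, x_obs, T, n_cens) < loglik(p, x_obs, T, n_cens) and t > 1e-8:
            t *= 0.5
        p_new = p - t * step
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p

# Illustrative run on simulated lifetimes (NOT the hospital data), censored at T = 3.
rng = np.random.default_rng(42)
lam_true, theta_true, T = 0.5, 0.2, 3.0
e = rng.exponential(size=5000)              # H(x) ~ Exp(1), inverted for x
x = (-lam_true + np.sqrt(lam_true**2 + 2.0 * theta_true * e)) / theta_true
x_obs, n_cens = x[x <= T], int(np.sum(x > T))
lam_hat, theta_hat = newton_raphson_mle(x_obs, T, n_cens)
```

The damping (step halving) guards against Newton-Raphson overshooting into the region where the hazard would be negative, a standard safeguard for this kind of iteration.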
The aim of this research was to estimate the production function in order to measure returns to scale and the distributional efficiency of the resources used in wheat production. Cross-sectional data from a random sample of 130 farmers in Dhi Qar Province were used. The quantitative analysis showed that the double-logarithmic form was the best estimated model according to economic and statistical indicators. However, that form suffered from heteroscedasticity and autocorrelation, so the robust regression technique was chosen. The value of returns to scale was 0.89, indicating decreasing returns to scale, which means the production function is in its second stage. The results of the dist...
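The returns-to-scale figure comes from summing the estimated output elasticities of the double-logarithmic (Cobb-Douglas) form. A minimal sketch on synthetic, noiseless data: the elasticities 0.50 and 0.39 are chosen only so that they sum to 0.89, matching the reported value, and are not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 130                                      # same sample size as the study
labor = rng.uniform(10.0, 100.0, n)          # invented input series
capital = rng.uniform(5.0, 50.0, n)
output = 2.0 * labor**0.50 * capital**0.39   # elasticities sum to 0.89 by construction

# Double-log (Cobb-Douglas) form: ln Q = b0 + b1*ln L + b2*ln K
X = np.column_stack([np.ones(n), np.log(labor), np.log(capital)])
beta, *_ = np.linalg.lstsq(X, np.log(output), rcond=None)
rts = beta[1] + beta[2]                      # returns to scale = sum of elasticities
```

A value of `rts` below 1 indicates decreasing returns to scale, exactly the reading given to the 0.89 estimate above.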
This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be addressed directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for dimension reduction. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear...
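Of the two reduction routes mentioned, PCA can be sketched in a few lines (SIR/WSIR requires the response-slicing machinery and is omitted here). The data below are random and purely illustrative.

```python
import numpy as np

def pca_reduce(X, k):
    """Project X onto its top-k principal components via a centered SVD."""
    Xc = X - X.mean(axis=0)                 # PCA operates on centered data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]            # scores and component directions

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))              # illustrative high-dimensional sample
Z, comps = pca_reduce(X, 3)                 # reduce 10 columns to 3 linear combinations
```

Each retained component is a linear combination of all original explanatory variables, which is the property the abstract attributes to both PCA and SIR.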
Today, cloud computing plays a very prominent role in our day-to-day lives. The cloud computing paradigm makes it possible to provide resources on demand, and it has changed the way organizations manage resources owing to its robustness, low cost, and pervasive nature. Data security is usually realized through methods such as encryption; however, data privacy is another important challenge that must be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system and share it with others, causing system information leakage. Security policies are also considered to be int...
There has been growing interest in recent years in the use of chaotic techniques for enabling secure communication. This need has been motivated by the emergence of a number of wireless services that require the channel to provide low bit error rates (BER) along with information security, since eavesdropping activity aims to steal or distort the information being conveyed. Optical wireless systems (essentially free-space optic systems, FSO) are no exception to this trend. Thus, there is an urgent need to design techniques that can secure privileged information against unauthorized eavesdroppers while simultaneously protecting it against channel-induced perturbations and errors. Conventional cryptographic techniques are not designed...
The main objective of this research is to design and select a composite plate for fabricating the wing skins of a light unmanned aerial vehicle (UAV). Mechanical properties, weight, and cost are the basic criteria for this selection. Fiber volume fraction, fillers, and fiber type, with three levels each, were considered to optimize the composite-plate selection. The finite element method was used to investigate the stress distribution on the wing at the cruise flight condition and to estimate the maximum stress. An experimental plan was designed to obtain the data on the basis of the Taguchi technique, and the most effective process parameters were to be found by employing L9...
In this paper, Nordhaus-Gaddum type relations on the open support independence number of some derived graphs of path-related graphs under addition and multiplication are studied.
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals engaged in the early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules in the dataset, which comprises a large number of attributes. Methods: Data mining techniques handle redundant or simply irrelevant attributes in order to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using...
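OneR, as used above for attribute evaluation, builds one rule per attribute (each attribute value maps to its majority class) and keeps the attribute whose rule makes the fewest errors. A minimal re-implementation sketch, not Weka's actual code; the toy records below are hypothetical.

```python
from collections import Counter, defaultdict

def one_r(rows, labels):
    """Return (attribute index, rule dict, error count) for the single
    attribute whose value -> majority-class rule misclassifies least."""
    best = None
    for a in range(len(rows[0])):
        counts = defaultdict(Counter)
        for row, y in zip(rows, labels):
            counts[row[a]][y] += 1          # class frequencies per attribute value
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        errors = sum(rule[row[a]] != y for row, y in zip(rows, labels))
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best

# Hypothetical toy records: the second attribute predicts the class perfectly.
rows = [("sunny", "hot"), ("sunny", "cold"), ("rain", "hot"), ("rain", "cold")]
labels = ["yes", "no", "yes", "no"]
attr, rule, errors = one_r(rows, labels)
```

Ranking attributes by their OneR error in this way is how the classifier serves as a simple measure of attribute worth relative to the class value.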
Atenolol was used with ammonium molybdate to demonstrate the efficiency, reliability, and repeatability of the long-distance chasing photometer (NAG-ADF-300-2) using continuous flow injection analysis. The method is based on the reaction between atenolol and ammonium molybdate in an aqueous medium to obtain a dark brown precipitate. Optimum parameters were studied to increase the sensitivity of the developed method. The linear range of the calibration graph was 0.1-3.5 mmol/L for cell A and 0.3-3.5 mmol/L for cell B, with an LOD of 133.1680 ng/100 µL and 532.6720 ng/100 µL for cell A and cell B respectively, a correlation coefficient (r) of 0.9910 for cell A and 0.9901 for cell B, and an RSD% lower than 1% (n = 8) for the determination of ate...
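The calibration-graph statistics reported (a straight-line fit over the linear range and a correlation coefficient r) follow from an ordinary least-squares fit of response against concentration. A generic sketch with invented, noise-free responses; the slope 0.42 and intercept 0.03 are arbitrary values, not the paper's.

```python
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])  # mmol/L standards
resp = 0.42 * conc + 0.03        # invented, noise-free instrument responses

slope, intercept = np.polyfit(conc, resp, 1)   # least-squares calibration line
r = np.corrcoef(conc, resp)[0, 1]              # correlation coefficient of the graph
```

With real, noisy responses r falls slightly below 1, as in the 0.9910 and 0.9901 values quoted above.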
Diabetes mellitus type 2 (T2DM) is a chronic and progressive condition that affects people all around the world; the risk of complications increases with age if the disease is not managed properly. Diabetic neuropathy is caused by excessive blood glucose and lipid levels, resulting in nerve damage. Apelin is a peptide hormone found in different human organs, including the central nervous system and adipose tissue. The aim of this study is to estimate apelin levels in Iraqi patients with type 2 diabetes and with diabetic peripheral neuropathy (DPN), and to show the extent of peripheral nerve damage. The current study included 120 participants: 40 patients with diabetes mellitus, 40 patients with diabetic peripheral neuropathy, and 40 healthy...
Unconfined compressive strength (UCS) is the most critical geomechanical rock property, widely used as an input parameter for designing fractures, analyzing wellbore stability, planning drilling programs, and carrying out various petroleum engineering projects. UCS governs rock deformation by measuring the rock's strength and load-bearing capacity. Determining UCS in the laboratory is a time-consuming and costly process. The current study aims to develop empirical equations that predict UCS from well-log data using regression analysis in JMP software, for the Khasib Formation in the Buzurgan oil fields in southeastern Iraq. The accuracy of the proposed equations was tested using the coefficient of determination (R²) and the average absolute...
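The coefficient of determination R² used to test such empirical equations is computed as 1 - SS_res/SS_tot around the fitted regression. A sketch with synthetic well-log data: the sonic-to-UCS relation and the noise level here are invented for illustration and are not the Khasib Formation correlations.

```python
import numpy as np

rng = np.random.default_rng(7)
dt = rng.uniform(60.0, 110.0, 50)                  # sonic transit time (invented units/values)
ucs = 180.0 - 1.2 * dt + rng.normal(0.0, 2.0, 50)  # hypothetical UCS relation, MPa

coef = np.polyfit(dt, ucs, 1)                      # empirical straight-line model
pred = np.polyval(coef, dt)
r2 = 1.0 - np.sum((ucs - pred)**2) / np.sum((ucs - ucs.mean())**2)
```

An R² close to 1 indicates the empirical equation reproduces the measured UCS well, which is the criterion the study applies to its proposed equations.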