Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators of the hazard function are introduced, based on a nonparametric method, namely the kernel function for type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Four boundary kernels are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all of them. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth performs best for all types of boundary kernel, and that the 2xRectangle and 2xEpanechnikov methods give the best results compared with the other estimators.
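The paper's boundary corrections and bandwidth-selection rules are not reproduced here, but the core idea — smoothing the Nelson-Aalen hazard increments with a kernel — can be sketched in a few lines. This is an illustrative sketch only, assuming the Epanechnikov kernel at an interior point and a fixed (global) bandwidth; the proposed kernel and the local-bandwidth variants from the study are not shown.

```python
def epanechnikov(u):
    """Epanechnikov kernel: 0.75*(1 - u^2) on |u| <= 1, else 0."""
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kernel_hazard(t, times, events, bandwidth):
    """Smoothed Nelson-Aalen hazard estimate at time t.

    times: observation times sorted ascending;
    events: 1 = observed failure, 0 = censored (e.g. type I censoring).
    """
    n = len(times)
    h = 0.0
    for i, (ti, di) in enumerate(zip(times, events)):
        if di:  # only observed failures contribute hazard mass
            at_risk = n - i  # subjects still under observation just before ti
            h += epanechnikov((t - ti) / bandwidth) / (bandwidth * at_risk)
    return h
```

A local-bandwidth version would simply make `bandwidth` a function of `t` (or of the nearest observations) instead of a constant.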

Publication Date
Sun Aug 30 2020
Journal Name
Periodicals of Engineering and Natural Sciences (PEN)
Kernel estimation of returns of retirement funds of employers based on monetary earnings (subscriptions and compensation) via regression discontinuity in Iraq

Regression Discontinuity (RD) is a design that exposes a defined group to the effect of a treatment. Its distinguishing feature is that the study population is classified into two groups based on a specific threshold, or cutoff point, determined in advance according to the terms and requirements of the study. Attention was therefore focused on finding a solution to the problem of workers' retirement, proposing a scenario in which an end-of-service reward is granted to fill the gap (discontinuity point) where it had not been granted. The regression discontinuity method has been used to study and estimate the effect of the end-of-service reward on the cutoff of insured workers as well as t…
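A sharp RD estimate is simply the jump between two side-specific fits evaluated at the cutoff. The sketch below is illustrative only — it uses plain OLS lines on each side rather than the local-polynomial fits usually preferred in practice, and the data in the test are hypothetical, not the study's pension records.

```python
def ols_line(xs, ys):
    """Simple least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return my - b * mx, b

def sharp_rd_effect(x, y, cutoff):
    """Sharp RD: fit a line on each side of the cutoff and take the jump
    in the two fitted values at the cutoff as the treatment effect."""
    left = [(xi, yi) for xi, yi in zip(x, y) if xi < cutoff]
    right = [(xi, yi) for xi, yi in zip(x, y) if xi >= cutoff]
    a_l, b_l = ols_line([p[0] for p in left], [p[1] for p in left])
    a_r, b_r = ols_line([p[0] for p in right], [p[1] for p in right])
    return (a_r + b_r * cutoff) - (a_l + b_l * cutoff)
```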

Publication Date
Wed Oct 17 2018
Journal Name
Journal of Economics and Administrative Sciences
New Robust Estimation in the Compound Exponential Weibull-Poisson Distribution for Both Contaminated and Non-Contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (non-contaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and the contaminated data cases.
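The appeal of the Downhill Simplex (Nelder-Mead) algorithm here is that it minimizes the negative log-likelihood without derivatives. As a minimal illustration — with a two-parameter Weibull likelihood standing in for the four-parameter compound exponential Weibull-Poisson density, which is not given in this abstract — a bare-bones simplex minimizer looks like:

```python
import math

def nelder_mead(f, start, step=0.5, iters=200):
    """Minimal downhill-simplex (Nelder-Mead) minimizer, for illustration."""
    n = len(start)
    # Initial simplex: the start point plus one perturbed vertex per dimension.
    simplex = [list(start)] + [
        [s + (step if j == i else 0.0) for j, s in enumerate(start)]
        for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(v[j] for v in simplex[:-1]) / n for j in range(n)]
        refl = [2 * c - w for c, w in zip(centroid, worst)]  # reflection
        if f(refl) < f(best):
            expd = [3 * c - 2 * w for c, w in zip(centroid, worst)]  # expansion
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [0.5 * (c + w) for c, w in zip(centroid, worst)]  # contraction
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink the whole simplex toward the best vertex
                simplex = [best] + [
                    [0.5 * (b + v) for b, v in zip(best, vert)]
                    for vert in simplex[1:]
                ]
    return min(simplex, key=f)

def neg_loglik_weibull(params, data):
    """Negative log-likelihood of a Weibull(shape k, scale lam) sample."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return float("inf")  # keep the search inside the parameter space
    return -sum(
        math.log(k / lam) + (k - 1) * math.log(x / lam) - (x / lam) ** k
        for x in data
    )
```

Maximum likelihood estimation would instead solve the score equations analytically or with a gradient-based optimizer; the simplex variant trades speed for robustness to a non-smooth or awkward likelihood surface.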

 

Publication Date
Sat Jun 30 2012
Journal Name
Al-Kindy College Medical Journal
Erythrocyte Magnesium Levels in Type I and Type II Iraqi Diabetic Patients: Effect of Antidiabetic Treatment

Background: Direct measurement of intracellular magnesium using erythrocytes has been suggested as a sensitive indicator for the estimation of body magnesium store. Marked depletion in plasma and erythrocyte magnesium levels was particularly evident in diabetic patients with advanced retinopathy and poor diabetic control. While insulin has been shown to stimulate erythrocyte magnesium uptake, hyperglycemia per se suppressed intracellular magnesium in normal human red cells.
Aim of the study: To investigate the erythrocyte magnesium level in Iraqi type I and II diabetic patients, with specific emphasis on the effects of both metabolic control and the type of antidiabetic treatment.
Methods: Sixty-two diabetic patients (7 with type…

Publication Date
Tue Oct 01 2013
Journal Name
Proceedings of the International Astronomical Union
The infrared <i>K</i>-band identification of the DSO/G2 source from VLT and Keck data
Abstract: A fast-moving infrared excess source (G2), widely interpreted as a core-less gas and dust cloud, approaches Sagittarius A* (Sgr A*) on a presumably elliptical orbit. VLT <i>K<sub>s</sub></i>-band and Keck <i>K</i>′-band data result in clear continuum identifications and proper motions of this ∼19<sup><i>m</i></sup> Dusty S-cluster Object (DSO). In 2002-2007 it was confused with the star S63, but it has been free of confusion since 2007. Its near-infrared (NIR) colors and a comparison to other sources in the field speak in favor of the DSO being an IR excess star with photospheric continuum emission at 2 microns rather than a…
Publication Date
Tue Oct 23 2018
Journal Name
Journal of Economics and Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data arose from the circumstances of our dear country and the horrors of war, which resulted in the loss of much important data in all aspects of economic, natural, health, and scientific life. The reasons for missingness differ: some are beyond the control of those concerned, while others are planned, whether because of cost or risk or because of the lack of means of inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods via simulation. The variables of child health and the variables affecting children's health were taken into account: breastfeed…
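Principal-component imputation alternates between filling the missing cells and refitting a low-rank approximation. A pure-Python sketch of the rank-1 case follows — illustrative only; the study's actual PCA/probabilistic-PCA pipeline and its survey variables are not reproduced, and the power-iteration SVD here is the simplest possible stand-in.

```python
def impute_rank1(data, n_iter=50):
    """Fill None entries: start from column means, then repeatedly refit a
    rank-1 approximation (leading principal component via power iteration)
    and overwrite only the missing cells with its reconstruction."""
    rows, cols = len(data), len(data[0])
    mask = [[v is None for v in row] for row in data]
    # Column means over the observed entries only.
    means = []
    for j in range(cols):
        obs = [data[i][j] for i in range(rows) if not mask[i][j]]
        means.append(sum(obs) / len(obs))
    X = [
        [means[j] if mask[i][j] else data[i][j] for j in range(cols)]
        for i in range(rows)
    ]
    for _ in range(n_iter):
        # Center the columns of the current completed matrix.
        mu = [sum(X[i][j] for i in range(rows)) / rows for j in range(cols)]
        C = [[X[i][j] - mu[j] for j in range(cols)] for i in range(rows)]
        # Power iteration for the leading right singular vector v.
        v = [1.0] * cols
        for _ in range(20):
            u = [sum(C[i][j] * v[j] for j in range(cols)) for i in range(rows)]
            w = [sum(C[i][j] * u[i] for i in range(rows)) for j in range(cols)]
            norm = sum(x * x for x in w) ** 0.5 or 1.0
            v = [x / norm for x in w]
        scores = [sum(C[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        # Overwrite the missing cells with the rank-1 reconstruction.
        for i in range(rows):
            for j in range(cols):
                if mask[i][j]:
                    X[i][j] = mu[j] + scores[i] * v[j]
    return X
```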

Publication Date
Mon Jun 05 2023
Journal Name
Journal of Economics and Administrative Sciences
Estimating the Population Mean in Stratified Random Sampling Using Combined Regression with the Presence of Outliers

In this research, combined regression estimates were used to estimate the population mean in stratified random sampling. Combined regression estimates employing robust variance-covariance matrix estimates were compared with those employing the traditional variance-covariance matrix estimates when estimating the regression parameter, using two criteria: relative efficiency (RE) and mean squared error (MSE). We found that the robust covariance matrix estimates (MCD, MVE) significantly improved the quality of the combined regression estimates by reducing the effect of outliers when estimating the regression parameter. In addition, the results of the simulation study proved…
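The combined regression estimator itself has a compact form: the stratified sample means of y and x plus a slope correction toward the known population mean of x. A sketch with a single slope pooled across strata — the robust MCD/MVE covariance step from the study is replaced here by ordinary sample covariances, and the stratum weights and data in the test are hypothetical:

```python
def _mean(v):
    return sum(v) / len(v)

def _cov(x, y):
    """Sample covariance with the n-1 denominator."""
    mx, my = _mean(x), _mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def combined_regression_mean(strata, X_bar):
    """Combined regression estimate of the population mean of y in
    stratified random sampling: y_st + b * (X_bar - x_st).

    strata: list of dicts with keys W (stratum weight summing to 1),
    x, y (paired within-stratum samples); X_bar: known population mean of x.
    """
    y_st = sum(s["W"] * _mean(s["y"]) for s in strata)
    x_st = sum(s["W"] * _mean(s["x"]) for s in strata)
    # One slope pooled across strata from weighted covariances/variances.
    num = sum(s["W"] * _cov(s["x"], s["y"]) for s in strata)
    den = sum(s["W"] * _cov(s["x"], s["x"]) for s in strata)
    b = num / den
    return y_st + b * (X_bar - x_st)
```

Swapping in robust covariance estimates would change only how `num` and `den` are computed.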

Publication Date
Fri Sep 30 2022
Journal Name
Journal of Economics and Administrative Sciences
Estimation of Reliability through the Wiener Degradation Process Based on the Genetic Algorithm for Estimating Parameters

In this paper, the researcher suggests using the Genetic Algorithm to estimate the parameters of the Wiener degradation process, in order to estimate the reliability of high-efficiency products, whose reliability is difficult to estimate using traditional techniques that depend only on product failure times. Monte Carlo simulation was applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with maximum likelihood estimation. The results show that the Genetic Algorithm method is the best according to the AMSE comparison criterion, then the reliab…
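Whatever method estimates the parameters, the reliability implied by a Wiener degradation path X(t) = μt + σB(t) with failure threshold ω is the inverse-Gaussian survival function of the first-passage time. A sketch of that closed form — the genetic-algorithm estimation step itself is omitted, and μ, σ, ω here are assumed inputs rather than values from the paper:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def reliability_wiener(t, mu, sigma, omega):
    """R(t) = P(first passage of X(s) = mu*s + sigma*B(s) over omega > t),
    i.e. the inverse-Gaussian survival function of the lifetime."""
    s = sigma * math.sqrt(t)
    return norm_cdf((omega - mu * t) / s) - math.exp(
        2.0 * mu * omega / sigma ** 2
    ) * norm_cdf(-(omega + mu * t) / s)
```

Estimating (μ, σ) from degradation increments — by genetic search or maximum likelihood — and plugging them in gives the reliability curve without waiting for actual failures.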

Publication Date
Sat Sep 11 2010
Journal Name
Journal of Al-Nahrain University
ESTIMATION OF LAP ACTIVITY IN PATIENTS WITH TYPE 2 DIABETES USING LEUCINE AMIDE AS SUBSTRATE

This study was performed on 50 serum specimens from patients with type 2 diabetes; in addition, 50 normal specimens were investigated as a control group. The mean LAP activity was (560.46 ± 10.504) I.U./L in patients and (10.58 ± 4.39) I.U./L in healthy subjects. The results reveal that serum leucine aminopeptidase (LAP) activity in type 2 diabetes patients shows a highly significant increase (p < 0.001) compared to healthy subjects. In addition, leucine amide was prepared as a substrate for LAP and identified by melting point and FTIR spectra. K…

Publication Date
Fri Jul 01 2016
Journal Name
Journal of Economics and Administrative Sciences
Comparison of some wavelet estimation methods for a nonparametric regression function with a response variable missing at random

Abstract

The problem of missing data represents a major obstacle for researchers in the process of data analysis, since it recurs in all fields of study, including social, medical, astronomical, and clinical experiments.

The presence of such a problem in the data under study may negatively affect the analysis and lead to misleading conclusions that carry a large bias. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy…
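Wavelet regression estimators shrink the detail coefficients of the (possibly imputed) responses. A minimal one-level Haar sketch with hard thresholding follows; the naive neighbour-average imputation here is a hypothetical stand-in for the missing-response handling compared in the paper, not any of its actual methods.

```python
def fill_missing(y):
    """Naive completion of missing (None) responses by the mean of the
    nearest observed neighbours - a stand-in for the imputation step."""
    out = list(y)
    for i, v in enumerate(out):
        if v is None:
            left = next(
                (out[j] for j in range(i - 1, -1, -1) if out[j] is not None), None
            )
            right = next(
                (out[j] for j in range(i + 1, len(out)) if out[j] is not None), None
            )
            obs = [w for w in (left, right) if w is not None]
            out[i] = sum(obs) / len(obs)
    return out

def haar_forward(signal):
    """One-level Haar transform: (approximation, detail) coefficients."""
    a = [(signal[2 * i] + signal[2 * i + 1]) / 2 ** 0.5
         for i in range(len(signal) // 2)]
    d = [(signal[2 * i] - signal[2 * i + 1]) / 2 ** 0.5
         for i in range(len(signal) // 2)]
    return a, d

def haar_inverse(a, d):
    """Invert the one-level Haar transform."""
    out = []
    for ai, di in zip(a, d):
        out += [(ai + di) / 2 ** 0.5, (ai - di) / 2 ** 0.5]
    return out

def wavelet_denoise(y, threshold):
    """Hard-threshold the detail coefficients and reconstruct."""
    a, d = haar_forward(y)
    d = [di if abs(di) > threshold else 0.0 for di in d]
    return haar_inverse(a, d)
```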

Publication Date
Sat Feb 01 2025
Journal Name
Algorithms
Three-Dimensional Object Recognition Using Orthogonal Polynomials: An Embedded Kernel Approach

Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on various signal preprocessing techniques; therefore, developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations have been used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to support segmentation and image feature improvement. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the signal with a smoothing kernel. In addition, orthogonal moments (OMs) are a cruc…
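Kernel smoothing of a 1-D signal is exactly the convolution described above. A small sketch with a hypothetical triangular kernel [1, 2, 1] (not a kernel named by the paper), renormalizing the weights at the signal edges:

```python
def smooth(signal, kernel):
    """Convolve a 1-D signal with a smoothing kernel, renormalizing the
    kernel weights over the valid overlap at the edges."""
    k = len(kernel) // 2  # half-width of the (odd-length) kernel
    out = []
    for i in range(len(signal)):
        acc = wsum = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - k
            if 0 <= idx < len(signal):
                acc += w * signal[idx]
                wsum += w
        out.append(acc / wsum)
    return out
```

A constant signal passes through unchanged, while an isolated spike is spread over its neighbours — the noise-reduction effect the passage refers to.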

... Show More