Regression Discontinuity (RD) is a study design in which a defined group is exposed to the effect of a treatment. Its distinguishing feature is that the study population is divided into two groups according to a specific threshold, or cutoff point, which is fixed in advance by the terms and requirements of the study. Attention was therefore focused on the problem of workers' retirement, proposing a scenario in which an end-of-service reward is granted to fill the gap (the discontinuity point) that would otherwise remain. The regression discontinuity method was used to estimate the effect of the end-of-service reward at the cutoff for insured workers, as well as the resulting increase in revenues. The research showed that this reward has a clear effect on increasing revenues, owing to workers' regularity and continuity in their work. It also applied a Local Linear Smoother (LLS) with three bandwidth-selection models. The results of the analysis in the regression program were as follows: the CCT (Calonico, Cattaneo & Titiunik) bandwidth gave the best performance, followed by local linear regression using the IK (Imbens and Kalyanaraman) bandwidth. Real data with a sample size of 71 were used, with the compensation as the explanatory variable X and the revenue as the dependent (response) variable Y, while the results of the traditional OLS estimation method were not good enough.
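The core of the local linear RD estimate described above can be sketched as follows. This is a minimal illustration with synthetic data, not the study's data or its bandwidth selectors: separate linear fits are made on each side of the cutoff within a bandwidth `h`, and the treatment effect is the jump between the two fitted values at the cutoff. The names `cutoff` and `h` and the data-generating process are illustrative assumptions.

```python
# Hedged sketch of a sharp RD estimate via local linear regression.
# Synthetic data only; bandwidth h is fixed by hand here, whereas the
# study selects it with the CCT and IK procedures.
import numpy as np

def rd_local_linear(x, y, cutoff, h):
    """Sharp RD effect: difference of side-specific linear fits at the cutoff."""
    left = (x >= cutoff - h) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + h)
    # Fit y = a + b*(x - cutoff) on each side; the intercept a is the
    # fitted value exactly at the cutoff.
    b_l, a_l = np.polyfit(x[left] - cutoff, y[left], 1)
    b_r, a_r = np.polyfit(x[right] - cutoff, y[right], 1)
    return a_r - a_l  # jump in the outcome at the cutoff

# Synthetic example with a true jump of 2.0 at cutoff 0
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = 1.0 + 0.5 * x + 2.0 * (x >= 0) + rng.normal(0, 0.1, 500)
effect = rd_local_linear(x, y, cutoff=0.0, h=0.5)
print(round(effect, 2))  # close to the true jump of 2.0
```

In practice the choice of `h` drives the bias-variance trade-off, which is exactly why the abstract compares bandwidth selectors rather than fixing one.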
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
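The Downhill Simplex (Nelder-Mead) approach to maximum likelihood estimation can be illustrated as below. For brevity this sketch fits a two-parameter Weibull on synthetic data rather than the four-parameter exponential Weibull-Poisson of the study; the mechanics are the same: minimise the negative log-likelihood with a derivative-free simplex search. All names and values here are illustrative assumptions.

```python
# Hedged sketch: MLE via the Downhill Simplex (Nelder-Mead) algorithm,
# applied to a plain Weibull for illustration, not the study's distribution.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
data = weibull_min.rvs(c=1.5, scale=2.0, size=400, random_state=rng)

def nll(theta):
    """Negative log-likelihood of a two-parameter Weibull."""
    c, scale = theta
    if c <= 0 or scale <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    return -weibull_min.logpdf(data, c=c, scale=scale).sum()

res = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
c_hat, scale_hat = res.x
print(round(c_hat, 2), round(scale_hat, 2))  # near the true values 1.5, 2.0
```

Because Nelder-Mead needs no gradients, it tolerates likelihoods that are awkward to differentiate, which is one reason it can outperform derivative-based maximum likelihood routines on compound distributions.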
The dynamic thermomechanical properties, sealing ability, and voids formation of an experimental obturation hydroxyapatite-reinforced polyethylene (HA/PE) composite/carrier system were investigated and compared with those of a commercial system [GuttaCore (GC)]. The HA/PE system was specifically designed using a melt-extrusion process. The viscoelastic properties of HA/PE were determined using a dynamic thermomechanical analyser. Human single-rooted teeth were endodontically instrumented and obturated using HA/PE or GC systems, and then sealing ability was assessed using a fluid filtration system. In addition, micro-computed tomography (μCT) was used to quantify apparent voids within the root-canal space. The data were statistically analys
Incremental Sheet Metal Forming (ISMF) is a modern sheet metal forming technology that offers the possibility of manufacturing complex 3D parts from thin sheet metal using a CNC milling machine. Surface quality is a very important aspect of any manufacturing process. Therefore, this study focuses on the residual stresses produced by the forming parameters, namely tool shape, step over, feed rate, and slope angle, using the Taguchi method for products formed by the single point incremental forming (SPIF) process. To evaluate surface quality, practical experiments producing a pyramid-like shape were carried out on aluminum sheets (AA1050) with a thickness of 0.9 mm. Three types of tool shape were used in this work: the spherical tool ga
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noise pixels and then replaces them with an optimal median value according to a criterion that maximizes a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (original and improved median filter) for removing noise from images. The simulation was carried out in MATLAB R2019b, and the resul
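The detect-then-replace idea behind such filters can be sketched as follows. This is the plain baseline, assuming 8-bit images where salt-and-pepper noise shows up as extreme values 0 or 255; the crow-search optimisation step of the proposed OMF filter is omitted, and the neighbourhood size is an illustrative choice.

```python
# Hedged sketch of selective salt-and-pepper filtering: only pixels at the
# intensity extremes are treated as noise candidates and replaced by the
# median of their 3x3 neighbourhood. Not the paper's optimised OMF filter.
import numpy as np

def median_filter_sp(img):
    """Replace suspected salt (255) / pepper (0) pixels with a 3x3 median."""
    padded = np.pad(img, 1, mode="edge")  # edge padding handles borders
    out = img.copy()
    noisy = (img == 0) | (img == 255)
    for r, c in zip(*np.nonzero(noisy)):
        out[r, c] = np.median(padded[r:r + 3, c:c + 3])
    return out

img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255  # inject a single salt pixel
clean = median_filter_sp(img)
print(clean[2, 2])  # 100: the salt pixel is restored to the local median
```

Replacing only the detected noise pixels, rather than filtering every pixel, is what preserves edges and fine detail; the optimisation layer in the paper refines which value is substituted.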
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular of these is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying an individual's physiological or behavioral attributes. Keystroke authentication is a new behavioral access control system that identifies legitimate users via their typing behavior. The objective of this paper is to provide user
Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in accurate segmentation of the iris from the origin
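The bit-plane decomposition step mentioned above can be illustrated as follows. An 8-bit grayscale image is split into its 8 binary planes, and the most significant planes, which carry most of the coarse structure, are kept. The tiny synthetic image and the choice of which planes count as "significant" are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch of bit-plane decomposition of an 8-bit grayscale image.
import numpy as np

def bit_planes(img):
    """Return the 8 binary bit planes of an 8-bit image, bit 0 first."""
    return [((img >> k) & 1).astype(np.uint8) for k in range(8)]

img = np.array([[200, 30], [130, 5]], dtype=np.uint8)
planes = bit_planes(img)
msb = planes[7]  # most significant plane (bit value 128)
print(msb.tolist())  # [[1, 0], [1, 0]]: only pixels >= 128 survive
```

Because the top planes behave like coarse thresholdings of the image, restricting further processing to them reduces computation, which supports the real-time goal stated in the abstract.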