Abstract
The binary logistic regression model is used in data classification and is a stronger and more flexible tool than linear regression when the response variable is binary. In this research, several classical methods were used to estimate the parameters of the binary logistic regression model, including the maximum likelihood method, the minimum chi-square method, and weighted least squares, together with Bayes estimation, in order to select the best estimation method under default parameter values according to two different general linear regression models and different s…
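As a point of reference, the maximum likelihood estimator mentioned above can be obtained numerically by maximizing the Bernoulli log-likelihood of the logistic model. The following is a minimal sketch, not the paper's own code; the simulated data, sample size, and starting values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: an intercept and one covariate (assumed, not from the paper).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_beta = np.array([0.5, -1.0])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)

def neg_log_likelihood(beta):
    """Negative Bernoulli log-likelihood: -sum[ y*eta - log(1 + exp(eta)) ]."""
    eta = X @ beta
    return -np.sum(y * eta - np.logaddexp(0.0, eta))

# Maximum likelihood estimate via numerical optimization.
mle = minimize(neg_log_likelihood, x0=np.zeros(2), method="BFGS")
print("ML estimates:", mle.x)
```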
In this research, we studied the Non-Homogeneous Poisson process, one of the important statistical topics with a role in scientific development, since it relates to events that occur in reality and are modeled as Poisson processes, because the occurrence of such events depends on time, whether time changes or remains stable. Our research clarifies the Non-Homogeneous Poisson process and uses one of its models, the exponentiated Weibull model with three parameters (α, β, σ), as an intensity function to estimate the rate of occurrence of earthquakes over time in Erbil Governorate, as the governorate is adjacent to two countries.
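For orientation, a common way to work with such a model is to take the hazard of the exponentiated Weibull distribution as the intensity λ(t) of the Non-Homogeneous Poisson process and to simulate event times by thinning. The sketch below assumes the standard parameterization F(t) = [1 − exp(−(t/σ)^β)]^α; the parameter values, time horizon, and intensity bound are illustrative assumptions, not estimates from the paper.

```python
import numpy as np

def exp_weibull_cdf(t, alpha, beta, sigma):
    """CDF of the exponentiated Weibull distribution, F(t) = [1 - exp(-(t/sigma)**beta)]**alpha."""
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def exp_weibull_pdf(t, alpha, beta, sigma):
    """PDF obtained by differentiating the CDF above."""
    g = 1.0 - np.exp(-(t / sigma) ** beta)
    return (alpha * beta / sigma) * (t / sigma) ** (beta - 1) * np.exp(-(t / sigma) ** beta) * g ** (alpha - 1)

def intensity(t, alpha, beta, sigma):
    """Hazard rate f(t)/(1 - F(t)), used here as the NHPP intensity lambda(t)."""
    return exp_weibull_pdf(t, alpha, beta, sigma) / (1.0 - exp_weibull_cdf(t, alpha, beta, sigma))

def simulate_nhpp(alpha, beta, sigma, horizon, lam_max, seed=0):
    """Thinning: propose events at constant rate lam_max (which must bound lambda(t)
    on [0, horizon]) and accept each with probability lambda(t)/lam_max."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > horizon:
            return np.array(events)
        if rng.uniform() < intensity(t, alpha, beta, sigma) / lam_max:
            events.append(t)

# Illustrative parameter values only (not estimates from the paper).
print(simulate_nhpp(alpha=1.5, beta=1.2, sigma=50.0, horizon=100.0, lam_max=0.05))
```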
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation; therefore, our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity increases the level of the ciphering process. Moreover, it shifts the operation by only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the execution time. The W-method deals with…
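One plausible reading of the key-generation step is sketched below: two 64-bit blocks (one taken from a DES key, one from an AES key) are concatenated into a 128-bit root key, and each subsequent round key is derived by rotating the previous key one bit to the right. This is only an illustration of that description with placeholder byte strings; it is not the published W-method code.

```python
# Illustrative key-combination sketch (an assumed interpretation, not the published W-method).

def rotate_right_128(key: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by the given number of bits."""
    mask = (1 << 128) - 1
    return ((key >> bits) | (key << (128 - bits))) & mask

def build_round_keys(des_part: bytes, aes_part: bytes, rounds: int = 15):
    """Concatenate 64 bits from DES and 64 bits from AES into a 128-bit root key,
    then derive the remaining keys by a one-bit right rotation each round."""
    assert len(des_part) == 8 and len(aes_part) == 8   # 64 bits each
    root = int.from_bytes(des_part + aes_part, "big")  # 128-bit root key
    keys = [root]
    for _ in range(rounds):
        keys.append(rotate_right_128(keys[-1]))
    return keys

# Placeholder inputs purely for demonstration.
round_keys = build_round_keys(b"\x01" * 8, b"\x02" * 8)
print(hex(round_keys[0]), hex(round_keys[1]))
```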
Single Point Incremental Forming (SPIF) is a sheet-forming technique based on layered manufacturing principles. The sheet part is locally deformed through horizontal slices. The moving locus of the forming tool (called the toolpath) through these slices, which builds up the finished part, was executed by CNC technology. The toolpath was created directly from the CAD model of the final product. The forming tool is a ball-end tool that moves along the toolpath while the edges of the sheet material are clamped rigidly in a fixture.
This paper presents an investigation of the thinning distribution of conical shapes produced by incremental forming, together with a finite element validation, to evaluate the limits of the p…
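A common first-order check on the thinning discussed in such studies is the sine law, t_f = t_0·sin(90° − α) (equivalently t_0·cos α), where α is the wall angle; the finite element results of the paper may of course deviate from it. A tiny illustrative computation follows, with an assumed blank thickness and wall angle.

```python
import math

def sine_law_thickness(t0_mm: float, wall_angle_deg: float) -> float:
    """Predicted wall thickness after SPIF by the sine law: t_f = t0 * sin(90 - alpha)."""
    return t0_mm * math.sin(math.radians(90.0 - wall_angle_deg))

# Assumed example values: 1.0 mm blank, 45 degree wall angle.
print(f"{sine_law_thickness(1.0, 45.0):.3f} mm")  # about 0.707 mm
```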
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers, and the MLE loses its advantages because of their bad influence. To address this problem, new statistical methods that are not affected by outliers have been developed; these methods possess robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for obtaining more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimates and the strength of the estimation through the maximum weighted trimmed likelihood (MWTL). In order to perform t…
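As background, the trimmed likelihood idea amounts to maximizing the sum of only the h largest per-observation log-likelihood contributions, so that extreme points are excluded from the fit. Below is a minimal sketch for a normal location-scale model; the data, trimming fraction, and model are illustrative assumptions rather than the paper's setting.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
# Clean data plus a few gross outliers (assumed example).
x = np.concatenate([rng.normal(10.0, 2.0, 95), np.array([60.0, 70.0, 80.0, 90.0, 100.0])])
h = int(0.90 * len(x))   # keep the 90% best-fitting observations

def neg_trimmed_loglik(theta):
    """Negative of the sum of the h largest per-observation log-likelihoods."""
    mu, log_sigma = theta
    ll = norm.logpdf(x, loc=mu, scale=np.exp(log_sigma))
    return -np.sum(np.sort(ll)[-h:])    # trim the smallest contributions (the outliers)

mtl = minimize(neg_trimmed_loglik, x0=np.array([np.median(x), 0.0]), method="Nelder-Mead")
print("MTL estimates (mu, sigma):", mtl.x[0], np.exp(mtl.x[1]))
```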
The two-dimensional transient heat conduction through a thermal insulation with temperature-dependent thermal properties is investigated numerically using the finite volume method (FVM). The insulating material is assumed to be initially at a uniform temperature. It is then suddenly subjected at its inner surface to a step change in temperature, and at its outer surface to a natural convection boundary condition associated with a periodic change in ambient temperature and in the solar radiation heat flux. Two thermal insulation materials were selected. A fully implicit scheme is used for the time discretization, and the arithmetic mean thermal conductivity is chosen as the value of the approximated thermal conductivity at the interface.
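To illustrate the two discretization choices named above (fully implicit time stepping and arithmetic-mean conductivity at control-volume faces), here is a compact one-dimensional sketch with a temperature-dependent conductivity. The material law, grid, and boundary values are assumed purely for illustration; the paper's two-dimensional formulation is not reproduced.

```python
import numpy as np

# Assumed linear conductivity law k(T) = k0 + k1*T (illustrative, not the paper's material data).
k0, k1 = 0.04, 1e-4
def k(T):
    return k0 + k1 * T

N, L = 20, 0.05                     # control volumes, slab thickness [m]
dx = L / N
rho_c = 1.2e5                       # volumetric heat capacity [J/(m^3 K)] (assumed)
dt = 1.0                            # time step [s]
T = np.full(N, 20.0)                # initial uniform temperature [C]
T_hot, T_amb, h_conv = 100.0, 20.0, 10.0   # inner step change, ambient, convection coefficient

for step in range(600):
    k_node = k(T)                                   # conductivity evaluated at old temperatures
    k_face = 0.5 * (k_node[:-1] + k_node[1:])       # arithmetic mean at interior faces
    A = np.zeros((N, N))
    b = rho_c * dx / dt * T.copy()
    for i in range(N):
        aP = rho_c * dx / dt
        if i > 0:
            A[i, i - 1] = -k_face[i - 1] / dx
            aP += k_face[i - 1] / dx
        else:                                       # inner surface: fixed temperature T_hot
            aP += 2 * k_node[0] / dx
            b[i] += 2 * k_node[0] / dx * T_hot
        if i < N - 1:
            A[i, i + 1] = -k_face[i] / dx
            aP += k_face[i] / dx
        else:                                       # outer surface: convection to ambient
            aP += h_conv
            b[i] += h_conv * T_amb
        A[i, i] = aP
    T = np.linalg.solve(A, b)                       # fully implicit update

print("Temperature profile after 10 min:", np.round(T, 1))
```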
Abstract
The current research aims to examine the effect of the Adi and Shayer model on the achievement of fifth-grade students and their attitudes toward history. To achieve the research objective, the researcher adopted two null hypotheses: 1) there is no statistically significant difference at the (0.05) level between the average scores of the experimental group students, who study the history of Europe and modern American history according to the Adi and Shayer model, and the average scores of the control group students, who study the same material by the traditional method, on the post-achievement test; 2) there is no statistically significant difference at the (0.05) level…
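Hypotheses of this kind are typically examined with an independent-samples comparison of group means. The following is a generic sketch with made-up scores, not the study's data or necessarily its exact test.

```python
from scipy import stats

# Made-up post-test scores for illustration only (not the study's data).
experimental = [34, 38, 31, 40, 36, 35, 39, 33, 37, 41]
control      = [29, 32, 30, 28, 33, 31, 27, 30, 32, 29]

# Two-sided independent-samples t-test at the 0.05 significance level.
t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("Reject H0 at 0.05" if p_value < 0.05 else "Fail to reject H0 at 0.05")
```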
Suzanne Collins' novel The Hunger Games suggests a new logic of victory and places a distinctive focus on the unique personality of her heroine, which brings to mind the permanent correlation between all moral values. The world of The Hunger Games seems much like one big bowl, as it links the past, the present, and the future. An intertextual reference is interwoven in the present research, as it brings Golding's Lord of the Flies to the surface and highlights certain similarities between the two texts, in which Ralph, Piggy, and Simon in Golding's Lord of the Flies are incarnations of stable moral values and of the hope that ethics and rules survive in a chaotic and turbulent world. The event…
Deep learning algorithms have recently achieved considerable success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple types of images (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning was used, followed by fine-tuning. Pre-trained architectures trained on the well-known ImageNet image database were employed: the VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consisted of five classes, including a SAR image class (houses) and non-SAR image classes (cats, dogs, horses, and humans). The Conv…
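A typical way to set up the feature-extractor arrangement described here is to take the ImageNet-pretrained VGG16 without its top layers, freeze it, and train a small new classifier head for the five classes. The sketch below follows that common Keras pattern; the input size, head architecture, and training settings are assumptions, not the paper's exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# ImageNet-pretrained VGG16 as a frozen feature extractor (common transfer-learning pattern).
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False   # feature extraction: keep the pretrained convolutional weights fixed

# New classifier head for the five classes (houses, cats, dogs, horses, humans); sizes assumed.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(5, activation="softmax"),
])

model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
# For fine-tuning, the last convolutional block could later be unfrozen
# and training continued with a small learning rate.
```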
In the present study, MCM-41 was synthesized by the sol-gel technique as a carrier for drugs that are poorly soluble in water. Textural and chemical characterizations of MCM-41 were carried out by X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), and thermal gravimetric analysis (TGA). The experimental results confirmed the mesoporous nature of the MCM-41 carrier, with a maximum drug loading efficiency in MCM-41 of 90.74%. The release of NYS was carefully studied in simulated body fluid (SBF) at pH 7.4, and the results showed that the release of NYS from MCM-41 reached 87.79% after 18 hr. The release data were found to follow a Weibull model with a correlation coefficient of 0.995. The Historical…
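The Weibull model referred to here is commonly written for dissolution data as F(t) = 1 − exp(−(t/a)^b), where F(t) is the fraction released at time t. A minimal curve-fitting sketch is given below; the time points and release values are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, a, b):
    """Weibull dissolution model: fraction released at time t (hours)."""
    return 1.0 - np.exp(-(t / a) ** b)

# Invented example data (time in hours, cumulative fraction released), not the study's data.
t = np.array([1, 2, 4, 6, 8, 12, 18], dtype=float)
f = np.array([0.18, 0.32, 0.50, 0.62, 0.70, 0.80, 0.88])

params, _ = curve_fit(weibull_release, t, f, p0=[5.0, 1.0])
pred = weibull_release(t, *params)
r = np.corrcoef(f, pred)[0, 1]     # correlation between observed and fitted values
print(f"a = {params[0]:.2f}, b = {params[1]:.2f}, r = {r:.3f}")
```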