In lifetime processes, data often do not come from one single population; instead, they may represent several subpopulations. In such cases, a single known distribution cannot model the data. Instead, a mixture distribution is used to model the data and classify observations into subgroups. A mixture of Rayleigh distributions is well suited to lifetime processes. This paper infers the model parameters with the expectation-maximization (EM) algorithm applied to the maximum likelihood function. The technique is applied to simulated data under several scenarios, and the accuracy of estimation is examined by the average mean square error (AMSE) and the average classification success rate (ACSR). The results show that the method performed well in all simulation scenarios across the different sample sizes.
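As an illustration of the approach, here is a minimal EM sketch for a k-component Rayleigh mixture in Python with NumPy. The initialization and fixed iteration count are simplifying assumptions for exposition, not the paper's exact procedure; the weighted M-step uses the closed-form Rayleigh MLE, sigma_k^2 = sum_i(w_ik x_i^2) / (2 sum_i(w_ik)).

```python
import numpy as np

def rayleigh_pdf(x, s2):
    # Rayleigh density with scale parameter sigma^2 = s2.
    return (x / s2) * np.exp(-x ** 2 / (2 * s2))

def em_rayleigh_mixture(x, k, n_iter=200, seed=0):
    # Minimal EM for a k-component Rayleigh mixture (illustrative sketch).
    rng = np.random.default_rng(seed)
    pi = np.full(k, 1 / k)                       # mixing proportions
    s2 = rng.uniform(0.5, 2.0, k) * x.var()     # component scale parameters
    for _ in range(n_iter):
        # E-step: posterior membership probabilities (responsibilities).
        dens = pi[None, :] * rayleigh_pdf(x[:, None], s2[None, :])
        w = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted MLEs; for Rayleigh, s2_k = sum(w x^2) / (2 sum w).
        nk = w.sum(axis=0)
        pi = nk / len(x)
        s2 = (w * x[:, None] ** 2).sum(axis=0) / (2 * nk)
    return pi, s2, w

# Toy usage: two simulated subpopulations, classified by largest responsibility.
rng = np.random.default_rng(1)
x = np.concatenate([rng.rayleigh(1.0, 300), rng.rayleigh(3.0, 300)])
pi, s2, w = em_rayleigh_mixture(x, k=2)
labels = w.argmax(axis=1)   # assignment of each observation to a subgroup
```

Classifying each observation by its largest responsibility, as in the last line, is what an ACSR-type criterion would then score against the true component labels.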
The purpose of this study is to develop and assess the effectiveness of exercises using heavy and hanging ropes for handball players, focusing on enhancing specific physical abilities and shooting accuracy. The research addresses a gap in training methodologies by comparing the effects of heavy rope exercises with those of hanging rope exercises. An experimental design with two equal groups and pre- and post-testing was used. The study involved 16 players from the School of Handball for the 2022–2023 season. The sample comprised 14 players, who were randomly divided into two experimental groups of 7 each. The first group performed heavy rope exercises, while hanging rope exercises were included in the plan of the second group.
This study proposed using color components as artificial intelligence (AI) input to predict milk moisture and fat contents. To this end, an adaptive neuro-fuzzy inference system (ANFIS) was applied to milk processed by moderate electrical field-based non-thermal pasteurization (NP) and by conventional pasteurization (CP). The differences between predicted and experimental data were not significant.
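For readers unfamiliar with ANFIS, the sketch below shows a minimal first-order Sugeno-type forward pass in Python: Gaussian membership functions, product firing strengths, and linear consequents fitted by least squares (the LSE half of the classic hybrid-learning rule). All data, rule counts, and parameter values are invented for illustration and do not reproduce the study's model.

```python
import numpy as np

def gauss(x, c, s):
    # Gaussian membership function.
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def firing_strengths(X, centers, sigmas):
    # centers, sigmas: (n_rules, n_features); one Gaussian per input per rule.
    # Rule firing strength = product of its memberships (Sugeno AND).
    mships = gauss(X[:, None, :], centers[None], sigmas[None])   # (n, r, d)
    w = mships.prod(axis=2)                                      # (n, r)
    return w / w.sum(axis=1, keepdims=True)                      # normalized

def fit_consequents(X, y, wbar):
    # Each rule output is linear: f_k(x) = a_k . x + b_k.
    # Overall output = sum_k wbar_k * f_k(x), which is linear in (a_k, b_k),
    # so the consequents solve one global least-squares problem.
    n, r = wbar.shape
    Xb = np.hstack([X, np.ones((n, 1))])                         # (n, d+1)
    design = (wbar[:, :, None] * Xb[:, None, :]).reshape(n, -1)  # (n, r*(d+1))
    theta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return theta, design

# Toy usage with synthetic "color component" inputs (3 features, 4 rules).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))                 # stand-in for color data
y = 80 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 0.5, 200)  # e.g., % moisture
centers = rng.uniform(0, 1, size=(4, 3))
sigmas = np.full((4, 3), 0.4)
wbar = firing_strengths(X, centers, sigmas)
theta, design = fit_consequents(X, y, wbar)
print("train RMSE:", np.sqrt(np.mean((design @ theta - y) ** 2)))
```

A full ANFIS would also tune the membership centers and widths by gradient descent; this sketch keeps them fixed to stay short.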
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has long been a significant topic. Digital processing focuses mainly on measuring the fiducial points that mark the beginning and end of the P, QRS, and T waves, based on their waveform properties. Unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to identify these reference points accurately, resulting in suboptimal performance. The work proceeds through several primary stages built on preliminary processing of the ECG electrical signal via a set of steps, starting with preparing the raw data and converting it into files.
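As a concrete example of this kind of preliminary signal processing, below is a minimal Pan-Tompkins-style R-peak detector in Python using SciPy. It is an illustrative sketch, not the paper's pipeline; the filter band, window length, and threshold are assumed values that typically need tuning per dataset.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs):
    # Band-pass 5-15 Hz to emphasize the QRS complex and suppress
    # baseline wander and T-wave energy (Pan-Tompkins-style front end).
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # Differentiation plus squaring highlights the steep QRS slopes.
    energy = np.gradient(filtered) ** 2
    # Moving-window integration (~150 ms) smooths the energy signal.
    win = int(0.150 * fs)
    mwi = np.convolve(energy, np.ones(win) / win, mode="same")
    # Fixed threshold plus a ~200 ms refractory period between beats;
    # production detectors adapt the threshold over time instead.
    peaks, _ = find_peaks(mwi, height=0.35 * mwi.max(), distance=int(0.2 * fs))
    return peaks
```

Once R peaks are located, the P- and T-wave boundaries are usually searched in windows before and after each QRS complex, which is why accurate R-peak detection is the critical first stage.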
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely kernel estimation for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four boundary kernels are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function is employed with all of the kernel functions. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel functions.
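For illustration, here is a minimal sketch of a smoothed (Ramlau-Hansen-type) kernel hazard estimator with a global bandwidth and the Epanechnikov kernel; the estimator form and the no-ties simplification are assumptions for exposition, not the paper's exact proposal.

```python
import numpy as np

def epanechnikov(u):
    # Epanechnikov kernel, one of the boundary kernels studied here.
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_hazard(t_grid, times, events, b):
    # Smooth the Nelson-Aalen increments dA(T_i) = d_i / Y(T_i) with a
    # kernel of (global) bandwidth b; events is 1 for observed failures,
    # 0 for right-censored observations.
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)          # Y(T_(i)), assuming no tied times
    increments = events / at_risk       # Nelson-Aalen jump at each event
    u = (t_grid[:, None] - times[None, :]) / b
    return (epanechnikov(u) * increments[None, :]).sum(axis=1) / b
```

A local-bandwidth variant would replace the scalar b with a value b(t) chosen per grid point, which is what the comparison in the abstract is about.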
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables that may suffer from multicollinearity, instead of taking the whole number of them, say K. The shrinkage forces some coefficients to equal zero by placing a restriction on them through a tuning parameter, say t, which balances the amounts of bias and variance on one side and does not exceed the acceptable percentage of explained variance of these components. This is shown by the MSE criterion in the regression case and by the percentage of explained variance in the principal components case.
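A closely related, off-the-shelf analogue of this idea is sparse PCA, which penalizes component loadings so that some become exactly zero. The sketch below uses scikit-learn's SparsePCA as a stand-in; its alpha penalty plays a role analogous to the tuning parameter t, though this is not the paper's exact method.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Hypothetical example: shrink principal-component loadings so that some
# coefficients are exactly zero, analogous to the LASSO in regression.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)   # induce multicollinearity

# alpha controls the L1 penalty, playing the role of the tuning parameter t:
# larger alpha zeros out more loadings at the cost of explained variance.
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0)
scores = spca.fit_transform(X - X.mean(axis=0))
print(spca.components_)   # rows contain exact zeros where loadings were shrunk away
```

In practice alpha would be chosen, as the abstract describes for t, by trading off the sparsity of the loadings against the percentage of variance the retained components still explain.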
Survival analysis is one of the modern methods of analysis, based on the fact that the dependent variable represents the time until the event of interest in the study. Many survival models deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time, and a nonparametric function that depends on the survival times, which is why the Cox model is defined as a semi-parametric model. There is also a set of parametric models that depend on the parameters of the time-to-event distribution.
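As an illustration, the Cox model h(t | x) = h0(t) exp(beta' x) keeps the baseline hazard h0(t) nonparametric and estimates the parametric part beta by maximizing the partial likelihood. The sketch below uses the lifelines library on hypothetical data; the column names and values are invented for the example.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical data frame: 'time' is the survival time, 'event' is 1 if the
# event occurred (0 if censored), and the remaining columns are covariates.
df = pd.DataFrame({
    "time":  [5, 8, 12, 3, 9, 15, 7, 11],
    "event": [1, 0, 1, 1, 0, 1, 1, 0],
    "age":   [50, 62, 45, 70, 55, 40, 66, 58],
    "dose":  [1.0, 0.5, 1.5, 0.2, 1.1, 2.0, 0.4, 0.9],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")  # maximizes the partial likelihood
cph.print_summary()  # coefficients and hazard ratios for the parametric part
```

Because the partial likelihood depends only on the ordering of event times, beta can be estimated without ever specifying h0(t), which is exactly the semi-parametric property the abstract describes.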
The article presents the results of an analysis of the metaphors used to create the image of the main character of D. Rubina's story "You and Me Under the Peach Clouds": a pet, a dog named Kondraty. Through metaphorization, the author endows the image of the dog with distinctly human qualities, thereby elevating it to the status of a full member of the family. The article continues the study of the work of D. I. Rubina.