This research deals with a shrinkage method for principal components analogous to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to build uncorrelated linear combinations from only a subset of explanatory variables that may suffer from multicollinearity, instead of using all (K) of them. The shrinkage forces some coefficients to exactly zero by constraining them with a tuning parameter (t), which balances bias against variance on the one hand while keeping the percentage of explained variance of the components at an acceptable level on the other. This is demonstrated via the MSE criterion in the regression case and the percentage of explained variance in the principal component case.
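As a point of reference, here is a minimal sketch of the two constrained problems this abstract alludes to, in standard textbook form (the LASSO in regression and the SCoTLASS-style analogue for a component loading vector; the exact formulation used in the research may differ):

```latex
% LASSO: least squares under an L1 budget t
\hat{\beta} = \arg\min_{\beta}\; \sum_{i=1}^{n}\Big(y_i - \sum_{j=1}^{K} x_{ij}\beta_j\Big)^{2}
\quad \text{s.t.} \quad \sum_{j=1}^{K}\lvert\beta_j\rvert \le t

% PCA analogue: maximize the component's variance a^T S a
% (S = sample covariance matrix) under the same L1 budget,
% which drives some loadings a_j to exactly zero
\hat{a} = \arg\max_{a}\; a^{\top} S a
\quad \text{s.t.} \quad a^{\top} a = 1, \;\; \sum_{j=1}^{K}\lvert a_j\rvert \le t
```

Shrinking t below \sqrt{K} forces loadings to zero, trading a little explained variance for a sparser, more interpretable component.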
Electronic Health Record (EHR) systems are an efficient and effective means of exchanging patients’ health information with doctors and other key stakeholders in the health sector, supporting improved treatment decisions and diagnoses. As a result, questions regarding the security of sensitive user data come to the fore. To encourage people to move their sensitive health records to cloud networks, a secure authentication and access control mechanism that protects users’ data must be established. Authentication and access control schemes are essential to the protection of health data, as numerous responsibilities exist to ensure security and privacy in a network. So, the main goal of our s…
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new classification method based on the PPM (Prediction by Partial Matching) character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification, such as recognising Ekman’s six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results on three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, between texts invo…
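The paper's PPM models are not reproduced in the excerpt; the toy sketch below illustrates only the general compression-based classification idea it builds on, using zlib as a stand-in compressor for PPM and invented class corpora. The score approximates cross-entropy: the fewer extra bytes a class's corpus needs to encode the document, the better that class's model fits it.

```python
import zlib

def compressed_size(text):
    """Bytes needed to compress text with zlib (a stand-in for PPM)."""
    return len(zlib.compress(text.encode("utf-8"), 9))

def classify(document, corpora):
    """Pick the class whose corpus compresses the document best.

    Score = extra bytes needed to encode the document after the
    corpus, an approximation of the cross-entropy of the document
    under the class's compression model.
    """
    scores = {}
    for label, corpus in corpora.items():
        base = compressed_size(corpus)
        joint = compressed_size(corpus + " " + document)
        scores[label] = joint - base
    return min(scores, key=scores.get)

# Invented toy corpora; the paper uses labelled emotion datasets.
corpora = {
    "emotional": "I am so happy and thrilled! This is terrifying, I am furious.",
    "neutral": "The meeting is at 3 pm. The report covers second-quarter figures.",
}
print(classify("What a wonderful, joyful surprise!", corpora))
```

A real PPM classifier replaces zlib with an order-n PPM model trained per class and compares code lengths directly, but the decision rule is the same.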
In this article, we develop a new loss function, a modification of the linear exponential (LINEX) loss function obtained by weighting it. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function through a Monte Carlo simulation, we compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, the reliability function, and the hazard function.
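For reference, here are the standard (unweighted) LINEX loss and its Bayes estimator in the usual textbook form; the weighted variant is the article's contribution and is not reproduced in the excerpt:

```latex
% LINEX loss for the estimation error \Delta = \hat{\theta} - \theta,
% with shape parameter a \neq 0 and scale b > 0
L(\Delta) = b\left(e^{a\Delta} - a\Delta - 1\right)

% Bayes estimator under LINEX loss: minus the log of the posterior
% moment-generating function of \theta, scaled by 1/a
\hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a}\,\ln \operatorname{E}\!\left[e^{-a\theta}\mid \mathbf{x}\right]
```

For a > 0 the loss penalizes overestimation more heavily than underestimation (and vice versa), and as a tends to 0 it approaches squared error, which is why the SE-loss Bayes estimator is the natural baseline in the comparison.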
The Dagum Regression Model, introduced to address limitations of traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, as is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over alternatives such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana…
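As background, the three-parameter (type I) Dagum distribution underlying the model has the closed forms below, with shape parameters a, p > 0 and scale b > 0; the two shape parameters are what give it the heavy right tail and flexible asymmetry mentioned above:

```latex
F(x; a, b, p) = \left(1 + (x/b)^{-a}\right)^{-p}, \qquad x > 0

f(x; a, b, p) = \frac{a\,p}{x}\,
\frac{(x/b)^{a p}}{\left((x/b)^{a} + 1\right)^{p+1}}
```

The closed-form CDF inverts to the quantile function x = b\,(u^{-1/p} - 1)^{-1/a}, so inverse-transform sampling for simulation studies like the one described here is straightforward.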
In this paper, we introduce three robust fuzzy estimators of a location parameter, based on Buckley’s approach, in the presence of outliers. The estimators were compared using the variance-of-fuzzy-numbers criterion, and all of them improved on Buckley’s estimator. Among them, the fuzzy median was best for small and medium sample sizes, while for large sample sizes the fuzzy trimmed mean was best.
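Buckley's fuzzy-number machinery is not shown in the excerpt; the sketch below illustrates only the robustness idea behind the crisp counterparts of two of the estimators (median and trimmed mean) against the non-robust mean, on invented contaminated data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# A clean N(10, 1) sample plus a few gross outliers (invented data).
clean = rng.normal(loc=10.0, scale=1.0, size=95)
outliers = np.array([60.0, 75.0, 90.0, 120.0, 150.0])
sample = np.concatenate([clean, outliers])

print(f"mean        : {sample.mean():.2f}")                  # dragged upward by outliers
print(f"median      : {np.median(sample):.2f}")              # robust
print(f"10% trimmed : {stats.trim_mean(sample, 0.10):.2f}")  # robust
```

The ranking reported above (median best at small and medium n, trimmed mean best at large n) mirrors the usual trade-off: the trimmed mean discards less information than the median, so its efficiency advantage grows with sample size.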
Aleksandr Isayevich Solzhenitsyn was born in 1918 in Kislovodsk. His father was educated and, despite his peasant origin, held a university degree. Unfortunately, however, Solzhenitsyn never knew his father, who died before his son's birth.
Solzhenitsyn was accused of being an opponent of the Soviet Union because of his activities at that time. He was sentenced to eight years in a forced labour camp on the outskirts of Moscow, spent three years in Kazakhstan, and was then sentenced to exile for life. He was set free in 1956. He worked as a teacher in rural schools in Vladimir and then in Ryazan.
His greatest works appeared during the Khrushchev era, and in 1962 came his story "One Day in the Life of Ivan Denisovich"…
Financial inclusion refers to access to financial services of low cost and high quality from the formal financial sector for all segments of society, especially marginalized groups, and to the subsequent use of and benefit from those services. Financial inclusion is also associated with banking stability, as well as with financial integrity and consumer financial protection; it therefore serves a number of objectives, the most important of which is to support and enhance banking stability. This is what has recently attracted the attention of many countries and central banks.
The study aims to show the impact of financial inclusion indicators on banking stability…
Due to severe scouring, many bridges worldwide have failed. The safety of an existing bridge (after construction) therefore depends mainly on continuous monitoring of local scour at the substructure, while the safety of a bridge before construction depends mainly on sound estimation of local scour at the substructure. Local scour at bridge piers is usually estimated using the available formulae, almost all of which were derived from laboratory data, so it is essential to test their performance against field data. In this study, the performance of selected bridge scour estimation formulae was validated and sta…
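For context, a representative laboratory-derived pier scour formula of the kind the abstract refers to is the CSU/HEC-18 equation (quoted for orientation only; the excerpt does not name the specific formulae that were tested):

```latex
\frac{y_s}{y_1} = 2.0\, K_1 K_2 K_3 \left(\frac{a}{y_1}\right)^{0.65} Fr_1^{\,0.43}
```

where y_s is the local scour depth, y_1 the approach flow depth, a the pier width, Fr_1 the approach-flow Froude number, and K_1–K_3 correction factors for pier nose shape, angle of attack, and bed condition.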
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether and when it should be applied. This research developed an easy-to-use computer program to locate sanding-onset sites in the drainage area. The model is based on estimating the critical pressure drawdown at which sand production begins. The outcomes are plotted as free sand production against the critical flow rates for declining reservoir pressure. The results show that the pressure drawdown required to…
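The program's actual sanding criterion is not given in the excerpt. As a simplified illustration only (an assumption here, based on the elastic Kirsch solution for hoop stress at a cavity wall, ignoring poroelastic and perforation effects), sanding onset is often screened by comparing the maximum tangential stress with an apparent rock strength U:

```latex
% Kirsch hoop stress at the wall, with far-field stresses
% \sigma_{\max}, \sigma_{\min} and bottomhole flowing pressure p_w
\sigma_{\theta,\max} = 3\,\sigma_{\max} - \sigma_{\min} - p_w,
\qquad \text{onset when } \sigma_{\theta,\max} \ge U
```

which gives a critical bottomhole flowing pressure p_{w,\mathrm{crit}} = 3\sigma_{\max} - \sigma_{\min} - U; the critical drawdown is the gap between the reservoir pressure and p_{w,\mathrm{crit}}, and under this simplification the available drawdown window shrinks as reservoir pressure declines, consistent with the trend the abstract describes.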
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under:
• approximations in "built-in" functions;
• rounding errors in floating-point arithmetic operations;
• perturbations of data.
The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations, together with the associated condition numbers, which constitute optimal bounds on the possible cumulative round-off errors. The condition numbers permit simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori…
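The paper's a priori and a posteriori error systems are not reproduced in the excerpt; the sketch below (NumPy, with an invented ill-conditioned test matrix) illustrates only the kind of condition-number bound on the forward error of Gaussian elimination that the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# An ill-conditioned test system: the 10 x 10 Hilbert matrix.
n = 10
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = rng.standard_normal(n)
b = A @ x_true

# Gaussian elimination (LU-based) via numpy.linalg.solve.
x_hat = np.linalg.solve(A, b)

# Forward relative error versus the classical first-order bound
# ||x_hat - x|| / ||x||  <~  cond(A) * eps  (up to growth factors).
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
bound = np.linalg.cond(A) * np.finfo(float).eps
print(f"relative forward error : {rel_err:.3e}")
print(f"cond(A) * eps bound    : {bound:.3e}")
```

The condition number plays exactly the role described above: it converts a rounding-level perturbation into a quantitative bound on the forward error, and thus into a simple, general criterion for numerical stability.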