This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates and therefore must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general-purpose method for dimension reduction. Both SIR and PCA work by forming linear combinations of a subset of the original explanatory variables, which may suffer from heterogeneity and multicollinearity among most of the explanatory variables. The new linear combinations produced by the two methods reduce the explanatory variables to one or more new dimensions, called the effective dimension. The root mean square error was used to compare the methods, and a simulation study was conducted for this comparison. The simulation results showed that the proposed weighted standard SIR method is the best.
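As a rough illustration of the SIR step described above, the sketch below (Python, with illustrative names; it is not the paper's implementation and omits the proposed weighting of WSIR) whitens the predictors, slices the sorted response, and takes the leading eigenvectors of the between-slice covariance as the effective dimension-reduction directions:

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Minimal sliced inverse regression (SIR) sketch."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # Whiten the predictors: Z = Xc @ Sigma^{-1/2}
    Sigma = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ inv_sqrt
    # Slice observations by sorted response and average Z within each slice
    slices = np.array_split(np.argsort(y), n_slices)
    M = sum((len(s) / n) * np.outer(Z[s].mean(0), Z[s].mean(0)) for s in slices)
    # Leading eigenvectors of M, mapped back to the original predictor scale
    _, eigvecs = np.linalg.eigh(M)
    beta = inv_sqrt @ eigvecs[:, ::-1][:, :n_dirs]
    return beta / np.linalg.norm(beta, axis=0)
```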
In some cases, researchers need to know the causal effect of a treatment in order to judge how strongly it affects the sample and to decide whether to continue it or stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) and 1000 replications. An experiment was conducted in 2021 at the Innovation Institute for remedial lessons on 72 students participating in the institute, and their data were collected. Those who received the treatment showed an increase in their scores after …
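For concreteness, here is a minimal sketch of the local weighted least squares step in a fuzzy regression discontinuity design, assuming a triangular kernel and a fixed bandwidth h (the paper estimates the bandwidth by a local polynomial method, which is omitted); all names are illustrative:

```python
import numpy as np

def local_linear_at_cutoff(x, y, c, h, side):
    """Weighted least squares fit of y on (x - c) near the cutoff c,
    with triangular kernel weights; returns the fitted value at c."""
    mask = (x >= c) if side == "right" else (x < c)
    u = x[mask] - c
    w = np.clip(1 - np.abs(u) / h, 0, None)  # triangular kernel weights
    keep = w > 0
    X = np.column_stack([np.ones(keep.sum()), u[keep]])
    W = np.diag(w[keep])
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[mask][keep])
    return beta[0]  # intercept = value at the cutoff

def fuzzy_rd_effect(x, y, d, c, h):
    """Fuzzy RD effect: jump in the outcome y at the cutoff divided
    by the jump in treatment take-up d (the Wald ratio)."""
    dy = (local_linear_at_cutoff(x, y, c, h, "right")
          - local_linear_at_cutoff(x, y, c, h, "left"))
    dd = (local_linear_at_cutoff(x, d, c, h, "right")
          - local_linear_at_cutoff(x, d, c, h, "left"))
    return dy / dd
```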
The aesthetic content of data visualization is one of the contemporary areas through which data scientists and designers have been able to link data to humans. Even after successful attempts to model data visualization, it was not clear how the aesthetic content contributed as an input to humanizing these models. The goal of the current research is therefore to use the descriptive analytical approach to identify the aesthetic content in data visualization, which the researchers interpret through pragmatic philosophy and Kantian philosophy, and to analyze a sample of data visualization models to reveal their aesthetic entry points and explain how they can be humanized. The two researchers reached several …
The research comprises three axes. The first is the average estimated completion time (in days) of audit work for five supervisory departments in the Federal Office of Financial Supervision; three audit outputs were then chosen at the level of each of the five departments above. Statistical analysis of the data showed that the completion times follow the exponential distribution with parameter (θ) and the normal distribution with two parameters (μ, σ²). Four methods were introduced to estimate the parameter (θ), as well as four methods to estimate the parameters (…
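The four estimation methods themselves are not listed in this excerpt; as one concrete instance, here is a maximum-likelihood sketch for both distributions (the exponential is parameterized by its rate here, which is an assumption):

```python
import numpy as np

def mle_exponential(t):
    """MLE of the exponential rate: theta_hat = 1 / sample mean."""
    return 1.0 / np.mean(t)

def mle_normal(t):
    """MLE of (mu, sigma^2) for the normal distribution."""
    mu = np.mean(t)
    sigma2 = np.mean((t - mu) ** 2)  # MLE divides by n, not n - 1
    return mu, sigma2
```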
The aim of this research is to estimate the parameters of a linear regression model whose errors follow an ARFIMA model, using the wavelet method based on maximum likelihood, as well as approximate generalized least squares and ordinary least squares. The estimators were applied to real data: monthly inflation and dollar exchange rate series obtained from the Central Statistical Organization (CSO) for the period from 1/2005 to 12/2015. The results showed that the wavelet maximum likelihood (WML) estimator was the most reliable and efficient of the estimators, and also that changing the fractional difference parameter (d) does not affect the results.
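The fractional difference parameter (d) enters ARFIMA through the operator (1 - B)^d; the sketch below applies it via its binomial weights. It illustrates only the role of d, not the wavelet maximum likelihood estimator itself:

```python
import numpy as np

def frac_diff(x, d, tol=1e-8):
    """Apply the fractional difference operator (1 - B)^d to a series
    using the binomial expansion pi_0 = 1, pi_k = pi_{k-1}(k - 1 - d)/k."""
    w = [1.0]
    for k in range(1, len(x)):
        w.append(w[-1] * (k - 1 - d) / k)
        if abs(w[-1]) < tol:   # truncate once weights become negligible
            break
    w = np.array(w)
    # y_t = sum_k pi_k * x_{t-k}, using the available history of x
    return np.array([
        np.dot(w[: t + 1][::-1], x[t + 1 - len(w[: t + 1]): t + 1])
        for t in range(len(x))
    ])
```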
Infection with cryptosporidiosis endangers the lives of many people with immunodeficiency, especially HIV patients. Nitazoxanide is one of the main therapeutic drugs used to treat cryptosporidiosis; however, it is poorly soluble in water, which restricts its usefulness and efficacy in immunocompromised patients. Surfactants have an amphiphilic character, which indicates their ability to improve the water solubility of hydrophobic drugs. Our research concerns the synthesis of new cationic Gemini surfactants that can improve the solubility of the drug nitazoxanide. So, we synthesized the cationic Gemini surfactants N1,N1,N3,N3-tetramethyl-N1,N3-bis(2-octadecanamidoethyl)propane-1,3-diaminium bromide (CGSPS18) and 2,2′-(etha…
Secure storage of confidential medical information is critical for healthcare organizations seeking to protect patients' privacy and comply with regulatory requirements. This paper presents a new scheme for the secure storage of medical data using Chaskey cryptography and blockchain technology. The system uses Chaskey encryption to ensure the integrity and confidentiality of medical data, and blockchain technology to provide a scalable and decentralized storage solution. The system also uses Bflow segmentation and vertical segmentation techniques to enhance scalability and manage the stored data. In addition, the system uses smart contracts to enforce access-control policies and other security measures. A detailed description of the system and p…
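A minimal sketch of the hash-chained storage idea: each block carries a MAC over the medical record for integrity and the previous block's hash for tamper evidence. Chaskey is not available in the Python standard library, so HMAC-SHA256 stands in for it below; the field names and structure are illustrative, not the paper's scheme:

```python
import hashlib, hmac, json, time

def make_block(prev_hash: str, record: dict, mac_key: bytes) -> dict:
    """Append-only block: MAC over the record plus a link to the
    previous block's hash."""
    payload = json.dumps(record, sort_keys=True).encode()
    block = {
        "timestamp": time.time(),
        "prev_hash": prev_hash,
        "record": record,
        # Stand-in MAC: HMAC-SHA256 here; the paper uses Chaskey
        "mac": hmac.new(mac_key, payload, hashlib.sha256).hexdigest(),
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify_chain(chain, mac_key):
    """Recompute MACs, block hashes, and hash links to detect tampering."""
    for i, b in enumerate(chain):
        payload = json.dumps(b["record"], sort_keys=True).encode()
        if not hmac.compare_digest(
                b["mac"], hmac.new(mac_key, payload, hashlib.sha256).hexdigest()):
            return False
        body = dict(b)
        expected = body.pop("hash")
        if hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest() != expected:
            return False
        if i > 0 and b["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True
```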
A new flow injection (FI) manifold design coupled with a merging-zone technique was studied for the spectrophotometric determination of sulfamethoxazole. The semi-automated FI method has many advantages: it is fast, simple, highly accurate, and economical, with high throughput. The suggested method is based on the production of an orange-colored compound of SMZ with 1,2-naphthoquinone-4-sulphonic acid sodium salt (NQS) in alkaline NaOH medium at λmax 496 nm. The linearity range for sulfamethoxazole was 3-100 μg·mL-1, the limit of detection (LOD) was 0.593 μg·mL-1, the RSD% was about 1.25, and the recovery was 100.73%. All the physical and chemical parameters affecting the stability and development of …
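As a worked illustration of how a detection limit like the 0.593 μg·mL-1 reported above is typically obtained from a calibration line, here is a sketch using the common 3.3·σ/slope convention (the paper may use a different convention, and the calibration readings below are hypothetical):

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs absorbance at 496 nm
conc = np.array([3, 10, 25, 50, 75, 100], dtype=float)
absorb = np.array([0.045, 0.148, 0.370, 0.742, 1.110, 1.485])

# Least-squares calibration line: A = a + b * C
b, a = np.polyfit(conc, absorb, 1)
resid = absorb - (a + b * conc)
s = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))  # residual standard deviation

lod = 3.3 * s / b  # LOD = 3.3 * sigma / slope
print(f"slope={b:.4f}, intercept={a:.4f}, LOD={lod:.3f} ug/mL")
```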
Abstract
The current research aims to reveal the extent to which all scoring-rubric data for the electronic portfolio conform to the partial credit model according to the number of assumed dimensions. The study sample consisted of (356) female students. The study concluded that the rubric under the one-dimensional assumption is more appropriate than under the multidimensional assumption. The current research recommends preparing unified scoring rubrics for the different methods of performance evaluation in the basic courses. It also suggests the importance of conducting studies examining the suitability of different evaluation methods for models of response theory …
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000), with the mean square error adopted to compare the estimation methods and choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best at representing the maternal mortality data, based on the value of the param…
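A minimal sketch of the Monte Carlo MSE comparison described above, showing only the maximum likelihood side for an ordinary (non-hierarchical) Poisson regression fitted by Newton-Raphson; the hierarchical and Bayesian fits would require MCMC and are omitted, and the true coefficients and design are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_mle(X, y, iters=50):
    """Newton-Raphson ML fit of a Poisson regression with log link."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)                 # score vector
        hess = X.T @ (X * mu[:, None])        # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

def mse_study(n, reps, beta_true=np.array([0.5, -0.3])):
    """Monte Carlo mean square error of the ML estimator at one sample size."""
    err = np.zeros_like(beta_true)
    for _ in range(reps):
        X = np.column_stack([np.ones(n), rng.normal(size=n)])
        y = rng.poisson(np.exp(X @ beta_true))
        err += (poisson_mle(X, y) - beta_true) ** 2
    return err / reps

for n in (30, 60, 120):
    print(n, mse_study(n, reps=1000))
```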
In this study, an unknown force function depending on space in the wave equation is investigated. Numerically, the wave equation is split into two parts: part one is solved using the finite-difference method (FDM), and part two using the method of separation of variables. This continues and modifies the technique for solving the inverse problem in (1, 2): instead of the boundary element method (BEM) used in (1, 2), the finite-difference method (FDM) is applied, with the boundary data playing the role of overdetermination data. The second part of the problem is inverse and ill-posed, since small errors in the extra boundary data cause large errors in the force solution. Zeroth-order Tikhonov regularization with several values of the regularization parameter is employed to decrease these errors.
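To illustrate zeroth-order Tikhonov regularization as used above, here is a sketch on a generic discretized ill-posed linear system (the matrix below is a smoothing-kernel stand-in, not the paper's FDM system); the regularized normal equations give f_lambda = (A^T A + λI)^{-1} A^T b:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Zeroth-order Tikhonov: minimize ||A f - b||^2 + lam * ||f||^2
    via the regularized normal equations."""
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ b)

# Illustrative ill-posed system: a severely smoothing operator plus noisy data
n = 100
x = np.linspace(0, 1, n)
A = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.01) / n
f_true = np.sin(np.pi * x)
b = A @ f_true + 1e-3 * np.random.default_rng(0).normal(size=n)

# Relative error of the regularized solution for several parameter values
for lam in (1e-8, 1e-5, 1e-2):
    f = tikhonov_solve(A, b, lam)
    print(lam, np.linalg.norm(f - f_true) / np.linalg.norm(f_true))
```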