In this paper, a new variable selection method is presented for selecting essential variables from large datasets. The new model is a modified version of the Elastic Net model, and the modified procedure is summarized as an algorithm. It is applied to a leukemia dataset with 3051 variables (genes) and 72 samples; in practice, datasets of this size are difficult to work with directly. The modified model is compared with several standard variable selection methods and shows the best performance, achieving perfect classification. All calculations for this paper were carried out in R using existing packages.
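As a rough illustration of the kind of workflow described (not the paper's modified model, whose details are its own), here is a minimal sketch in R of Elastic Net variable selection with the glmnet package; the dimensions mirror the leukemia dataset, but the data are synthetic:

```r
# Minimal sketch: Elastic Net variable selection via glmnet.
# Synthetic data sized like the 3051-gene / 72-sample leukemia set;
# NOT the paper's modified Elastic Net.
library(glmnet)

set.seed(1)
n <- 72; p <- 3051
x <- matrix(rnorm(n * p), n, p)                 # synthetic expression matrix
y <- rbinom(n, 1, plogis(x[, 1] - x[, 2]))      # synthetic binary class labels

# alpha in (0, 1) mixes the lasso (alpha = 1) and ridge (alpha = 0) penalties
cv_fit <- cv.glmnet(x, y, family = "binomial", alpha = 0.5)

# Genes with nonzero coefficients at the cross-validated lambda are "selected"
b <- as.numeric(coef(cv_fit, s = "lambda.min"))[-1]   # drop the intercept
selected <- which(b != 0)
length(selected)
```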
Electronic properties such as the density of states, energy gap, HOMO (highest occupied molecular orbital) level, LUMO (lowest unoccupied molecular orbital) level, and density of bonds, as well as spectroscopic properties such as infrared (IR) and Raman scattering spectra, force constants, and reduced masses, were investigated for coronene C24, reduced graphene oxide (rGO) C24O5, and the interaction between C24O5 and NO2 gas molecules. Density functional theory (DFT) with the hybrid exchange functional B3LYP and the 6-311G** basis set, through the Gaussian 09W software package, was used for these calculations. GaussView 05 was employed …
The study involved the removal of acidity from free fatty acid via the esterification reaction of oleic acid with ethanol. The reaction was carried out in a batch reactor using commercial 13X zeolite as a catalyst. The effects of temperature (40 to 70 °C) and reaction time (up to 120 minutes) were studied using a 6:1 mole ratio of pure ethanol to oleic acid and 5 wt.% catalyst. The results showed that acidity removal increased with increasing temperature and reaction time; removal rises sharply during the initial reaction period and then changes only slightly afterward. The highest acidity removal, 67%, was recorded at 110 minutes and 70 °C. An apparent homogeneous reversible reaction kinetic model has been proposed …
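To make the kinetic-modelling step concrete, here is a minimal sketch in R of fitting a pseudo-homogeneous reversible first-order model to conversion-time data with nls(); the rate expression and the data points are illustrative assumptions, not the paper's values:

```r
# Minimal sketch: reversible first-order kinetics A <=> B gives
# X(t) = Xe * (1 - exp(-(k1/Xe) * t)), where Xe is the equilibrium
# conversion and k1 the forward rate constant. Illustrative data only.
t_min <- c(10, 20, 40, 60, 80, 100, 120)
X     <- c(0.25, 0.40, 0.55, 0.61, 0.64, 0.66, 0.67)  # fractional acid conversion

fit <- nls(X ~ Xe * (1 - exp(-(k1 / Xe) * t_min)),
           start = list(Xe = 0.7, k1 = 0.02))
summary(fit)   # estimates of Xe and k1
```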
Acute respiratory distress syndrome (ARDS) is triggered by a variety of insults, such as bacterial and viral infections, including SARS-CoV-2, and leads to high mortality. In the murine model of ARDS induced by staphylococcal enterotoxin B (SEB), our previous studies showed that while SEB triggered 100% mortality, treatment with resveratrol (RES) completely prevented such mortality by attenuating inflammation in the lungs. In the current study, we investigated the metabolic profile of SEB-activated immune cells in the lungs following treatment with RES. RES-treated mice had higher expression of miR-100 in lung mononuclear cells (MNCs), which targeted mTOR, leading to its decreased expression. Also, single-cell RNA sequencing (scRNA-seq) …
The aim of this paper is to use a single-index model in developing and adjusting the Fama-MacBeth model. The adjustment relies on a penalized smoothing-spline regression technique (SIMPLS). Two generalized cross-validation techniques, Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV), were used to select the smoothing parameter of this technique. Because of the two-step nature of the Fama-MacBeth model, this estimation generated four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and its implications for excess stock returns and portfolio returns …
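For orientation, here is a minimal sketch in R of the classical two-pass Fama-MacBeth procedure that the paper adjusts; the data are simulated, and plain lm() fits stand in for the paper's SIMPLS spline estimators at each pass:

```r
# Minimal sketch of the classical two-pass Fama-MacBeth procedure.
# The paper replaces each pass with a penalized smoothing-spline
# (single-index) fit; here ordinary lm() is used on simulated returns.
set.seed(1)
T <- 120; N <- 25
f <- rnorm(T)                                   # single factor (e.g. market excess return)
beta_true <- runif(N, 0.5, 1.5)
R <- outer(f, beta_true) + matrix(rnorm(T * N, sd = 0.5), T, N)

# Pass 1: time-series regression per asset -> estimated betas
beta_hat <- apply(R, 2, function(r) coef(lm(r ~ f))[2])

# Pass 2: cross-sectional regression at each date -> risk-premium estimates
gamma <- apply(R, 1, function(r_t) coef(lm(r_t ~ beta_hat))[2])
mean(gamma)            # Fama-MacBeth estimate of the factor risk premium
sd(gamma) / sqrt(T)    # its Fama-MacBeth standard error
```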
The problem of missing data represents a major obstacle for researchers analyzing data, since it recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of this problem in the data under study may negatively influence the analysis and lead to misleading conclusions driven by the large bias it introduces. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy …
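As a small illustration of the bias the abstract refers to, here is a sketch in R showing how a complete-case estimate is distorted when missingness depends on the values themselves; the simulation is an assumption for illustration, not the paper's wavelet setting:

```r
# Tiny illustration: complete-case analysis is biased when data are
# missing not at random (MNAR). Simulated; not from the paper.
set.seed(1)
x <- rnorm(10000, mean = 5, sd = 2)
miss <- runif(10000) < plogis(x - 5)   # larger values more likely to be missing
x_obs <- x[!miss]

mean(x)       # true mean, about 5
mean(x_obs)   # complete-case mean, biased downward
```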
This research studies panel (paired) data models with mixed parameters, which contain two types of parameters: one random and the other fixed. The random parameters arise from differences in the marginal slopes across the cross-sections, while the fixed parameters arise from differences in the fixed intercepts; the random errors of each cross-section exhibit heteroscedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data with small samples, and to achieve this goal, the feasible generalized least squares …
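As a rough illustration of feasible GLS for panel data in R (not the paper's exact estimator), here is a sketch using the plm package's pggls(), which is robust to within-group heteroscedasticity and serial correlation, on the Produc dataset shipped with plm:

```r
# Rough illustration: general feasible GLS for panel data via plm::pggls().
# Robust to intragroup heteroscedasticity and serial correlation;
# not the paper's specific mixed-parameter estimator.
library(plm)
data("Produc", package = "plm")

fgls <- pggls(log(gsp) ~ log(pcap) + log(emp) + unemp,
              data = Produc, index = c("state", "year"),
              model = "within")
summary(fgls)
```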
This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with a proposed weighted standard SIR (WSIR), and principal component analysis (PCA), the standard method for dimension reduction. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear …
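To contrast the two baseline approaches the abstract names, here is a minimal sketch in R of SIR (via the dr package) versus PCA on simulated data; the paper's weighted WSIR variant is its own proposal and is not reproduced here:

```r
# Minimal sketch: sliced inverse regression (dr package) versus PCA.
# SIR is supervised (uses y); PCA ignores y entirely.
library(dr)

set.seed(1)
n <- 200; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- x[, 1] + 0.5 * x[, 2]^2 + rnorm(n, sd = 0.2)  # depends on two directions

sir_fit <- dr(y ~ x, method = "sir", nslices = 8)
sir_fit$evectors[, 1:2]      # leading SIR directions

pca_fit <- prcomp(x, scale. = TRUE)
pca_fit$rotation[, 1:2]      # leading principal components
```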
This research provides theoretical aspects of one of the most important statistical distributions, the Lomax distribution, which has many applications in several areas. A set of estimation methods (MLE, LSE, GWPM) was used and compared with the RRE estimation method. In order to find the best estimation method, a set of 36 simulation experiments, each with many replications, was run to obtain the mean square error used for the comparison. The simulation experiments varied the estimation method, sample size, and the values of the location and shape parameters. The results show that the estimation methods are affected by the simulation factors, and suggest the possibility of using other estimation methods such as shrinkage and jackknife …
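To make the MLE step concrete, here is a minimal sketch in R that draws a Lomax sample by inverse-CDF sampling and maximizes the log-likelihood with optim(); the parameter values are illustrative assumptions, not the paper's simulation design:

```r
# Minimal sketch: maximum likelihood for the Lomax distribution
# f(x) = (a/s) * (1 + x/s)^-(a+1), x > 0, shape a and scale s.
set.seed(1)
a <- 2; s <- 3; n <- 500
u <- runif(n)
x <- s * ((1 - u)^(-1 / a) - 1)          # Lomax quantile function

negll <- function(par) {
  a <- par[1]; s <- par[2]
  if (a <= 0 || s <= 0) return(Inf)      # keep the search in the valid region
  -sum(log(a / s) - (a + 1) * log(1 + x / s))
}
fit <- optim(c(1, 1), negll)
fit$par                                   # MLEs of (a, s)
```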
This research is concerned with the re-analysis of optical data (the imaginary part of the dielectric function as a function of photon energy E) of a-Si:H films prepared by Jackson et al. and Ferlauto et al. Using nonlinear regression fitting, we estimated the optical energy gap and the deviation from the Tauc model, treating the parameter p of the photon-energy dependence of the momentum matrix element as a free parameter and assuming a square-root density-of-states distribution. For films prepared by Jackson et al., it is observed that the value of the parameter p over the fitted photon-energy range is close to the value assumed by the Cody model, and the optical gap energy is also close to the value …
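As a sketch of this kind of fit, here is a nonlinear regression in R with nls() using one common generalization, eps2(E) = C (E - Eg)^2 / E^(2p), where p = 1 recovers the Tauc form and p = 0 the Cody form; this parametrization and the data below are illustrative assumptions, not necessarily the paper's exact model:

```r
# Minimal sketch: fit eps2(E) = C * (E - Eg)^2 / E^(2p) with p free.
# p = 1 ~ Tauc, p = 0 ~ Cody. Synthetic data; illustrative only.
set.seed(1)
E    <- seq(1.8, 3.0, by = 0.05)                        # photon energy (eV)
eps2 <- 5 * (E - 1.6)^2 / E^2 + rnorm(length(E), sd = 0.02)

fit <- nls(eps2 ~ C * (E - Eg)^2 / E^(2 * p),
           start = list(C = 1, Eg = 1.5, p = 0.5))
coef(fit)   # estimated prefactor C, optical gap Eg, and exponent p
```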
Recently, the development and application of hydrological models based on geographic information systems (GIS) has increased around the world. One of the most important applications of GIS is mapping the curve number (CN) of a catchment. In this research, three software tools, ArcView GIS 9.3 with ArcInfo, the Arc Hydro Tool, and the Geospatial Hydrologic Modeling Extension (HEC-GeoHMS) for ArcView GIS 9.3, were used to calculate the CN of the 19210 ha Salt Creek (SC) watershed located in Osage County, Oklahoma, USA. Multiple layers were combined and examined using the Environmental Systems Research Institute (ESRI) ArcMap 2009. These layers are a soil layer (Soil Survey Geographic, SSURGO) and a 30 m × 30 m resolution Digital Elevation Model …