The aim of this paper is to use a single-index model to develop and adjust the Fama-MacBeth procedure. The adjustment is carried out with a penalized smoothing spline regression technique (SIMPLS), and two generalized cross-validation criteria, Generalized Cross-Validation Grid (GGCV) and Generalized Cross-Validation Fast (FGCV), are used to select the smoothing parameter of this technique. Because the Fama-MacBeth model is estimated in two steps, this approach yields four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor, and their implications for excess stock returns and portfolio returns) was estimated on the Iraq Stock Exchange using the modified Fama-MacBeth procedure. Based on the findings, SIMPLS(FGCV)-SIMPLS(GGCV) performed best. The results also reveal that the three Fama-French factors are statistically significant, which enhances the explanatory power of the model for the performance of the Iraq Stock Exchange.
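For orientation, the sketch below shows the classical two-pass Fama-MacBeth structure that the abstract builds on, implemented with plain OLS; in the proposed modification the cross-sectional step would instead use the penalized smoothing spline (SIMPLS) fit with a GCV-selected smoothing parameter, which is not reproduced here. Variable names and the layout are illustrative assumptions, not the paper's code.

```python
import numpy as np

def fama_macbeth(excess_returns, factors):
    """Classical two-pass Fama-MacBeth estimator.

    excess_returns : (T, N) array of asset excess returns
    factors        : (T, K) array of factor realizations
                     (e.g. market, SMB, HML for the Fama-French model)
    Returns the time-series average premia (intercept first) and t-statistics.
    """
    T, N = excess_returns.shape
    K = factors.shape[1]

    # Step 1: time-series regressions -> factor loadings (betas) per asset
    X = np.column_stack([np.ones(T), factors])          # add intercept
    betas = np.empty((N, K))
    for i in range(N):
        coef, *_ = np.linalg.lstsq(X, excess_returns[:, i], rcond=None)
        betas[i] = coef[1:]                              # drop intercept

    # Step 2: cross-sectional regression at every date -> period premia
    Z = np.column_stack([np.ones(N), betas])
    premia = np.empty((T, K + 1))
    for t in range(T):
        premia[t], *_ = np.linalg.lstsq(Z, excess_returns[t], rcond=None)

    # Fama-MacBeth estimates: time-series means and standard errors
    lam = premia.mean(axis=0)
    se = premia.std(axis=0, ddof=1) / np.sqrt(T)
    return lam, lam / se
```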
In this article, we develop a new loss function as a generalization of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation is used to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, the reliability function, and the hazard function.
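For reference, the standard (unweighted) LINEX loss and its Bayes estimator, which the weighted variant described above builds on, take the usual form; the paper's specific weighting scheme is not reproduced here:

$$L(\Delta) = b\left(e^{a\Delta} - a\Delta - 1\right), \qquad \Delta = \hat{\theta} - \theta, \quad a \neq 0,\ b > 0,$$
$$\hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a}\,\ln\!\left(E\!\left[e^{-a\theta}\mid \mathbf{x}\right]\right).$$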
The nuclear structure of the 20,22Ne isotopes has been studied via the shell model with Skyrme-Hartree-Fock calculations. In particular, the transitions to the low-lying positive- and negative-parity excited states have been investigated within three shell-model spaces: sd for the positive-parity states, and the spsdpf large-basis (no-core) and zbme model spaces for the negative-parity states. Excitation energies, reduced transition probabilities, and elastic and inelastic form factors were calculated and compared with the available experimental data. The Skyrme interaction was used to generate a one-body potential in the Hartree-Fock calculations for each selected excited state, which was then used to calculate the single-particle matrix elements. Skyrme interac
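For context, the reduced transition probabilities mentioned in the abstract follow the standard shell-model definition (the textbook convention, not a formula quoted from the paper):

$$B(E\lambda;\, J_i \rightarrow J_f) = \frac{\left|\left\langle J_f \,\middle\|\, \hat{O}(E\lambda) \,\middle\|\, J_i \right\rangle\right|^{2}}{2J_i + 1},$$

where $\hat{O}(E\lambda)$ is the electric multipole operator and the reduced matrix element is built from the single-particle matrix elements weighted by the one-body transition densities.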
Cryptography is a major concern in communication systems. Internet of Everything (IoE) technology is a new trend of smart systems built on various constrained devices. Lightweight cryptographic algorithms mainly address the security concerns of constrained devices and IoE systems. On the other hand, most lightweight algorithms suffer from a trade-off between complexity and performance. Moreover, the strength of a cryptosystem involves both the speed of the algorithm and the complexity of the system against cryptanalysis. A chaotic system is based on nonlinear dynamic equations that are sensitive to initial conditions and produce high randomness, which makes it a good choice for cryptosystems. In this work, we proposed a new five-dimensional chaoti
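The abstract is truncated before the proposed five-dimensional system is specified, so the sketch below only illustrates the general chaos-based encryption idea with a one-dimensional logistic map: a secret initial condition and control parameter drive a chaotic orbit, the orbit is quantized into a keystream, and the keystream is XOR-ed with the data. All names and parameter values are illustrative assumptions, and a 1-D map is far weaker than a multi-dimensional design.

```python
def logistic_keystream(x0, r, n_bytes, burn_in=1000):
    """Minimal chaos-based keystream: iterate the logistic map and
    quantize each orbit point into one byte (illustrative only)."""
    x = x0
    # discard transient iterations so the orbit settles on the attractor
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    stream = bytearray()
    for _ in range(n_bytes):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return bytes(stream)

def xor_cipher(data, key_stream):
    """Encrypt/decrypt by XOR-ing data with the chaotic keystream."""
    return bytes(d ^ k for d, k in zip(data, key_stream))

# usage: the secret key is the initial condition and the map parameter
plaintext = b"constrained-device message"
ks = logistic_keystream(x0=0.6180339887, r=3.99, n_bytes=len(plaintext))
ciphertext = xor_cipher(plaintext, ks)
assert xor_cipher(ciphertext, ks) == plaintext
```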
The aim of this research is to determine the influence of Daniel's model on twenty-first-century skills among fifth-grade scientific students at government morning secondary and preparatory schools for the academic year 2022-2023. Two groups were chosen out of five fifth-grade scientific groups: one representing the experimental group, taught with Daniel's model, and the other the control group, taught with the traditional method. The equivalence of the two research groups was verified on a set of variables. As for the research tool, the researchers developed a scale of twenty-first-century skills, in which they adopted the framework of the Partnership Organizat
True random number generators are essential components for communications to be confidentially secured. In this paper, a new method is proposed to generate random sequences of numbers based on the differences in the arrival times of photons detected within a coincidence window between two single-photon counting modules.
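A minimal sketch of how random bits might be extracted from such coincidence timing data is given below, assuming two arrays of detector timestamps; the actual extraction rule used in the paper is not specified in the abstract, so the least-significant-bit scheme here is only an assumption.

```python
import numpy as np

def bits_from_coincidences(t_a, t_b, window_ns=10.0, resolution_ns=0.1):
    """Illustrative extraction of random bits from photon arrival times.

    t_a, t_b      : sorted arrival-time stamps (ns) from the two
                    single-photon counting modules
    window_ns     : coincidence window width
    resolution_ns : timing resolution used to digitize the delay

    For every detection in module A, find the nearest detection in module B;
    if it falls inside the coincidence window, take the least significant
    bit of the digitized delay as one raw random bit.
    """
    bits = []
    idx = np.searchsorted(t_b, t_a)                 # nearest-neighbour lookup
    for i, t in zip(idx, t_a):
        for j in (i - 1, i):                        # check both neighbours
            if 0 <= j < len(t_b) and abs(t - t_b[j]) <= window_ns:
                delay_ticks = int(round(abs(t - t_b[j]) / resolution_ns))
                bits.append(delay_ticks & 1)        # least significant bit
                break
    return np.array(bits, dtype=np.uint8)
```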
The chief aim of the present investigation is to develop a Leslie-Gower-type three-species food chain model with prey refuge. Intra-specific competition among the predators is considered in the proposed model. Besides the logistic growth rate for the prey species, the Sokol-Howell functional response for predation is chosen for our model formulation. The behaviour of the model system is thoroughly analysed near the biologically significant equilibria. Linear stability analysis of the equilibria is carried out in order to examine the response of the system. The present model system experiences a Hopf bifurcation for suitable choices of the model parameters. Extensive numerical simulation reveals the validity of the proposed model.
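A representative system of this type, written here only to fix ideas, combines logistic prey growth, a refuge fraction $m$, Sokol-Howell responses of the form $ax/(b+x^{2})$, Leslie-Gower predator growth with food-dependent carrying capacity, and quadratic intra-specific competition terms; the exact functional forms and parameters of the paper's model are assumptions here:

$$\begin{aligned}
\frac{dx}{dt} &= r\,x\left(1-\frac{x}{K}\right) - \frac{a_{1}(1-m)\,x\,y}{b_{1} + (1-m)^{2}x^{2}},\\
\frac{dy}{dt} &= s_{1}\,y\left(1-\frac{y}{n_{1} + (1-m)\,x}\right) - \frac{a_{2}\,y\,z}{b_{2} + y^{2}} - c_{1}\,y^{2},\\
\frac{dz}{dt} &= s_{2}\,z\left(1-\frac{z}{n_{2} + y}\right) - c_{2}\,z^{2},
\end{aligned}$$

where $x$ is the prey, $y$ the intermediate predator, $z$ the top predator, and $(1-m)x$ the portion of prey available outside the refuge.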
The aim of this article is to study the dynamical behavior of an eco-epidemiological model. A prey-predator model comprising an infectious disease in the prey species and stage structure in the predator species is suggested and studied. It is presumed that the prey species grows logistically in the absence of the predator and that the predation process follows a Lotka-Volterra functional response. The existence, uniqueness, and boundedness of the solution of the model are investigated. The stability conditions of all equilibrium points are determined. The conditions for persistence of the model are established. The local bifurcation near every equilibrium point is analyzed. The global dynamics of the model are investigated numerically and confronted with the obt
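As a numerical companion to such an analysis, the sketch below integrates a generic eco-epidemiological system with susceptible/infected prey and juvenile/adult predators using bilinear (Lotka-Volterra) predation; the equations and parameter values are illustrative assumptions, not the paper's model.

```python
from scipy.integrate import solve_ivp

# S, I : susceptible and infected prey (logistic growth in S, infection rate beta)
# J, A : juvenile and adult predators (stage structure), bilinear predation by adults
def model(t, u, r=1.0, K=10.0, beta=0.4, a1=0.3, a2=0.5,
          e=0.4, m=0.2, d_j=0.1, d_a=0.1, mu=0.2):
    S, I, J, A = u
    dS = r * S * (1 - (S + I) / K) - beta * S * I - a1 * S * A
    dI = beta * S * I - a2 * I * A - mu * I
    dJ = e * (a1 * S + a2 * I) * A - m * J - d_j * J   # recruits mature at rate m
    dA = m * J - d_a * A
    return [dS, dI, dJ, dA]

sol = solve_ivp(model, (0, 500), [5.0, 1.0, 0.5, 0.5])
print(sol.y[:, -1])   # long-run state, e.g. to check persistence numerically
```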
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method is used to investigate and summarize the posterior distribution of the Bayesian model. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure is also extended to a real dataset, the rock intensity dataset. The actual dataset is collecte
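A standard conjugate Gibbs sampler consistent with the priors named in the abstract (a multivariate normal prior on the regression coefficients and an inverse gamma prior on the error variance) is sketched below for a linear model; the hyperparameter names and the linear-regression setting are assumptions, since the abstract does not spell out the model.

```python
import numpy as np

def gibbs_linear_regression(X, y, n_iter=5000, burn_in=1000,
                            b0=None, B0=None, a0=2.0, d0=1.0, seed=0):
    """Gibbs sampler for y = X beta + eps, eps ~ N(0, sigma^2 I),
    with conjugate priors beta ~ N(b0, B0) and sigma^2 ~ Inv-Gamma(a0, d0).
    Hyperparameter names are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    b0 = np.zeros(k) if b0 is None else b0
    B0 = 100.0 * np.eye(k) if B0 is None else B0
    B0_inv = np.linalg.inv(B0)
    XtX, Xty = X.T @ X, X.T @ y

    beta, sigma2 = np.zeros(k), 1.0
    draws_beta, draws_sigma2 = [], []
    for it in range(n_iter):
        # beta | sigma^2, y  ~  multivariate normal
        V = np.linalg.inv(B0_inv + XtX / sigma2)
        m = V @ (B0_inv @ b0 + Xty / sigma2)
        beta = rng.multivariate_normal(m, V)
        # sigma^2 | beta, y  ~  inverse gamma (sampled via 1 / gamma draw)
        resid = y - X @ beta
        shape = a0 + n / 2.0
        scale = d0 + 0.5 * resid @ resid
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / scale)
        if it >= burn_in:
            draws_beta.append(beta)
            draws_sigma2.append(sigma2)
    return np.array(draws_beta), np.array(draws_sigma2)
```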