Random matrix theory is used to study the chaotic properties of the nuclear energy spectrum of the 24Mg nucleus. The excitation energies (the main object of this study) are obtained by performing shell model calculations with the OXBASH computer code together with the Wildenthal (W) effective interaction in the isospin formalism. The 24Mg nucleus is assumed to have an inert 16O core, with 8 nucleons (4 protons and 4 neutrons) moving in the 1d5/2, 2s1/2 and 1d3/2 orbitals. The spectral fluctuations are studied with two statistical measures: the nearest neighbor level spacing distribution
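As a rough illustration of how such a spacing analysis proceeds (a sketch, not the paper's own code), the snippet below unfolds a level sequence with a polynomial fit to the spectral staircase, computes the normalized nearest-neighbor spacings, and defines the Poisson and Wigner (GOE) reference distributions against which chaotic behavior is usually judged; the unfolding degree and function names are assumptions.

```python
# Hypothetical sketch: nearest-neighbor spacing distribution of a level sequence
# (e.g. excitation energies for fixed J, T), assuming a simple polynomial unfolding.
import numpy as np

def nearest_neighbor_spacings(energies, unfold_degree=5):
    """Return the normalized nearest-neighbor spacings s_i of a level sequence."""
    E = np.sort(np.asarray(energies, dtype=float))
    staircase = np.arange(1, len(E) + 1)              # N(E): number of levels at or below E
    coeffs = np.polyfit(E, staircase, unfold_degree)  # smooth part of the staircase
    unfolded = np.polyval(coeffs, E)
    spacings = np.diff(unfolded)
    return spacings / spacings.mean()                 # mean spacing normalized to 1

def wigner(s):
    """GOE Wigner surmise P(s) = (pi/2) s exp(-pi s^2 / 4), expected for chaotic spectra."""
    return 0.5 * np.pi * s * np.exp(-np.pi * s**2 / 4.0)

def poisson(s):
    """P(s) = exp(-s), expected for regular (non-chaotic) spectra."""
    return np.exp(-s)
```

A histogram of the returned spacings can then be compared against the two reference curves.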
This study addresses the importance of communicative digitization across the various arts, for the sake of continued shopping for, and aesthetic, artistic and intellectual appreciation of, artistic achievements by recipients in their various places of residence in light of the COVID-19 crisis. It also highlights the importance of the plastic arts of the Iraqi painter specifically, and how they express in a contemporary way the environment and lived reality in Iraq during this crisis, with all its implications for that reality in its various aspects and in its negative and positive uses. As for the research procedures, the researcher reviewed the research methodology, represented by the descriptive ana
The aim of this article is to study the dynamical behavior of an eco-epidemiological model. A prey-predator model comprising an infectious disease in the prey species and stage structure in the predator species is suggested and studied. It is presumed that the prey species grows logistically in the absence of the predator and that predation follows a Lotka-Volterra functional response. The existence, uniqueness, and boundedness of the solution of the model are investigated. The stability conditions of all equilibrium points are determined. The conditions for persistence of the model are established. The local bifurcation near every equilibrium point is analyzed. The global dynamics of the model are investigated numerically and confronted with the obt
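Because the abstract does not give the equations, the sketch below integrates a generic system of the kind described (logistic prey growth, an SI infection in the prey, Lotka-Volterra predation, and a juvenile/adult predator stage structure); all symbols and parameter values are illustrative assumptions, not the paper's model.

```python
# Illustrative eco-epidemiological system only; the paper's actual model may differ.
from scipy.integrate import solve_ivp

def model(t, y, r=1.0, K=10.0, beta=0.3, a=0.4, b=0.3, e=0.5, m=0.2, d1=0.1, d2=0.1):
    S, I, Pj, Pa = y  # susceptible prey, infected prey, juvenile predator, adult predator
    dS  = r * S * (1 - (S + I) / K) - beta * S * I - a * S * Pa   # logistic growth, infection, predation
    dI  = beta * S * I - b * I * Pa - d1 * I                      # infection gain, predation, death
    dPj = e * (a * S + b * I) * Pa - m * Pj - d2 * Pj             # recruitment, maturation, death
    dPa = m * Pj - d2 * Pa                                        # maturation in, death
    return [dS, dI, dPj, dPa]

sol = solve_ivp(model, (0.0, 500.0), [5.0, 1.0, 0.5, 0.5])
print(sol.y[:, -1])  # long-run state, a crude numerical check of the dynamics
```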
This paper discusses the reliability R of the (2+1) cascade model with the inverse Weibull distribution. Reliability is found when the strength and stress variables are inverse Weibull random variables with an unknown scale parameter and a known shape parameter. Six estimation methods (Maximum Likelihood, Moment, Least Squares, Weighted Least Squares, Regression and Percentile) are used to estimate the reliability. The six estimation methods are compared in a simulation study carried out with MATLAB 2016, using two statistical criteria, the Mean Square Error and the Mean Absolute Percentage Error, and the Maximum Likelihood method is found to be the best of the six estimators.
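As a hedged illustration of the best-performing method reported above, the sketch below computes the closed-form maximum-likelihood estimate of the inverse Weibull scale parameter when the shape is known, under one common parameterization of the distribution; the parameterization and the simulated example are assumptions, not the paper's data.

```python
# Assumed parameterization: f(x) = beta * alpha**beta * x**(-(beta+1)) * exp(-(alpha/x)**beta), x > 0,
# with scale alpha (unknown) and shape beta (known).
import numpy as np

def inverse_weibull_mle_scale(x, beta):
    """Closed-form MLE of the scale alpha for known shape beta."""
    x = np.asarray(x, dtype=float)
    return (len(x) / np.sum(x ** (-beta))) ** (1.0 / beta)

# Quick check on simulated data (an inverse Weibull variate is alpha over a Weibull variate).
rng = np.random.default_rng(0)
true_alpha, beta = 1.5, 2.0
sample = true_alpha / rng.weibull(beta, size=1000)
print(inverse_weibull_mle_scale(sample, beta))  # should be close to 1.5
```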
Abstract
The research aimed to test the relationship between the size of investment allocations in the agricultural sector in Iraq and their determinants, comparing the Ordinary Least Squares (OLS) method with the Error Correction Model (ECM) approach. Time series data for the period from 1990 to 2021 were utilized. The analysis showed that the estimates obtained using the ECM were more accurate and significant than those obtained using the OLS method. Johansen's test indicated the presence of a long-term equilibrium relationship between the size of investment allocations and their determinants. The results of th
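The sketch below shows, under a hypothetical data file and assumed variable names, how such a comparison might be set up in Python with statsmodels: an OLS fit of allocations on their determinants alongside a Johansen test for a long-run equilibrium relationship among the levels; the actual determinants and software used in the study may differ.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.vector_ar.vecm import coint_johansen

df = pd.read_csv("investment_allocations.csv")          # hypothetical annual data, 1990-2021
y = df["allocations"]                                    # assumed column names
X = sm.add_constant(df[["agri_gdp", "credit", "rainfall"]])

ols_fit = sm.OLS(y, X).fit()                             # baseline OLS estimates
print(ols_fit.summary())

# Johansen test for cointegration (long-run equilibrium) among the level series.
jres = coint_johansen(df[["allocations", "agri_gdp", "credit", "rainfall"]],
                      det_order=0, k_ar_diff=1)
print(jres.lr1)   # trace statistics
print(jres.cvt)   # 90/95/99% critical values
```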
The aim of the research is to measure the efficiency of the companies in the industrial sector listed on the Iraqi Stock Exchange by directing these companies' resources (inputs) towards achieving the greatest possible returns (outputs), or by reducing those resources while maintaining the level of returns, so that these companies achieve efficiency. To achieve the objectives of the research, the Demerjian et al. model was used to measure the efficiency of the companies and the factors influencing it. The researchers reached a number of conclusions, the most important of which is that 66.6% of the companies in the research sample do not possess relatively high efficiency and that the combined factors (the nat
Video steganography has become a popular option for protecting secret data from hacking attempts and common attacks on the internet. However, when whole video frames are used to embed secret data, visual distortion may result. This work hides a sensitive secret image inside the moving objects of a video by separating the objects from the background of each frame and selecting and arranging them according to object size for embedding the secret image. The XOR technique with reversed bits is applied between the secret image bits and the detected moving object bits for embedding. The proposed method provides more security and imperceptibility because the moving objects are used for embedding, so it is difficult to notice the
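The sketch below isolates the XOR-with-reversed-bits embedding idea, assuming the moving-object pixels have already been detected and flattened elsewhere; the function names and the toy secret are hypothetical, not the paper's implementation.

```python
# Hypothetical illustration: bit-reverse each secret byte and XOR it with a cover
# byte taken from the detected moving-object region; extraction reverses both steps.
import numpy as np

def reverse_bits(vals):
    """Reverse the bit order of each 8-bit value in an array."""
    out = np.zeros_like(vals)
    for i in range(8):
        out = (out << 1) | ((vals >> i) & 1)
    return out

def embed(object_pixels, secret_bytes):
    stego = object_pixels.copy()
    stego[:len(secret_bytes)] ^= reverse_bits(secret_bytes)      # XOR with reversed secret bits
    return stego

def extract(stego_pixels, original_pixels, n):
    return reverse_bits(stego_pixels[:n] ^ original_pixels[:n])  # undo XOR, then bit reversal

cover = np.random.randint(0, 256, 1000, dtype=np.uint8)   # flattened moving-object pixels
secret = np.frombuffer(b"hidden image bytes", dtype=np.uint8)
stego = embed(cover, secret)
print(extract(stego, cover, len(secret)).tobytes())        # b'hidden image bytes'
```

Note that extraction as sketched here needs the original object pixels; a blind variant would instead derive the cover values from a shared key or from unmodified bit planes.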
Software-Defined Networking (SDN) has transformed network management by detaching the control plane from the data forwarding plane, resulting in unparalleled flexibility and efficiency in network administration. However, the heterogeneity of traffic in SDN presents challenges in meeting Quality of Service (QoS) demands and efficiently managing network resources. SDN traffic flows are often divided into elephant flows (EFs) and mice flows (MFs). EFs, which are distinguished by their large packet sizes and long durations, account for a small share of total traffic but require a disproportionate amount of network resources, causing congestion and delays for the smaller MFs. MFs, on the other hand, have a short lifetime and are latency-sensitive, but they accou
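As a minimal sketch of the EF/MF distinction, the code below classifies flows by byte-count and duration thresholds such as a controller might apply to collected switch statistics; the thresholds, field names, and data structure are illustrative assumptions, not taken from the paper.

```python
# Toy threshold-based elephant/mice classifier over per-flow statistics.
from dataclasses import dataclass

ELEPHANT_BYTES = 10 * 1024 * 1024   # assumed threshold: 10 MB transferred
ELEPHANT_SECONDS = 10.0             # assumed threshold: 10 s duration

@dataclass
class Flow:
    match: str          # e.g. "10.0.0.1:5001 -> 10.0.0.2:80/tcp"
    byte_count: int
    duration: float     # seconds

def classify(flow: Flow) -> str:
    """Label a flow so EFs and MFs can be scheduled or routed differently."""
    if flow.byte_count >= ELEPHANT_BYTES or flow.duration >= ELEPHANT_SECONDS:
        return "elephant"
    return "mouse"

flows = [Flow("h1 -> h2", 250_000_000, 42.0), Flow("h3 -> h4", 12_000, 0.3)]
for f in flows:
    print(f.match, classify(f))
```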