This study presents a modification of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function). The vector s (the difference between the next solution and the current solution) is updated so that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. In addition, the new modification of the BFGS update (H-version) preserves the symmetry property and the positive definiteness property without any condition.
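For context, the textbook BFGS inverse-Hessian (H-version) update that this study modifies can be sketched as follows. This is a minimal NumPy sketch of the standard update, not the paper's determinant-preserving variant; it illustrates the symmetry and secant properties the abstract refers to.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One textbook BFGS update of the inverse-Hessian approximation H.

    s: step (next solution minus current solution), y: gradient difference.
    Assumes the curvature condition y^T s > 0, which keeps the updated
    matrix symmetric positive definite.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H' = V H V^T + rho s s^T  (satisfies the secant condition H' y = s)
    return V @ H @ V.T + rho * np.outer(s, s)
```

The update preserves symmetry by construction, which is the baseline property the modified scheme also maintains.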
Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under a low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMFs) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of QMFs. The proposed system can estimate the number of sources under low SNR.
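As a point of reference, the classical eigenvalue-based AIC criterion for source-number estimation (the Wax–Kailath form, assumed here as the baseline; the paper's QMF filter-bank front end is not reproduced) can be sketched as:

```python
import numpy as np

def aic_num_sources(eigvals, n_snapshots):
    """Estimate the number of sources from the eigenvalues of the sample
    covariance matrix using the textbook AIC criterion.

    AIC(k) = -2 N (p-k) log( GM(tail) / AM(tail) ) + 2 k (2p - k),
    where tail = the p-k smallest eigenvalues; the estimate is argmin_k.
    """
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    p = len(lam)
    aic = []
    for k in range(p):
        tail = lam[k:]
        gm = np.exp(np.mean(np.log(tail)))   # geometric mean of noise eigenvalues
        am = np.mean(tail)                   # arithmetic mean of noise eigenvalues
        aic.append(-2 * n_snapshots * (p - k) * np.log(gm / am) + 2 * k * (2 * p - k))
    return int(np.argmin(aic))
```

With two dominant eigenvalues above a flat noise floor, the criterion selects k = 2.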
Cipher security has become an essential step when transmitting important information through networks. The algorithms of cryptography play major roles in providing security and avoiding hacker attacks. In this work, two hybrid cryptosystems are proposed that combine a modification of the symmetric Playfair cipher, called the modified Playfair cipher, with two modifications of the asymmetric RSA cryptosystem, called the square RSA technique and the square RSA with Chinese remainder theorem technique. The proposed hybrid cryptosystems have two layers of encryption and decryption. In the first layer, the plaintext is encrypted using the modified Playfair cipher to obtain the ciphertext; this ciphertext will then be encrypted using squared
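For reference, the textbook RSA baseline that the "square RSA" and CRT variants build on can be sketched as follows. The primes and parameters below are the standard small-number illustration, not a secure choice and not the paper's modified scheme.

```python
# Minimal textbook RSA sketch (illustration only; real keys use large primes).
p, q = 61, 53                       # small demo primes
n = p * q                           # modulus n = 3233
e = 17                              # public exponent, coprime with (p-1)(q-1)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent via modular inverse

def rsa_encrypt(m):
    """Encrypt integer message m < n with the public key (n, e)."""
    return pow(m, e, n)

def rsa_decrypt(c):
    """Decrypt ciphertext c with the private key (n, d)."""
    return pow(c, d, n)
```

In the hybrid design described above, the output of the (modified) Playfair layer would be fed as the message into the (modified) RSA layer.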
The aim of this research is to construct a cognitive-behavioral program based on Meichenbaum's theory for reducing emotional sensitivity among intermediate-school students. To achieve the aims of the research, two hypotheses were formulated, and an experimental design with equivalent groups was chosen. The research population and its sample were determined. The test of negative emotional sensitivity, constructed by the researcher, was adopted. The test contains (20) items whose validity and reliability were established by presenting it to a group of arbitrators and experts in education and psychology. An educational program was constructed based on Meichenbaum's theory. The test was applied to a sample of (60) second i
This research deals with financial reporting for the impairment of non-current assets from the viewpoint of international accounting standards, especially IAS 36, "Impairment of Assets". The research problem focuses on non-compliance with the requirements of IAS 36, which would negatively affect the quality of accounting information and its characteristics, especially the relevance of accounting information; this confirms the necessity of such information having the three sub-characteristics in order to be useful for the decisions of users represented
Understanding the effects of fear, quadratic fixed-effort harvesting, and predator-dependent refuge is an essential topic in ecology. Accordingly, a modified Leslie–Gower prey–predator model incorporating these biological factors is mathematically formulated using the Beddington–DeAngelis type of functional response to describe the predation process. The model's qualitative features are investigated, including the local stability of equilibria, permanence, and global stability. Bifurcation analysis is carried out on the temporal model to identify local bifurcations such as transcritical, saddle-node, and Hopf bifurcations. A comprehensive numerical inquiry is carried out using MATLAB to verify the obtained theoretical findings and und
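The structure of such a model can be illustrated with a short time-stepping sketch. All parameter values and the exact term layout below are illustrative assumptions for a generic modified Leslie–Gower system with a Beddington–DeAngelis functional response, not the paper's calibrated model (which also includes fear, harvesting, and refuge terms).

```python
def simulate(x0, y0, dt=0.01, steps=5000):
    """Forward-Euler sketch of a modified Leslie-Gower prey-predator model
    with a Beddington-DeAngelis functional response.  Illustrative only."""
    r1, r2 = 1.0, 0.5        # intrinsic growth rates of prey and predator
    K = 10.0                 # prey carrying capacity
    a, b, c = 1.0, 0.2, 0.3  # Beddington-DeAngelis constants
    k = 1.0                  # environmental protection (Leslie-Gower) constant
    x, y = x0, y0
    for _ in range(steps):
        fr = x / (a + b * x + c * y)       # B-D functional response
        dx = r1 * x * (1 - x / K) - fr * y
        dy = r2 * y * (1 - y / (x + k))    # Leslie-Gower predator growth
        x, y = x + dt * dx, y + dt * dy
    return x, y
```

The qualitative analyses mentioned in the abstract (stability, permanence, bifurcations) concern the continuous-time counterpart of this discretized sketch.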
This paper concerns the estimation of the unknown parameters of the generalized Rayleigh distribution model based on singly Type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates for all unknown parameters via an iterative procedure, the Newton–Raphson method; confidence interval estimates are then derived based on the Fisher information matrix. Finally, whether the current model (GRD) fits a set of real data is tested, and the survival function and hazard function are computed for these real data.
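The Newton–Raphson step can be illustrated on a reduced version of the problem: estimating the shape parameter of the generalized Rayleigh distribution with the scale parameter known and no censoring. This one-parameter sketch is an assumption for illustration, not the paper's full two-parameter censored-sample scheme.

```python
import math

def grd_alpha_mle(data, lam, alpha0=1.0, tol=1e-10):
    """Newton-Raphson MLE for the shape parameter alpha of the generalized
    Rayleigh distribution with known scale lam (complete sample).

    pdf: f(x) = 2*alpha*lam**2 * x * exp(-(lam*x)**2)
                * (1 - exp(-(lam*x)**2))**(alpha - 1)
    """
    n = len(data)
    # Sufficient statistic: sum of log(1 - exp(-(lam*x)^2))
    S = sum(math.log(1.0 - math.exp(-(lam * x) ** 2)) for x in data)
    a = alpha0
    for _ in range(100):
        score = n / a + S        # d(log-likelihood)/d(alpha)
        hess = -n / a ** 2       # second derivative
        step = score / hess
        a -= step                # Newton-Raphson update
        if abs(step) < tol:
            break
    return a
```

For this reduced case the iteration converges to the closed-form root alpha = -n / S, which makes the sketch easy to check.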
In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction has been studied for the reaction at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental value; there is little agreement only at the high end of the energy range with the experimental cross section. The theoretical cross section that depends on the one-component Williams formula and the one-component formula corrected for spi
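The uncorrected one-component Ericson partial level density referred to above has the textbook form, sketched here as an assumption (the Williams and spin corrections applied in the paper are not included):

```python
from math import factorial

def ericson_pld(p, h, E, g):
    """One-component Ericson partial level density (no Williams or spin
    corrections), textbook form:

        omega(p, h, E) = g**n * E**(n-1) / (p! * h! * (n-1)!),  n = p + h

    p: particles, h: holes, E: excitation energy, g: single-particle
    state density.
    """
    n = p + h
    return g ** n * E ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))
```

For the simplest configuration (p=1, h=0) the expression reduces to g, independent of E, which is a convenient sanity check.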
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. On the other hand, video manipulation is more accessible to detect than image manipulation, and many state-of-the-art systems address it; moreover, the detection of video manipulation depends entirely on detection through images. Many have worked on DeepFake detection in images, but their methods involve complex mathematical calculations in the preprocessing steps and many limitations, including that the face must be frontal, the eyes have to be open, and the mouth should be open with the teeth showing, etc. Also, the accuracy of their counterfeit detectio