Secure image transmission is one of the most difficult problems in the history of communication technology. Millions of people use and share photos on the internet for both private and business purposes. One way to achieve safe image transfer over a network is to use encryption methods that transform the original image into an unintelligible, scrambled version. Cryptographic approaches based on chaotic logistic theory offer several new and promising options for developing secure image encryption methods. The main aim of this paper is to build a secure system for encrypting gray and color images. The proposed system consists of two stages: the first stage is the encryption process, in which the keys are generated from the chaotic logistic map together with the image density to encrypt gray and color images; the second stage is decryption, the reverse of the encryption process, which recovers the original image. The proposed method has been tested on two publicly available standard gray and color images. The test results indicate that the highest values of peak signal-to-noise ratio (PSNR), unified average changing intensity (UACI), and number of pixel change rate (NPCR) are 7.7268, 50.2011, and 100, respectively, while encryption and decryption take up to 0.6319 and 0.5305 seconds, respectively.
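As a minimal sketch of the chaotic-logistic idea (not the paper's exact scheme, whose key derivation also incorporates the image density), a keystream can be generated by iterating the logistic map and XORing it with the pixel bytes. The function names and the parameters x0 and r below are illustrative assumptions:

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Iterate x_{k+1} = r * x_k * (1 - x_k) and quantize each state to a byte."""
    x = x0
    ks = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)        # chaotic logistic map (r near 4)
        ks[i] = int(x * 256) % 256   # map a state in (0, 1) to 0..255
    return ks

def encrypt(img, x0=0.3141, r=3.99):
    """XOR every pixel byte with the logistic keystream; works for gray or color arrays."""
    flat = img.ravel()
    ks = logistic_keystream(x0, r, flat.size)
    return (flat ^ ks).reshape(img.shape)

decrypt = encrypt  # an XOR stream cipher is its own inverse
```

Decrypting an encrypted image with the same (x0, r) key recovers the original exactly, mirroring the two-stage encrypt/decrypt design described in the abstract.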
The current research aimed to analyze the importance, correlation, and effect of independent variables, represented by marketing variables, on the dependent variable, represented by the local brand, taking ENIEM as a model for this study since it represents a sensitive sector for the Algerian consumer. The results of the study showed that the Algerian consumer has a positive image of the ENIEM brand given the marketing variables, which have acquired considerable importance for this consumer. The results also showed a statistically significant correlation between the marketing variables and a good perception of the ENIEM brand, as well as the existence of a statistically significant effect for each of these variables o
This study compared in vitro the microleakage of a new low-shrink silorane-based posterior composite (Filtek™ P90) and two methacrylate-based composites: a packable posterior composite (Filtek™ P60) and a nanofill composite (Filtek™ Supreme XT), using a dye penetration test. Thirty sound human upper premolars were used in this study. Standardized Class V cavities were prepared on the buccal surface of each tooth. The teeth were then divided into three groups of ten teeth each: Group 1, restored with Filtek™ P90; Group 2, restored with Filtek™ P60; and Group 3, restored with Filtek™ Supreme XT. Each composite system was used according to the manufacturer's instructions with its corresponding adhesive system. The teeth were th
With the increasing integration of computers and smartphones into our daily lives, and the numerous benefits they offer over traditional paper-based methods of conducting affairs, it has become necessary to bring one of the most essential institutions into this integration, namely colleges. The traditional approach to conducting affairs in colleges is mostly paper-based, which increases time and workload and is relatively decentralized. This project provides educational and management services for the university environment, targeting the staff, the student body, and the lecturers, on two of the most widely used platforms: smartphones and reliable web applications by clo
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents a thorough search of the neighborhood of a solution for more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
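A minimal sketch of the standard firefly algorithm on a continuous test function illustrates the attraction-based population search the abstract refers to; in a clustering setting, each firefly's position would instead encode a set of cluster centroids. The function names, bounds, and parameter values here are assumptions, not the paper's configuration:

```python
import numpy as np

def firefly(f, dim=2, n=15, iters=100, alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    """Minimize f: brighter fireflies (lower f) attract dimmer ones."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (n, dim))       # random initial population
    I = np.array([f(x) for x in X])            # light intensity = objective value
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if I[j] < I[i]:                # j is brighter: move i toward j
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # attraction decays with distance
                    X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
                    I[i] = f(X[i])
        alpha *= 0.97                          # cool the random step to aid convergence
    best = np.argmin(I)
    return X[best], I[best]
```

Without an added neighborhood (local) search phase, this plain form exhibits exactly the exploitation weakness the abstract criticizes: late-stage moves are dominated by the decaying random term rather than a directed refinement of the best region.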
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of the traditional approaches is that they are strictly constrained (especially for Arabic) by the accuracy of sentence feature functions, weighting schemes, and similarity calculations. On the other hand, meta-heuristic search approaches have a feature tha
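One common instance of the "sentence feature functions" the abstract mentions is frequency-based scoring: rank sentences by the average corpus frequency of their words and keep the top k in original order. The sketch below is a generic illustration with assumed names; a real Arabic system would additionally need morphological normalization and stemming, which this toy regex tokenizer does not provide:

```python
import re
from collections import Counter

def summarize(text, k=2):
    """Extractive summary: keep the k sentences with the highest average word frequency."""
    # naive sentence split on terminal punctuation followed by whitespace
    sents = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    freq = Counter(re.findall(r'\w+', text.lower()))   # document-wide word frequencies

    def score(s):
        toks = re.findall(r'\w+', s.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)

    top = sorted(sents, key=score, reverse=True)[:k]
    return [s for s in sents if s in top]              # restore original order
```

A meta-heuristic approach would instead search over subsets of sentences, optimizing a fitness that balances such feature scores against redundancy, rather than committing to one fixed weighting scheme.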
A multidimensional systolic-array realization of the LMS algorithm is designed by mapping a regular algorithm onto a processor array. It is based on an appropriately selected 1-D systolic array filter that relies on the inner-product-sum systolic implementation. Various arrays may be derived that exhibit a regular arrangement of the cells (processors) and a local interconnection pattern, which are important for VLSI implementation. The design reduces latency and increases the throughput rate in comparison to classical 1-D systolic arrays. The 3-D multilayered array consists of 2-D layers that are connected to each other only by edges. Such arrays for an LMS-based adaptive (FIR) filter may meet the fundamental requirements of fa
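For reference, this is the sequential LMS recursion that such a systolic array parallelizes; the inner product y[n] = w . x corresponds to the inner-product-sum cells. The function name and parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def lms(x, d, taps=4, mu=0.05):
    """Sequential LMS: adapt FIR weights w so that w . x_window tracks the desired signal d."""
    w = np.zeros(taps)
    y = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        xw = x[n - taps + 1:n + 1][::-1]  # input window, most recent sample first
        y[n] = w @ xw                     # filter output: the inner-product-sum step
        e = d[n] - y[n]                   # instantaneous error
        w += 2 * mu * e * xw              # stochastic-gradient weight update
    return w, y
```

As a usage example, feeding x through a known 4-tap FIR channel to produce d and running lms(x, d) drives the adapted weights toward the channel coefficients; the systolic formulation distributes exactly these multiply-accumulate and update steps across the processor cells.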
Signal denoising is directly related to sample estimation of received signals, either by estimating the equation parameters of the target reflections or those of the surrounding noise and clutter accompanying the data of interest. Radar signals recorded with analogue or digital devices are not immune to noise. Random (white) noise with no coherency is mainly produced in the form of random electrons and is caused by heat, the environment, and stray circuitry losses. These factors influence the output signal voltage, thus creating detectable noise. Differential Evolution (DE) is an effective, efficient, and robust optimisation method used to solve various problems in the engineering and scientific domains, such as signal processing. This paper looks
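A minimal DE/rand/1/bin sketch, applied to a toy version of the parameter-estimation task described above (fitting the amplitude and frequency of a noisy sinusoid by least squares), illustrates how DE searches for signal-model parameters. The function names, bounds, and control parameters (F, CR, population size) are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def de(f, bounds, pop_size=20, F=0.7, CR=0.9, iters=200, seed=1):
    """Minimize f over box bounds with DE/rand/1/bin."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, (pop_size, len(bounds)))
    fit = np.array([f(p) for p in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # pick three distinct members other than i for the mutation
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(len(bounds)) < CR
            cross[rng.integers(len(bounds))] = True   # guarantee one mutant gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:                          # greedy selection
                pop[i], fit[i] = trial, ft
    best = np.argmin(fit)
    return pop[best], fit[best]
```

With a sum-of-squared-error objective between the model p0 * sin(2*pi * p1 * t) and a noisy recording, DE recovers amplitude and frequency close to the true values, which is the sense in which evolutionary parameter estimation supports denoising.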