The expanding use of multi-processor supercomputers has had a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient code across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms, parallel implementations of several sorting methods using the MPICH.NT.1.2.3 library in C++, and comparisons between the parallel and sequential implementations are presented. These methods are then applied in the image processing field: a median filter has been built on top of the presented algorithms. As a parallel platform was unavailable, the running time is estimated in terms of the number of computation steps and communication steps.
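As a generic illustration of the kind of MPI-based parallel sort this abstract refers to, the sketch below shows odd-even transposition sort in C++ with MPI, assuming one key per process for brevity. It is not the authors' implementation; the data size and random initialization are assumptions.

```cpp
// Minimal sketch: odd-even transposition sort with MPI (one value per process).
// Illustrative only; not the paper's implementation.
#include <mpi.h>
#include <algorithm>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    std::srand(rank * 7919 + 1);
    int value = std::rand() % 1000;          // each process holds one key (assumed)

    for (int phase = 0; phase < size; ++phase) {
        // Even phases pair (0,1),(2,3),...; odd phases pair (1,2),(3,4),...
        int partner = (phase % 2 == rank % 2) ? rank + 1 : rank - 1;
        if (partner < 0 || partner >= size) continue;

        int received;
        MPI_Sendrecv(&value, 1, MPI_INT, partner, 0,
                     &received, 1, MPI_INT, partner, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        // Lower-ranked process keeps the smaller key, higher-ranked the larger.
        if (rank < partner) value = std::min(value, received);
        else                value = std::max(value, received);
    }

    std::printf("rank %d -> %d\n", rank, value);
    MPI_Finalize();
    return 0;
}
```

In a step-count model of the kind the abstract mentions, each of the `size` phases costs one comparison step and one exchange (communication) step per process.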
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: the approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key from which the remaining 15 keys are derived, and this complexity raises the level of the ciphering process. Moreover, the shift operation moves only one bit to the right. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with …
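A minimal sketch of the key-derivation idea described above: a 128-bit root key formed from a 64-bit DES-derived half and a 64-bit AES-derived half, with 15 further keys produced by rotating the root one bit to the right each time. The concatenation order and the use of a plain rotation are assumptions for illustration, not details taken from the paper.

```cpp
// Sketch: derive 15 keys from a 128-bit root key by repeated 1-bit right rotation.
// The root key is assumed to be DES-half || AES-half; illustration only.
#include <array>
#include <cstdint>
#include <cstdio>

struct Key128 { std::uint64_t hi, lo; };   // hi = assumed DES half, lo = assumed AES half

// Rotate the 128-bit value right by one bit.
Key128 rotr1(Key128 k) {
    std::uint64_t new_hi = (k.hi >> 1) | (k.lo << 63);
    std::uint64_t new_lo = (k.lo >> 1) | (k.hi << 63);
    return {new_hi, new_lo};
}

std::array<Key128, 16> deriveKeys(std::uint64_t desHalf, std::uint64_t aesHalf) {
    std::array<Key128, 16> keys{};
    keys[0] = {desHalf, aesHalf};          // 128-bit root key
    for (int i = 1; i < 16; ++i)           // 15 derived keys
        keys[i] = rotr1(keys[i - 1]);
    return keys;
}

int main() {
    auto keys = deriveKeys(0x0123456789ABCDEFULL, 0xFEDCBA9876543210ULL);
    for (int i = 0; i < 16; ++i)
        std::printf("K%-2d = %016llX%016llX\n", i,
                    (unsigned long long)keys[i].hi, (unsigned long long)keys[i].lo);
}
```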
Administrative procedures in various organizations produce numerous crucial records and data. These records and data are also used in other processes, such as customer relationship management and accounting operations. It is incredibly challenging to use and extract valuable, meaningful information from these data and records because they are frequently enormous and continuously growing in size and complexity. Data mining is the act of sorting through large data sets to find patterns and relationships that might aid in resolving business issues through data analysis. Using data mining techniques, enterprises can forecast future trends and make better business decisions. The Apriori algorithm has been …
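Since the excerpt is cut off at its mention of the Apriori algorithm, the following is only a generic, minimal sketch of the Apriori idea (level-wise candidate generation followed by support counting), not the system the paper describes. The toy transactions and minimum-support threshold are assumptions.

```cpp
// Minimal Apriori sketch: frequent itemsets via level-wise candidate generation.
// Generic illustration; not the paper's implementation.
#include <algorithm>
#include <set>
#include <vector>
#include <cstdio>

using Itemset = std::set<int>;

int support(const Itemset& c, const std::vector<Itemset>& db) {
    int count = 0;
    for (const auto& t : db)
        if (std::includes(t.begin(), t.end(), c.begin(), c.end())) ++count;
    return count;
}

int main() {
    // Toy transaction database (item ids are arbitrary).
    std::vector<Itemset> db = {{1,2,3}, {1,2}, {2,3}, {1,3}, {1,2,3}};
    const int minSup = 3;                      // assumed minimum support

    // Level 1: frequent single items.
    std::set<Itemset> frequent;
    std::set<int> items;
    for (const auto& t : db) items.insert(t.begin(), t.end());
    for (int i : items)
        if (support({i}, db) >= minSup) frequent.insert({i});

    // Level k: join frequent (k-1)-itemsets, then prune by support.
    std::set<Itemset> level = frequent;
    while (!level.empty()) {
        std::set<Itemset> next;
        for (const auto& a : level)
            for (const auto& b : level) {
                Itemset cand = a;
                cand.insert(b.begin(), b.end());
                if (cand.size() == a.size() + 1 && support(cand, db) >= minSup)
                    next.insert(cand);
            }
        frequent.insert(next.begin(), next.end());
        level = next;
    }

    for (const auto& s : frequent) {
        std::printf("{ ");
        for (int i : s) std::printf("%d ", i);
        std::printf("} support=%d\n", support(s, db));
    }
}
```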
The city of Karbala is one of the most important holy places for visitors and pilgrims of the Islamic faith, especially during the Arba'in visit, when crowds of millions gather to commemorate the martyrdom of Imam Hussein. Offering services and medical treatment during this time is very important, especially when the crowds head toward their destination, the holy shrine of Imam Hussein (a.s). In recent years, the Arba'in visit has witnessed an obvious growth in the number of participants. The biggest challenges are the health risks and the preventive measures required of both organizers and visitors. Researchers have identified various challenges and factors in facilitating the Arba'in visit. The purpose of this research is to deal with the religious and …
The aim of this work is to create a power control system for wind turbines based on fuzzy logic. Three power control loops were considered: changing the pitch angle of the blades, changing the length of the blades, and turning the nacelle. A stochastic law was assumed for changes in wind conditions and for their instantaneous, inexact assessment. Two different algorithms were used for fuzzy inference in the control loops, the Mamdani and Larsen algorithms. Both algorithms are implemented and developed in this study in the MATLAB Fuzzy Logic Toolbox, which has been practically applied to the necessary intelligent control systems in electrical engineering …
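As a generic illustration of Mamdani-style inference (not the controllers built in MATLAB for this work), the sketch below evaluates a single-input, single-output rule base for a hypothetical blade-pitch loop: triangular membership functions, min implication, max aggregation, and centroid defuzzification by sampling. All membership ranges and rules are invented for the example.

```cpp
// Minimal Mamdani fuzzy inference sketch for a hypothetical pitch-angle loop.
// Membership functions and rules are illustrative assumptions, not the paper's design.
#include <algorithm>
#include <cstdio>

// Triangular membership function defined by (a, b, c).
double tri(double x, double a, double b, double c) {
    if (x <= a || x >= c) return 0.0;
    return (x < b) ? (x - a) / (b - a) : (c - x) / (c - b);
}

int main() {
    double windError = 2.5;   // m/s above rated wind speed (example input)

    // Rule firing strengths: IF error is <term> THEN pitch is <term>.
    double low  = tri(windError, -1.0, 0.0, 3.0);   // error low  -> pitch small
    double high = tri(windError,  1.0, 4.0, 8.0);   // error high -> pitch large

    // Output terms for pitch angle (degrees).
    auto pitchSmall = [](double p) { return tri(p, 0.0,  5.0, 12.0); };
    auto pitchLarge = [](double p) { return tri(p, 8.0, 18.0, 25.0); };

    // Aggregate clipped output sets and defuzzify by sampled centroid.
    double num = 0.0, den = 0.0;
    for (double p = 0.0; p <= 25.0; p += 0.1) {
        double mu = std::max(std::min(low,  pitchSmall(p)),
                             std::min(high, pitchLarge(p)));  // Mamdani: min then max
        num += mu * p;
        den += mu;
    }
    double pitch = (den > 0.0) ? num / den : 0.0;
    std::printf("wind error %.1f m/s -> pitch command %.1f deg\n", windError, pitch);
}
```

Replacing the `std::min` implication with a product would give the Larsen variant of the inference.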
By definition, the detection of protein complexes that form protein-protein interaction networks (PPINs) is an NP-hard problem. Evolutionary algorithms (EAs), as global search methods, are proven in the literature to be more successful than greedy methods at detecting protein complexes. However, the design of most of these EA-based approaches relies on the topological information of the proteins in the PPIN. Biological information, as a key resource for molecular profiles, has on the other hand received little interest in the design of the components of these EA-based methods. The main aim of this paper is to redesign two operators in the EA based on the functional domain rather than the graph topological domain. The perturbation …
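The excerpt does not spell out the redesigned operators, so the following is only a hypothetical sketch of what a function-aware (rather than topology-aware) mutation operator might look like: a protein is added to a candidate complex with a probability proportional to its functional similarity to the complex, measured here with a Jaccard index over annotation sets. All data structures and thresholds are invented for illustration and are not the paper's operators.

```cpp
// Hypothetical sketch of a functional-domain mutation operator for an EA that
// grows candidate protein complexes; not the operators proposed in the paper.
#include <algorithm>
#include <random>
#include <set>
#include <unordered_map>
#include <vector>
#include <cstdio>

using Annotations = std::set<int>;      // functional annotation ids (assumed)
using Complex     = std::vector<int>;   // protein ids in a candidate complex

// Jaccard similarity between two annotation sets.
double jaccard(const Annotations& a, const Annotations& b) {
    std::vector<int> inter;
    std::set_intersection(a.begin(), a.end(), b.begin(), b.end(), std::back_inserter(inter));
    double uni = a.size() + b.size() - inter.size();
    return uni > 0 ? inter.size() / uni : 0.0;
}

// Mutation: try to add one outside protein, biased by functional similarity to the complex.
void mutate(Complex& cplx,
            const std::vector<int>& allProteins,
            const std::unordered_map<int, Annotations>& anno,
            std::mt19937& rng) {
    std::uniform_real_distribution<double> coin(0.0, 1.0);
    for (int p : allProteins) {
        if (std::find(cplx.begin(), cplx.end(), p) != cplx.end()) continue;
        double sim = 0.0;                                   // mean similarity to members
        for (int q : cplx) sim += jaccard(anno.at(p), anno.at(q));
        sim /= cplx.size();
        if (coin(rng) < sim) { cplx.push_back(p); return; } // accept with prob = similarity
    }
}

int main() {
    std::unordered_map<int, Annotations> anno = {
        {1, {10, 11, 12}}, {2, {10, 11}}, {3, {30, 31}}, {4, {10, 12, 13}}};
    Complex cplx = {1, 2};
    std::vector<int> proteins = {1, 2, 3, 4};
    std::mt19937 rng(42);
    mutate(cplx, proteins, anno, rng);
    for (int p : cplx) std::printf("%d ", p);
    std::printf("\n");
}
```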
By optimizing the efficiency of a modular simulation model of the PV module structure with a genetic algorithm, under several weather conditions, as part of identifying the ideal design of a Near Zero Energy Household (NZEH), an ideal life cycle cost can be achieved. The optimum design is chosen from combinations of NZEH design variables, namely construction positioning, window-to-wall ratio, and glazing category, which help maximize the energy produced by the photovoltaic panels. Comprehensive simulation and modeling techniques are utilized for the solar module I-V and P-V output characteristics, both of which are built on the well-known five-parameter model. In addition, the efficiency of the PV panel is established by the genetic algorithm.
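For reference, the well-known five-parameter (single-diode) model mentioned above relates module current and voltage through I = Iph - I0*(exp((V + I*Rs)/a) - 1) - (V + I*Rs)/Rsh, where a = n*Ns*Vt is the modified ideality factor. The sketch below solves this implicit equation by damped fixed-point iteration for an assumed set of parameters; the numbers are illustrative, not taken from the paper.

```cpp
// Five-parameter single-diode PV model: solve I(V) by damped fixed-point iteration.
// Parameter values are illustrative assumptions, not the modules used in the paper.
#include <cmath>
#include <cstdio>

int main() {
    // Assumed five parameters of the single-diode model.
    const double Iph = 8.20;     // photocurrent [A]
    const double I0  = 2.0e-9;   // diode saturation current [A]
    const double Rs  = 0.35;     // series resistance [ohm]
    const double Rsh = 300.0;    // shunt resistance [ohm]
    const double a   = 1.90;     // modified ideality factor n*Ns*Vt [V]

    std::printf("  V [V]    I [A]    P [W]\n");
    for (double V = 0.0; V <= 38.0; V += 2.0) {
        double I = Iph;                              // initial guess
        for (int it = 0; it < 200; ++it) {           // fixed-point iteration
            double Inew = Iph - I0 * (std::exp((V + I * Rs) / a) - 1.0)
                              - (V + I * Rs) / Rsh;
            I = 0.5 * I + 0.5 * Inew;                // damping for stability
            if (I < 0.0) { I = 0.0; break; }
        }
        std::printf("%7.1f  %7.3f  %7.2f\n", V, I, V * I);
    }
}
```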
Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors, which are mainly behavioral and metabolic in nature. Previous works therefore suggest that a recurrent stroke prediction model could help minimize the possibility of a recurrent stroke. Previous works have shown promising results in predicting first-time stroke cases with machine learning approaches. However, there are limited works on recurrent stroke prediction using machine learning methods. Hence, this work is proposed to perform an empirical analysis and to investigate machine learning algorithms …
Eco-friendly concrete is produced using the waste of many industries. It reduces concerns about the energy consumption, raw materials, and mass-production cost of common concrete. Several stress-strain models documented in the literature can be utilized to estimate the ultimate strength of concrete components reinforced with fibers. Unfortunately, there is a lack of data on how non-metallic fibers, such as polypropylene (PP), affect the properties of concrete, especially eco-friendly concrete. This study presents a novel approach to modeling the stress-strain behavior of eco-friendly polypropylene fiber-reinforced concrete (PFRC) using meta-heuristic particle swarm optimization (PSO) applied to 26 different PFRC mixtures. The cement was partially …
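As a generic illustration of the particle swarm optimization mentioned above (the actual stress-strain model and objective function are not given in the excerpt), here is a minimal PSO loop in C++ that fits two parameters of a hypothetical stress-strain curve to sample data by minimizing squared error. The swarm size, coefficients, curve form, and data are all assumptions.

```cpp
// Minimal PSO sketch fitting parameters (a, b) of a hypothetical stress-strain
// curve sigma = a*eps*exp(-b*eps) to sample data. Illustration only.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Particle { double x[2], v[2], best[2], bestErr; };

double model(double a, double b, double eps) { return a * eps * std::exp(-b * eps); }

double error(double a, double b, const std::vector<std::pair<double,double>>& data) {
    double e = 0.0;
    for (auto [eps, sigma] : data) { double d = model(a, b, eps) - sigma; e += d * d; }
    return e;
}

int main() {
    // Toy (strain, stress) observations; assumed, not experimental PFRC data.
    std::vector<std::pair<double,double>> data = {
        {0.001, 14.0}, {0.002, 24.0}, {0.003, 30.0}, {0.005, 35.0}, {0.008, 32.0}};

    std::mt19937 rng(1);
    std::uniform_real_distribution<double> u01(0.0, 1.0);
    const double w = 0.72, c1 = 1.49, c2 = 1.49;          // common PSO coefficients

    std::vector<Particle> swarm(30);
    double gbest[2] = {0, 0}, gbestErr = 1e30;
    for (auto& p : swarm) {
        p.x[0] = u01(rng) * 30000.0;  p.x[1] = u01(rng) * 500.0;   // a, b ranges (assumed)
        p.v[0] = p.v[1] = 0.0;
        p.best[0] = p.x[0]; p.best[1] = p.x[1];
        p.bestErr = error(p.x[0], p.x[1], data);
        if (p.bestErr < gbestErr) { gbestErr = p.bestErr; gbest[0] = p.x[0]; gbest[1] = p.x[1]; }
    }

    for (int iter = 0; iter < 300; ++iter)
        for (auto& p : swarm) {
            for (int d = 0; d < 2; ++d) {
                p.v[d] = w * p.v[d] + c1 * u01(rng) * (p.best[d] - p.x[d])
                                    + c2 * u01(rng) * (gbest[d] - p.x[d]);
                p.x[d] += p.v[d];
            }
            double e = error(p.x[0], p.x[1], data);
            if (e < p.bestErr) { p.bestErr = e; p.best[0] = p.x[0]; p.best[1] = p.x[1]; }
            if (e < gbestErr)  { gbestErr = e;  gbest[0] = p.x[0];  gbest[1] = p.x[1]; }
        }

    std::printf("best a=%.1f b=%.1f (SSE=%.3f)\n", gbest[0], gbest[1], gbestErr);
}
```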
Cryptography algorithms play a critical role in information technology against the various attacks witnessed in the digital era. Many studies and algorithms have been developed to address security issues in information systems. Traditional cryptography algorithms are characterized by the high complexity of their computational operations. Lightweight algorithms, on the other hand, are the way to solve most of the security issues encountered when applying traditional cryptography in constrained devices, and a symmetric cipher is widely applied to ensure the security of data communication in constrained devices. In this study, we propose a hybrid algorithm based on two cryptography algorithms, PRESENT and Salsa20. Also, a 2D logistic map of a chaotic system is …
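The excerpt ends at its mention of a 2D logistic map, so the sketch below shows only how chaotic map iterations can be turned into pseudo-random key bytes. For simplicity it uses the ordinary 1D logistic map; a 2D variant follows the same pattern with a coupled pair of state variables. The map parameter, seed, transient length, and byte extraction are assumptions for illustration and are not taken from the proposed PRESENT/Salsa20 hybrid.

```cpp
// Sketch: turning chaotic map iterations into pseudo-random key bytes.
// Uses the 1D logistic map for simplicity; parameters and seed are assumed.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

std::vector<std::uint8_t> chaoticBytes(double x0, double mu, std::size_t n) {
    std::vector<std::uint8_t> out;
    out.reserve(n);
    double x = x0;                         // seed in (0, 1), acts as part of the key
    for (std::size_t i = 0; i < 200; ++i)  // discard transient iterations
        x = mu * x * (1.0 - x);
    while (out.size() < n) {
        x = mu * x * (1.0 - x);
        out.push_back(static_cast<std::uint8_t>(std::fmod(x * 1e6, 256.0)));
    }
    return out;
}

int main() {
    auto ks = chaoticBytes(0.3671, 3.9999, 16);   // 16 key bytes from an assumed seed
    for (auto b : ks) std::printf("%02X ", b);
    std::printf("\n");
}
```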