String matching is one of the essential problems in computer science, and a wide variety of applications provide string matching services to their end users. The remarkable growth in the volume of data created and stored by modern computing devices drives researchers to seek ever more powerful methods for this problem. In this research, the Quick Search string matching algorithm is implemented in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are used to examine the effect of parallelizing the Quick Search algorithm on a multi-core platform. Experimental results reveal that the overall performance of the algorithm improves, and the reduction in execution time is considerable enough to recommend the multi-core environment as a suitable platform for parallelizing the Quick Search string matching algorithm.
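The abstract does not include the implementation, so the following is a minimal, hypothetical C/OpenMP sketch of the general idea: the text is split into chunks that overlap by m − 1 characters so no cross-boundary match is lost, each chunk is scanned with Quick Search (Sunday's bad-character shift), and per-chunk match counts are combined with an OpenMP reduction. The sample text, pattern, and chunk count are illustrative only.

```c
/* Hedged sketch: chunked Quick Search with OpenMP. Compile: gcc -fopenmp qs.c */
#include <stdio.h>
#include <string.h>
#include <omp.h>

#define SIGMA 256

/* count occurrences of p (length m) in t (length n) with Quick Search */
static long quick_search(const char *t, long n, const char *p, long m)
{
    long shift[SIGMA], count = 0, j = 0;
    for (long c = 0; c < SIGMA; c++) shift[c] = m + 1;       /* default shift */
    for (long i = 0; i < m; i++) shift[(unsigned char)p[i]] = m - i;

    while (j <= n - m) {
        if (memcmp(p, t + j, m) == 0) count++;
        if (j + m >= n) break;            /* next-character lookup would overrun */
        j += shift[(unsigned char)t[j + m]];
    }
    return count;
}

int main(void)
{
    const char *text = "GATTACAGATTACAGATTACA";   /* stand-in for DNA data */
    const char *pat  = "GATTACA";
    long n = (long)strlen(text), m = (long)strlen(pat), total = 0;
    int nchunks = 4;                               /* illustrative chunk count */

    #pragma omp parallel for reduction(+:total)
    for (int c = 0; c < nchunks; c++) {
        long start = (long)c * n / nchunks;
        long end   = (long)(c + 1) * n / nchunks;  /* exclusive nominal end */
        if (c < nchunks - 1) end += m - 1;         /* overlap so boundary matches survive */
        if (end > n) end = n;
        total += quick_search(text + start, end - start, pat, m);
    }
    printf("matches: %ld\n", total);
    return 0;
}
```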
An effective decision-making process is the basis for successfully solving any engineering problem. Many decisions taken in construction projects differ in their nature owing to the complex nature of such projects, and one of the most crucial decisions, one that can cause numerous issues over the course of a project, is the selection of the contractor. This study aims to use the ordinal priority approach (OPA) for the contractor selection process in the construction industry. The proposed model involves two computer programs: the first is used to evaluate the decision-makers/experts in the construction projects, while the second is used to formulate …
The high and low water levels in the Tigris River threaten the banks of the river. The study area is located on the main stream of the Tigris River at Nu’maniyah City, and the length of the considered reach is 5.4 km, specifically the region from 400 m upstream of Nu’maniyah Bridge to 1250 m downstream of it, where the problem has grown in risk, heading towards the street and endangering nearby areas.
The aim of this research is to identify the reason for the slope collapse and to find proper treatments for the erosion problem in the river banks at the least cost. The modeling approach consisted of several steps, the first of which uses the “mini” JET (Jet Erosion Test) device …
Often, especially in practical applications, it is difficult to obtain data that is not tainted by a problem such as non-constant error variance (heteroscedasticity) or any other problem that impedes the use of the usual methods, represented by ordinary least squares (OLS), for estimating the parameters of multiple linear models. This is why many statisticians resort to robust estimation methods, especially in the presence of outliers as well as error-variance instability. Two robust methods were adopted, the robust weighted least squares (RWLS) and the two-step robust weighted least squares (TSRWLS), and their performance was verified …
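As an illustration of the robust-weighting idea behind such estimators (not the paper's exact RWLS/TSRWLS procedures), here is a minimal C sketch of iteratively reweighted least squares with Huber weights for a simple linear model; the data, tuning constant, and the mean-absolute-residual scale estimate are illustrative assumptions.

```c
/* Hedged sketch: IRLS with Huber weights downweights the outlier that
 * would otherwise drag an OLS fit. */
#include <stdio.h>
#include <math.h>

#define N 8
#define HUBER_K 1.345   /* conventional Huber tuning constant */

int main(void)
{
    double x[N] = {1, 2, 3, 4, 5, 6, 7, 8};
    double y[N] = {2.1, 4.0, 6.2, 7.9, 10.1, 12.0, 13.8, 40.0}; /* last point is an outlier */
    double w[N], a = 0.0, b = 0.0;

    for (int i = 0; i < N; i++) w[i] = 1.0;    /* first pass is plain OLS */

    for (int it = 0; it < 20; it++) {
        /* closed-form weighted least squares for y = a + b*x */
        double sw = 0, sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < N; i++) {
            sw += w[i]; sx += w[i]*x[i]; sy += w[i]*y[i];
            sxx += w[i]*x[i]*x[i]; sxy += w[i]*x[i]*y[i];
        }
        b = (sw*sxy - sx*sy) / (sw*sxx - sx*sx);
        a = (sy - b*sx) / sw;

        /* robust scale: mean absolute residual (a simple stand-in for MAD) */
        double s = 0;
        for (int i = 0; i < N; i++) s += fabs(y[i] - a - b*x[i]);
        s /= N;

        /* Huber weights: residuals beyond k*s are downweighted */
        for (int i = 0; i < N; i++) {
            double r = fabs(y[i] - a - b*x[i]);
            w[i] = (r <= HUBER_K * s) ? 1.0 : HUBER_K * s / r;
        }
    }
    printf("robust fit: a = %.3f, b = %.3f\n", a, b);
    return 0;
}
```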
Reverse osmosis (RO) has already proved its worth as an efficient treatment method in chemical and environmental engineering applications, and various successful RO attempts at rejecting organic and highly toxic pollutants from wastewater can be found in the literature over the last decade. Dimethylphenol is classified as a highly toxic organic compound found ubiquitously in wastewater; it poses a real threat to humans and the environment even at low concentrations. In this paper, a model-based framework was developed for the simulation and optimisation of the RO process for the removal of dimethylphenol from wastewater. We incorporated our earlier developed and validated process model into the Species Conserving Genetic Algorithm (SCGA) …
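The process model and the species-conserving mechanism are specific to the paper, but the optimisation loop can be illustrated with a plain genetic algorithm. The sketch below is a generic, hypothetical GA maximising a toy one-variable objective that stands in for the RO rejection model; the population size, operators, and objective are assumptions, and SCGA's species-conservation step is omitted.

```c
/* Hedged sketch: a plain real-coded genetic algorithm on a toy objective. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define POP  20
#define GENS 100
#define PMUT 0.2

/* toy stand-in objective: a single peak around an operating point of 7 */
static double fitness(double x) { return 100.0 - (x - 7.0) * (x - 7.0); }

static double frand(double lo, double hi)
{
    return lo + (hi - lo) * rand() / (double)RAND_MAX;
}

int main(void)
{
    srand((unsigned)time(NULL));
    double pop[POP];
    for (int i = 0; i < POP; i++) pop[i] = frand(0.0, 20.0);

    for (int g = 0; g < GENS; g++) {
        double next[POP];
        for (int i = 0; i < POP; i++) {
            /* binary tournament selection of two parents */
            int a = rand() % POP, b = rand() % POP;
            double p1 = fitness(pop[a]) > fitness(pop[b]) ? pop[a] : pop[b];
            a = rand() % POP; b = rand() % POP;
            double p2 = fitness(pop[a]) > fitness(pop[b]) ? pop[a] : pop[b];
            double child = 0.5 * (p1 + p2);                  /* arithmetic crossover */
            if (frand(0, 1) < PMUT) child += frand(-1.0, 1.0); /* mutation */
            next[i] = child;
        }
        for (int i = 0; i < POP; i++) pop[i] = next[i];
    }

    double best = pop[0];
    for (int i = 1; i < POP; i++)
        if (fitness(pop[i]) > fitness(best)) best = pop[i];
    printf("best operating point: %.3f (fitness %.3f)\n", best, fitness(best));
    return 0;
}
```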
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed for analyzing this type of data.
In this research, the focus was on grouping and analyzing such data, as cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and on employing them in the nonparametric smoothing cubic B-spline model, which is characterized by providing continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope. It is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroups …
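To make the smoothing basis concrete, here is a minimal C sketch that evaluates a clamped cubic B-spline curve with the Cox-de Boor recursion; cubic pieces joined this way have continuous first and second derivatives, which is the smoothness property the abstract relies on. The knot vector and coefficients are illustrative, not from the study.

```c
/* Hedged sketch: evaluating a clamped cubic B-spline curve. */
#include <stdio.h>

#define DEG 3
#define NCOEF 6
#define NKNOT (NCOEF + DEG + 1)

/* Cox-de Boor recursion for basis function B_{i,k}(u) */
static double bspl(int i, int k, double u, const double *t)
{
    if (k == 0)
        return (t[i] <= u && u < t[i + 1]) ? 1.0 : 0.0;
    double a = 0.0, b = 0.0;
    if (t[i + k] > t[i])                               /* guard repeated knots */
        a = (u - t[i]) / (t[i + k] - t[i]) * bspl(i, k - 1, u, t);
    if (t[i + k + 1] > t[i + 1])
        b = (t[i + k + 1] - u) / (t[i + k + 1] - t[i + 1]) * bspl(i + 1, k - 1, u, t);
    return a + b;
}

int main(void)
{
    /* clamped cubic knot vector on [0,1] and illustrative coefficients */
    double t[NKNOT] = {0, 0, 0, 0, 1.0 / 3, 2.0 / 3, 1, 1, 1, 1};
    double c[NCOEF] = {0.0, 1.2, 0.8, 1.5, 1.1, 2.0};

    for (double u = 0.0; u < 1.0; u += 0.1) {
        double s = 0.0;
        for (int i = 0; i < NCOEF; i++) s += c[i] * bspl(i, DEG, u, t);
        printf("S(%.1f) = %.4f\n", u, s);
    }
    return 0;
}
```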
Nowadays, it is quite usual to transmit data through the internet, making safe online communication essential; transmitting data over internet channels requires maintaining its confidentiality and ensuring its integrity against unauthorized individuals. The two most common techniques for supplying security are cryptography and steganography. Cryptography converts data from a readable format into an unreadable one, while steganography hides sensitive information in digital media, including images, audio, and video. In our proposed system, both encryption and hiding techniques are utilized. This study presents encryption using the S-DES algorithm, which generates a new key in each cycle …
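The paper's per-cycle key generation is not detailed in the abstract, but the standard S-DES key schedule it builds on is well documented. The following C sketch derives the two round keys K1 and K2 from a 10-bit master key via the P10 permutation, circular half-shifts, and the P8 permutation; the example key is the common textbook one.

```c
/* Hedged sketch: the standard (unmodified) S-DES key schedule. */
#include <stdio.h>

static const int P10[10] = {3, 5, 2, 7, 4, 10, 1, 9, 8, 6};
static const int P8[8]   = {6, 3, 7, 4, 8, 5, 10, 9};

static void permute(const int *in, int *out, const int *p, int n)
{
    for (int i = 0; i < n; i++) out[i] = in[p[i] - 1];   /* tables are 1-based */
}

/* circular left shift of each 5-bit half by `by` positions */
static void shift_halves(int *k, int by)
{
    int tmp[10];
    for (int i = 0; i < 5; i++) {
        tmp[i]     = k[(i + by) % 5];
        tmp[i + 5] = k[5 + (i + by) % 5];
    }
    for (int i = 0; i < 10; i++) k[i] = tmp[i];
}

int main(void)
{
    int key[10] = {1, 0, 1, 0, 0, 0, 0, 0, 1, 0};   /* textbook 10-bit key */
    int t[10], k1[8], k2[8];

    permute(key, t, P10, 10);
    shift_halves(t, 1);
    permute(t, k1, P8, 8);      /* K1 = 10100100 for this key */
    shift_halves(t, 2);
    permute(t, k2, P8, 8);      /* K2 = 01000011 for this key */

    printf("K1 = "); for (int i = 0; i < 8; i++) printf("%d", k1[i]);
    printf("\nK2 = "); for (int i = 0; i < 8; i++) printf("%d", k2[i]);
    printf("\n");
    return 0;
}
```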
In the current digitalized world, cloud computing has become a feasible solution for the virtualization of computing resources. Although cloud computing has many advantages for outsourcing an organization's information, strong security is its main concern. Protection against identity-authentication theft is a vital part of protecting cloud computing data: intruders violate security protocols and carry out attacks on organizations' or users' data, and such disclosure of cloud data leaves users feeling insecure while using the cloud platform. Traditional cryptographic techniques are not able to stop such kinds of attacks. BB84 is the first quantum cryptography protocol …
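As background to the protocol named above, here is a toy C simulation of BB84's sifting step under idealised assumptions (no eavesdropper, no channel noise): Alice encodes random bits in random bases, Bob measures in random bases, and only the positions where the bases agree contribute to the sifted key.

```c
/* Hedged sketch: BB84 sifting on an ideal, noiseless channel. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define NQUBITS 16

int main(void)
{
    srand((unsigned)time(NULL));
    int a_bit[NQUBITS], a_basis[NQUBITS], b_basis[NQUBITS], b_bit[NQUBITS];

    for (int i = 0; i < NQUBITS; i++) {
        a_bit[i]   = rand() % 2;
        a_basis[i] = rand() % 2;   /* 0 = rectilinear, 1 = diagonal */
        b_basis[i] = rand() % 2;
        /* matching basis -> Bob reads Alice's bit; otherwise the outcome is random */
        b_bit[i] = (a_basis[i] == b_basis[i]) ? a_bit[i] : rand() % 2;
    }

    printf("sifted key: ");
    for (int i = 0; i < NQUBITS; i++)
        if (a_basis[i] == b_basis[i]) printf("%d", b_bit[i]);
    printf("\n");
    return 0;
}
```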
Image enhancement techniques represent one of the most significant topics in the field of digital image processing. The basic problem in enhancement is how to remove noise from, or improve the details of, a digital image. In the current research, a method for digital image de-noising and detail sharpening/highlighting is proposed. The proposed approach uses a fuzzy logic technique to process each pixel in the entire image and then decides whether it is noisy or needs more processing for highlighting. This is performed by examining the pixel's degree of association with its neighboring elements based on a fuzzy algorithm. The proposed de-noising approach was evaluated on standard images after corrupting them with impulse noise …
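The paper's fuzzy rules are not given in the abstract, so the following C sketch only illustrates the general idea: each pixel's deviation from its 3×3 neighbourhood median is graded by a triangular membership function for the fuzzy set "noisy", and strongly noisy pixels are replaced by that median. The thresholds and the 5×5 test image are assumptions.

```c
/* Hedged sketch: fuzzy-graded impulse-noise detection and median replacement. */
#include <stdio.h>
#include <stdlib.h>

#define W 5
#define H 5
#define LOW  10.0   /* deviation below this -> definitely clean (assumed) */
#define HIGH 60.0   /* deviation above this -> definitely impulse noise (assumed) */

static int cmp_int(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

int main(void)
{
    int img[H][W] = {
        {52, 55,  61, 59, 54},
        {62, 59, 255, 65, 61},   /* 255 is an injected impulse */
        {63, 65,  66, 68, 62},
        {58,  0,  61, 60, 57},   /* 0 is an injected impulse */
        {57, 61,  64, 66, 63}
    };
    int out[H][W];

    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            int nb[9], n = 0;
            for (int dy = -1; dy <= 1; dy++)          /* gather 3x3 neighbourhood */
                for (int dx = -1; dx <= 1; dx++) {
                    int yy = y + dy, xx = x + dx;
                    if (yy >= 0 && yy < H && xx >= 0 && xx < W)
                        nb[n++] = img[yy][xx];
                }
            qsort(nb, n, sizeof(int), cmp_int);
            int median = nb[n / 2];
            double d = abs(img[y][x] - median);
            /* triangular membership in the fuzzy set "noisy" */
            double mu = (d - LOW) / (HIGH - LOW);
            if (mu < 0) mu = 0;
            if (mu > 1) mu = 1;
            out[y][x] = (mu > 0.5) ? median : img[y][x];
        }

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) printf("%4d", out[y][x]);
        printf("\n");
    }
    return 0;
}
```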
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices because they require only low-cost computational functions and small memory; at the same time, most lightweight algorithms suffer from a trade-off between complexity and speed in producing a robust cipher. The PRESENT cipher has been successfully experimented on as a lightweight cryptographic algorithm, and it transcends other ciphers in terms of computational processing, requiring only low-complexity operations. The mathematical model of …
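For reference, the round structure that makes PRESENT lightweight can be sketched directly from its published specification: a layer of 4-bit S-boxes followed by a fixed 64-bit bit permutation. The C sketch below applies one such round to an example state; the round key is a placeholder, and the 31-round key schedule is omitted.

```c
/* Hedged sketch: one PRESENT round's confusion/diffusion layers. */
#include <stdio.h>
#include <stdint.h>

static const uint8_t SBOX[16] =
    {0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD, 0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2};

static uint64_t sbox_layer(uint64_t s)
{
    uint64_t r = 0;
    for (int i = 0; i < 16; i++)                 /* S-box on each of 16 nibbles */
        r |= (uint64_t)SBOX[(s >> (4 * i)) & 0xF] << (4 * i);
    return r;
}

static uint64_t p_layer(uint64_t s)
{
    uint64_t r = 0;
    for (int i = 0; i < 64; i++) {
        int to = (i == 63) ? 63 : (16 * i) % 63; /* PRESENT bit permutation */
        r |= ((s >> i) & 1ULL) << to;
    }
    return r;
}

int main(void)
{
    uint64_t state = 0x0123456789ABCDEFULL;      /* example round input */
    uint64_t rk    = 0x0000000000000000ULL;      /* placeholder round key */

    state ^= rk;                                 /* addRoundKey */
    state = sbox_layer(state);
    state = p_layer(state);
    printf("after one round: %016llX\n", (unsigned long long)state);
    return 0;
}
```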