Evolutionary algorithms (EAs), as global search methods, have proved more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters: solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust EA with greater biological consistency. For this purpose, a new crossover operator is suggested whose design is fed with biological information in terms of both gene semantic similarity and protein functional similarity. To reflect the heuristic roles of both semantic and functional similarities, this paper introduces two gene ontology (GO)-aware crossover operators: a direct annotation-aware and an inherited annotation-aware crossover operator. The first strategy works with the direct gene ontology annotations of the proteins, while the second works with the directed acyclic graph (DAG) of each gene ontology term in the gene product. In our experiments, the proposed EAs with GO-aware crossover operators are compared against the state-of-the-art heuristic, canonical EAs with the traditional crossover operator, and GO-based EAs. Simulation results are evaluated in terms of recall, precision, and F-measure at both the complex level and the protein level. The results show that the new EA design encourages a more reliable treatment of exploration and exploitation and thus improves the ability to detect more accurate protein complex structures.
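To make the idea concrete, below is a minimal, hypothetical Python sketch of a similarity-guided crossover in this spirit; it is not the paper's exact operator. The encoding of a solution as a protein-to-cluster map and the helper go_similarity (returning a GO semantic-similarity score in [0, 1]) are assumptions for illustration.

```python
import random

def cluster_affinity(protein, assignment, go_similarity):
    """Average GO similarity between a protein and its cluster mates."""
    mates = [p for p, c in assignment.items()
             if c == assignment[protein] and p != protein]
    if not mates:
        return 0.0
    return sum(go_similarity(protein, m) for m in mates) / len(mates)

def go_aware_crossover(parent_a, parent_b, go_similarity):
    """Child inherits each protein's cluster from the more coherent parent."""
    child = {}
    for protein in parent_a:
        sa = cluster_affinity(protein, parent_a, go_similarity)
        sb = cluster_affinity(protein, parent_b, go_similarity)
        # Probabilistic choice keeps some exploration instead of pure greed.
        p = sa / (sa + sb) if sa + sb > 0 else 0.5
        child[protein] = (parent_a if random.random() < p else parent_b)[protein]
    return child
```

The design intuition is that biasing inheritance toward the biologically more coherent parent exploits GO information while the random draw preserves exploration.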
Reverse osmosis (RO) has already proved its worth as an efficient treatment method in chemical and environmental engineering applications. Various successful RO attempts at rejecting organic and highly toxic pollutants from wastewater can be found in the literature over the last decade. Dimethylphenol is classified as a highly toxic organic compound found ubiquitously in wastewater; it poses a real threat to humans and the environment even at low concentrations. In this paper, a model-based framework was developed for the simulation and optimisation of the RO process for the removal of dimethylphenol from wastewater. We incorporated our earlier developed and validated process model into the Species Conserving Genetic Algorithm (SCGA).
Cadmium is known to be harmful to human health, mainly via contaminated drinking water, food supplies, tobacco, and industrial pollutants. The aim of this study was to determine the toxicity of a new cadmium(II) complex, bis[5-(p-nitrophenyl)-4-phenyl-1,2,4-triazole-3-dithiocarbamatohydrazide]cadmium(II) hydrate (0.5), and to compare it with the anticancer drug cyclophosphamide (CP) in female albino mice. This complex caused several alterations in the enzymatic activity of glutamate pyruvate transaminase (GPT) and alkaline phosphatase (ALP) in three organs after treatment of the mice with different doses of the new cadmium(II) complex (0.09/0.25 ml, 0.18/0.5 ml, and 0.25 mg/0.7 ml per 30 g of mouse
This paper presents an analytical study of the flow of an incompressible generalized Burgers' fluid (GBF) in an annular pipe. We discuss the flow induced by an impulsive pressure gradient and compare the results with the flow due to a constant pressure gradient. Analytic solutions for the velocity are obtained by using the discrete Laplace transform (DLT) of the sequential fractional derivatives (FD) and the finite Hankel transform (FHT). The influences of different parameters on the velocity distribution are analyzed, and a comparison between the two cases is presented and discussed in detail. Finally, figures are plotted to exhibit these effects.
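For reference, one commonly used fractional constitutive relation for a generalized Burgers' fluid is shown below; the precise form and notation in the paper may differ.

```latex
% Fractional generalized Burgers' fluid: shear stress \tau vs. shear rate \dot{\gamma}
\left(1 + \lambda_1 \, D_t^{\alpha} + \lambda_2 \, D_t^{2\alpha}\right)\tau
  = \mu \left(1 + \lambda_3 \, D_t^{\beta} + \lambda_4 \, D_t^{2\beta}\right)\dot{\gamma}
```

Here D_t^α denotes a fractional time-derivative operator of order α, μ is the dynamic viscosity, and λ1..λ4 are material relaxation and retardation constants; replacing the fractional orders α, β with 1 recovers the classical Burgers' model.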
Image enhancement techniques have recently become one of the most significant topics in the field of digital image processing. The basic problem in enhancement is how to remove noise or improve the details of a digital image. In the current research, a method for digital image de-noising and detail sharpening/highlighting is proposed. The proposed approach uses a fuzzy logic technique to process each pixel in the image and then decide whether it is noisy or needs further highlighting. This decision is made by examining the pixel's degree of association with its neighbouring elements using a fuzzy algorithm. The proposed de-noising approach was evaluated on standard images after corrupting them with impulse noise.
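As a rough illustration of this fuzzy decision step, the sketch below scores how noise-like a pixel is from its association with a 3x3 window; the triangular membership shape and the thresholds are illustrative assumptions, not the paper's values.

```python
import numpy as np

def noisiness(image: np.ndarray, y: int, x: int,
              lo: float = 10.0, hi: float = 60.0) -> float:
    """Fuzzy degree in [0, 1] that pixel (y, x) is impulse-noise-like."""
    # Mean of the 3x3 window around the pixel (centre included).
    window = image[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].astype(float)
    diff = abs(float(image[y, x]) - window.mean())
    if diff <= lo:
        return 0.0                      # strongly associated with neighbours: clean
    if diff >= hi:
        return 1.0                      # weakly associated: treat as noise
    return (diff - lo) / (hi - lo)      # fuzzy transition region
```

A de-noising pass would then, for example, replace pixels whose score exceeds a cutoff with a median of their neighbours while routing low-score pixels to the sharpening branch.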
Honeywords are fake passwords that accompany the real password, which is called a "sugarword." The honeyword system is an effective password-cracking detection system designed to easily detect password cracking and thereby improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system, successfully cracks the passwords, and attempts to log in to a user's account, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and raises an alarm when a honeyword is submitted.
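A minimal sketch of the honeychecker idea follows; it reflects the standard honeyword design rather than this paper's exact implementation. In a real deployment the sugar index lives on a separate hardened server and a slow password hash (e.g., bcrypt) replaces SHA-256.

```python
import hashlib
import random

def make_entry(sugarword: str, honeywords: list[str]):
    """Store k hashed sweetwords in random order; only the honeychecker
    keeps the index of the real password (the sugarword)."""
    sweetwords = honeywords + [sugarword]
    random.shuffle(sweetwords)
    hashes = [hashlib.sha256(w.encode()).hexdigest() for w in sweetwords]
    sugar_index = sweetwords.index(sugarword)   # held by the honeychecker only
    return hashes, sugar_index

def check_login(attempt: str, hashes: list[str], sugar_index: int) -> bool:
    h = hashlib.sha256(attempt.encode()).hexdigest()
    if h not in hashes:
        return False                            # ordinary wrong password
    if hashes.index(h) == sugar_index:
        return True                             # real password: allow login
    # A honeyword matched: the password file has likely been cracked.
    raise RuntimeError("Honeyword submitted: possible password-file breach")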
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices, requiring only low-cost computation and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when trying to produce a robust cipher. The PRESENT cipher has been successfully established as a lightweight cryptography algorithm that surpasses other ciphers in terms of computational processing, requiring only low-complexity operations. The mathematical model of
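For context, the published PRESENT round structure (64-bit state, 31 rounds of key addition, a 4-bit S-box layer, and a bit permutation) can be sketched as below. The key schedule is omitted and round keys are assumed to be supplied externally; this illustrates the cipher's low-complexity operations, not the modification this work develops.

```python
# PRESENT S-box (from the published specification).
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def s_box_layer(state: int) -> int:
    """Apply the 4-bit S-box to each of the 16 nibbles of the 64-bit state."""
    out = 0
    for i in range(16):
        out |= SBOX[(state >> (4 * i)) & 0xF] << (4 * i)
    return out

def p_layer(state: int) -> int:
    """Bit permutation: bit i moves to 16*i mod 63 (bit 63 is fixed)."""
    out = 0
    for i in range(64):
        j = 63 if i == 63 else (16 * i) % 63
        out |= ((state >> i) & 1) << j
    return out

def encrypt(block: int, round_keys: list[int]) -> int:
    """31 rounds of addRoundKey + sBoxLayer + pLayer, then a final key add."""
    state = block
    for k in round_keys[:31]:
        state = p_layer(s_box_layer(state ^ k))
    return state ^ round_keys[31]
```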
Nowadays, it is quite usual to transmit data through the internet, making safe online communication essential. Transmitting data over internet channels requires maintaining confidentiality and protecting the integrity of the transmitted data from unauthorized individuals. The two most common techniques for supplying security are cryptography and steganography. Cryptography converts data from a readable format into an unreadable one, while steganography hides sensitive information in digital media such as images, audio, and video. In our proposed system, both encryption and hiding techniques are utilized. This study presents encryption using the S-DES algorithm, which generates a new key in each cycle.
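As an illustration of the hiding stage, a common least-significant-bit (LSB) embedding is sketched below. The paper's exact embedding scheme is not specified here, so treat this as a generic example in which the S-DES ciphertext bytes replace the least-significant bits of the cover image's pixels.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, payload: bytes) -> np.ndarray:
    """Write payload bits into the LSBs of a uint8 cover image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()              # flatten() returns a copy
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Read n_bytes back out of the stego image's LSBs."""
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits.astype(np.uint8)).tobytes()
```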
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more treatments, where the endpoint is the death of the patient or the disappearance of the individual. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data in which the variable of interest is the time until an event occurs. It could be d
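For concreteness, the standard textbook definitions (not specific to this study) of the survival function and its Kaplan-Meier estimate are:

```latex
% Survival function and Kaplan--Meier estimator
S(t) = \Pr(T > t), \qquad
\hat{S}(t) = \prod_{t_i \le t}\left(1 - \frac{d_i}{n_i}\right)
```

where T is the survival time, the t_i are the distinct observed event times, d_i is the number of events at t_i, and n_i is the number of individuals still at risk just before t_i.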
... Show MoreA new blind restoration algorithm is presented and shows high quality restoration. This
is done by enforcing Wiener filtering approach in the Fourier domains of the image and the
psf environments
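A minimal numpy sketch of Wiener deconvolution in the Fourier domain is given below, assuming the point spread function (PSF) has already been estimated; the blind estimation of the PSF, which is the heart of such an algorithm, is not shown. The noise-to-signal constant K is illustrative.

```python
import numpy as np

def wiener_deconvolve(blurred: np.ndarray, psf: np.ndarray,
                      K: float = 0.01) -> np.ndarray:
    """Restore an image from a known PSF via the Wiener filter
    F_hat = conj(H) * G / (|H|^2 + K)."""
    H = np.fft.fft2(psf, s=blurred.shape)   # PSF spectrum, zero-padded
    G = np.fft.fft2(blurred)                # degraded-image spectrum
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + K)
    return np.real(np.fft.ifft2(F_hat))
```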