Evolutionary algorithms outperform heuristic algorithms at detecting protein complexes in protein-protein interaction networks (PPINs). However, many of these algorithms rely on canonical frameworks that are based purely on network topology, and many have been evaluated only on networks containing reliable interaction data. The main objective of this paper is to extend the design of the canonical, topology-based evolutionary algorithms suggested in the literature so that they can cope with noisy PPINs. The evolutionary algorithm's design is extended to operate on the functional domain of the proteins rather than on the topological domain of the PPIN. The functional domain is derived from the gene ontology (GO) annotations in each of the molecular function, biological process, and cellular component categories. The reliability of the proposed algorithm is assessed against the algorithms proposed in the literature, using a yeast protein-protein interaction dataset to evaluate the final quality of the algorithms. To simulate falsely reported PPIs of the kind associated with high-throughput interaction data, several noisy PPINs are created, each synthesized with a different, increasing percentage of misinformed PPIs. The results confirm that the extended evolutionary algorithm design effectively exploits the biological knowledge in the gene ontology: feeding the EA design with GO annotation data improves reliability and produces more accurate detection results than the counterpart algorithms.
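The functional-domain evaluation described above can be illustrated with a toy cohesion score: the mean pairwise Jaccard similarity of GO-term sets over a candidate complex. The protein names and GO annotations below are invented for illustration, and this score is only one simple instance of GO-based functional similarity, not the paper's actual fitness function:

```python
# Hypothetical GO annotations: protein ID -> set of GO term IDs.
go = {
    "P1": {"GO:0003677", "GO:0006355"},
    "P2": {"GO:0003677", "GO:0006355", "GO:0005634"},
    "P3": {"GO:0016301"},
}

def jaccard(a, b):
    """Jaccard similarity of two sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def functional_cohesion(complex_proteins):
    """Mean pairwise GO-term Jaccard similarity over a candidate complex."""
    pairs = [(p, q) for i, p in enumerate(complex_proteins)
             for q in complex_proteins[i + 1:]]
    if not pairs:
        return 0.0
    return sum(jaccard(go[p], go[q]) for p, q in pairs) / len(pairs)

print(functional_cohesion(["P1", "P2"]))  # P1 and P2 share 2 of 3 terms -> 2/3
print(functional_cohesion(["P1", "P3"]))  # no shared terms -> 0.0
```

An EA could use such a score as (part of) the fitness of a candidate complex, rewarding groups of proteins that share annotations instead of relying only on edge density.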
Given the prevalence of multidrug-resistant bacteria, especially Pseudomonas aeruginosa, in which a key drug-resistance mechanism is the possession of efflux pumps that extrude antimicrobial agents, this study aims to detect the presence of the mexB multidrug efflux gene in local isolates of this bacterium that show resistance to three of the five antibiotics tested. Antibiotic sensitivity testing was performed on all isolates using meropenem (10 μg/disc), imipenem (10 μg/disc), amikacin (30 μg/disc), ciprofloxacin (5 μg/disc), and ceftazidime (30 μg/disc). Conventional PCR showed the presence of the mexB gene (244 bp) in four isolates out of ten (40%). In addition, 25 and 50 μg/ml of cur
Introduction: The causes of abortion are in many circumstances still unknown; nevertheless, bacterial infections represent a major cause of abortion, and bacteria appear to be the most frequently implicated pathogens (Khameneh et al., 2014; Oliver and Overton, 2014). Among the numerous bacteria, Humano
Listeria spp. is one of the causative agents of abortion in animals, especially in ruminants. This work aimed to detect Listeria spp. in the milk and aborted fetuses of cows in Iraq. A total of 50 organ samples (brain, liver, and spleen) from aborted cow fetuses and 50 milk samples from the same aborted cows were collected from farms in Baghdad, Iraq, from October 2023 to March 2024. The bacteria were identified by conventional culture methods, biochemical tests, and the VITEK 2 compact system, followed by molecular confirmation. The antimicrobial resistance pattern was assayed using the disc diffusion method against eight antibiotic agents, and the L. monocytogenes virulence genes prfA, actA, and hylA were detected using t
In this study, the possible protective effects of daidzein against ifosfamide-induced neurotoxicity in male rats were examined by determining changes in selected oxidant-antioxidant markers in the rats' brain tissue.
Twenty-eight (28) apparently healthy male Wistar rats weighing 120-150 g, allocated into 4 groups (n = 7), were used in this study. Group I (control) was orally administered 1% Tween 20 dissolved in distilled water; Group II was orally administered a daidzein suspension (100 mg/kg) for 7 days; Group III was intraperitoneally injected with a single dose of ifosfamide (500 mg/kg); the fourth group was orally administered daidzein for 7 days (100 mg/
International companies strive to reduce their costs and increase their profits, and these pressures have produced many methods and techniques for achieving these goals; some of these methods are heuristic and others are optimization-based. This research attempts to adapt some of these techniques to Iraqi companies, specifically determining the optimal lot size using the Wagner-Whitin algorithm under the theory of constraints. The research adopted a case-study methodology to objectively identify the research problem, namely determining the optimal lot size for each of the products of the electronic measurement laboratory in Diyala in light of the bottlenecks in w
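The Wagner-Whitin lot-sizing model mentioned above can be sketched as a dynamic program over ordering periods: each period's demand is covered by the most recent order, and the recursion trades setup costs against inventory holding costs. The demand and cost figures below are made up for illustration and are not the paper's case-study data:

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """Minimum total cost of covering all periods' demand.

    best[t] = cheapest way to cover periods 0..t-1; the last order is
    placed in some period j and carries demand for periods j..t-1.
    """
    n = len(demand)
    best = [float("inf")] * (n + 1)
    best[0] = 0.0
    for t in range(1, n + 1):
        for j in range(t):
            # Holding cost of carrying period k's demand from j to k.
            hold = sum(holding_cost * demand[k] * (k - j)
                       for k in range(j + 1, t))
            best[t] = min(best[t], best[j] + setup_cost + hold)
    return best[n]

# One order in period 0 (cost 100 + 20*1 + 30*2 = 180) beats
# ordering every period (3 * 100 = 300).
print(wagner_whitin([10, 20, 30], setup_cost=100, holding_cost=1))  # -> 180.0
```

Under the theory of constraints, the same recursion would be applied to the bottleneck resource's products; the classic model above ignores capacity limits.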
A three-stage learning algorithm for deep multilayer perceptron (DMLP) networks with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, aiming to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder obtains the initial weights of the DMLP's feature extraction layers. At the second stage, error back-propagation trains the DMLP while the weights obtained at the first stage for its feature extraction layers are kept fixed. At the third stage, all of the DMLP's weights obtained at the second stage are refined by error back-propagation. Network structures an
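The three stages above can be sketched in NumPy on toy data. A single hidden layer stands in for the deep feature-extraction layers, the sparsity penalty is a simple L1 term on activations, and all sizes and learning rates are invented, so this is a minimal illustration of the training schedule rather than the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 100 samples, 20 features, labels from a simple linear rule.
X = rng.normal(size=(100, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)
n, h, lr, lam = len(X), 8, 0.3, 1e-3

# Stage 1: sparse auto-encoder pretraining of the feature layer.
W1 = rng.normal(scale=0.1, size=(20, h)); b1 = np.zeros(h)
Wd = rng.normal(scale=0.1, size=(h, 20)); bd = np.zeros(20)
for _ in range(300):
    H = sigmoid(X @ W1 + b1)
    err = (H @ Wd + bd) - X                    # reconstruction error
    dH = err @ Wd.T + lam * np.sign(H)         # + L1 sparsity gradient
    dZ = dH * H * (1 - H)
    Wd -= lr * H.T @ err / n; bd -= lr * err.mean(0)
    W1 -= lr * X.T @ dZ / n;  b1 -= lr * dZ.mean(0)

# Stage 2: train only the output layer; pretrained W1, b1 stay fixed.
W2 = rng.normal(scale=0.1, size=(h, 1)); b2 = np.zeros(1)
for _ in range(500):
    H = sigmoid(X @ W1 + b1)
    g = sigmoid(H @ W2 + b2) - y               # cross-entropy gradient
    W2 -= lr * H.T @ g / n; b2 -= lr * g.mean(0)

# Stage 3: refine ALL weights jointly by back-propagation.
for _ in range(500):
    H = sigmoid(X @ W1 + b1)
    p = sigmoid(H @ W2 + b2)
    g = p - y
    dZ = (g @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ g / n;  b2 -= lr * g.mean(0)
    W1 -= lr * X.T @ dZ / n; b1 -= lr * dZ.mean(0)

acc = float(((p > 0.5) == (y > 0.5)).mean())
print(f"train accuracy after fine-tuning: {acc:.2f}")
```

The point of the schedule is that stages 1 and 2 give the network a sensible starting point before stage 3 moves every weight, which is what helps when labelled data are scarce.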
The bi-level programming problem is to minimize or maximize an objective function subject to constraints that contain another objective function. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing problems of this kind. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation with the Monte Carlo method, using different small and large sample sizes. The research found that the Branch and Bound algorithm was preferable for solving the non-linear bi-level programming problem because it produced better results.
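The Monte Carlo side of this setup can be illustrated on a toy nested problem: the leader samples decisions at random, and for each sample the follower's best response is computed before the leader's objective is scored. The quadratic objectives below are invented for illustration and are far simpler than the non-linear problems the paper studies:

```python
import random

random.seed(1)

def follower_best(x):
    """Inner problem: min_y (y - x)^2, solved by grid search over [0, 2]."""
    ys = [i * 0.01 for i in range(201)]
    return min(ys, key=lambda y: (y - x) ** 2)

def leader_obj(x, y):
    """Outer (leader) objective, evaluated at the follower's response."""
    return (x - 1) ** 2 + y ** 2

best_x, best_val = None, float("inf")
for _ in range(2000):  # Monte Carlo sampling of the leader's decision
    x = random.uniform(0, 2)
    y = follower_best(x)
    v = leader_obj(x, y)
    if v < best_val:
        best_x, best_val = x, v

print(f"best leader value: {best_val:.3f}")
```

Here the follower always plays y = x, so the leader effectively minimizes 2(x - 1/2)^2 + 1/2, and the sampled value converges to the analytic optimum of 0.5; an exact method such as Branch and Bound would reach that optimum without the sampling error.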
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, and by the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" the work of others and presented it as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a big problem in higher education, and it can occur on any topic. Plagiarism detection has been studied in many scientific articles, and recognition methods have been created utilizing plagiarism analysis, authorship identification, and
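One simple automated-detection idea behind such methods is word n-gram overlap between two documents. The Jaccard-style measure below is a generic baseline sketch, not a method from any specific article cited here, and the sample sentences are invented:

```python
def ngrams(text, n=3):
    """Set of word n-grams (shingles) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(doc_a, doc_b, n=3):
    """Jaccard similarity of the two documents' n-gram sets."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    return len(a & b) / len(a | b) if a | b else 0.0

original = "plagiarism detection compares a document against existing sources"
suspect = "plagiarism detection compares a document against many online sources"
print(f"trigram overlap: {overlap_score(original, suspect):.2f}")
```

Real systems combine such lexical overlap with the approaches the abstract names, for example stylometric authorship identification, to catch paraphrased as well as copied text.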