A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin scheme by introducing a new mutation operator inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to every individual a segment-based procedure that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/1/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including several CEC 2005 test problems, and it shows reliable behavior in most of the test cases.
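The baseline DE/current-to-rand/1 mutation that this abstract builds on can be sketched as follows. This is a minimal illustration of the standard operator only, not the BEA-inspired segment-wise modification (which the abstract does not detail); the parameter names K and F are the conventional DE ones, assumed here rather than taken from the paper.

```python
import random

def de_current_to_rand_1(pop, i, K=0.5, F=0.8):
    """One DE/current-to-rand/1 mutant vector for individual i:

        v_i = x_i + K * (x_r1 - x_i) + F * (x_r2 - x_r3)

    where r1, r2, r3 are distinct random indices different from i.
    pop is a list of equal-length real-valued vectors (lists).
    """
    r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
    xi, x1, x2, x3 = pop[i], pop[r1], pop[r2], pop[r3]
    return [a + K * (b - a) + F * (c - d)
            for a, b, c, d in zip(xi, x1, x2, x3)]
```

In the full DE loop this mutant would then pass through binomial (bin) crossover with the current individual and a greedy selection step.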
Background Bloodstream infection (BSI) is a life-threatening condition caused by the presence of microorganisms, most often bacteria, in the blood. Objectives The aim of this study was to evaluate the possible role of procalcitonin (PCT) and C-reactive protein (CRP) as biomarkers of pediatric BSI. Methodology The study was conducted on 150 blood samples collected from patients admitted to the Children Welfare Teaching Hospital, Medical City, Baghdad, during the period from November 2020 to March 2021. Ninety of the blood samples were culture-positive and 60 were culture-negative (control group). The isolates were identified based on morphological and microscopic examination and biochemical tests.
Permeability estimates for the uncored wells and a porosity function adopting a modified flow zone index-permeability crossplot are given in this work. The main difficulty in implementing that approach was that the crossplots, owing to geological heterogeneity, did not show a clear relationship (scattered data). Carbonate reservoir flow units may now be identified and characterized using a new approach, which has been formally confirmed. Because clastic and carbonate rock fluids have comparable distribution and flow behavior, this zoning method is most effective for reservoirs with significant primary and secondary porosity. The equations and correlations here are more generalizable since they connect these variables by combining cor
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage and Selection Operator (LASSO)
This paper aims to analyze Terry Bisson's short story Bears Discover Fire stylistically, following both Gerard Genette's theory of narratology (1980) and Short and Leech's (1981) strategy for analyzing fictional works. It also examines to what extent these models are applicable to the selected story. Stylistic analysis procedures help readers and researchers identify specific linguistic features in order to support literary interpretation and appreciation of literary texts. Style in fiction concentrates not on what is written but on how a text is written. Each writer has his own style and techniques that distinguish him from other writers.
The use of composite materials has increased vastly in recent years. Great interest has therefore developed in the damage detection of composites using non-destructive test methods. Several approaches have been applied to obtain information about the existence and location of faults. This paper uses the vibration response of a composite plate to detect and localize delamination defects based on modal analysis. Experiments were conducted to validate the developed model. A two-dimensional finite element model for multi-layered composites with internal delamination is established, and FEM programs are built for plates under different boundary conditions. Natural frequencies and modal displacements of the intact and damaged
This work aims to analyze a three-dimensional discrete-time biological system: a prey-predator model with a constant harvesting amount, in which the stage structure lies in the predator species. The analysis is carried out by finding all possible equilibria and investigating their stability. To obtain an optimal harvesting strategy, we then suppose that harvesting occurs at a non-constant rate. Finally, numerical simulations are given to confirm the outcome of the mathematical analysis.
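A discrete-time prey-predator map with constant harvesting can be iterated numerically along the lines below. This is a generic illustrative form (logistic prey growth, bilinear interaction, constant harvest h on the predator); the paper's actual stage-structured, three-dimensional model and its parameter values are not given in the abstract and are not reproduced here.

```python
def step(x, y, r=1.2, K=10.0, a=0.3, b=0.2, d=0.1, h=0.05):
    """One iteration of a generic discrete prey-predator map with a
    constant harvesting amount h removed from the predator.
    All parameter names and values here are illustrative assumptions."""
    x_next = x + r * x * (1 - x / K) - a * x * y   # prey: logistic growth minus predation
    y_next = y + b * x * y - d * y - h             # predator: conversion, death, harvest
    return max(x_next, 0.0), max(y_next, 0.0)      # populations stay non-negative

def simulate(x0, y0, n):
    """Iterate the map n times from (x0, y0); returns the trajectory."""
    traj = [(x0, y0)]
    for _ in range(n):
        traj.append(step(*traj[-1]))
    return traj
```

Numerical simulations of this kind are what the abstract uses to confirm the stability analysis: one iterates from several initial states and checks whether trajectories settle at the predicted equilibria.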
Wellbore instability is a significant problem faced during drilling operations; it causes loss of circulation, caving, stuck pipe, and well kicks or blowouts. These problems take extra time to treat and increase the Nonproductive Time (NPT). This paper aims to review the factors that influence wellbore stability and to survey the methods developed to mitigate them. Based on a current survey, the factors that affect wellbore stability are far-field stress, rock mechanical properties, natural fractures, pore pressure, wellbore trajectory, drilling fluid chemicals, mobile formations, naturally over-pressured shale collapse, mud weight, temperature, and time. Also, the most suitable ways to reduce well
There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name among a mobile phone's contacts. Certain methods of organizing data make the search process more efficient; the objective of these methods is to find the element with the least cost (least time). The binary search algorithm is faster than sequential and other commonly used search algorithms. This research develops the binary search algorithm by using a new structure called Triple, in which data are represented as triples; each triple consists of three locations (1-Top, 2-Left, and 3-Right). The binary search algorithm divides the search interval in half, and this process makes the maximum number of comparisons (average-case com
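The classical halving principle that the abstract describes is the standard binary search, sketched below. This is the textbook baseline only; the paper's Triple-based layout (Top/Left/Right) modifies how the data are organized, and its details are not given in the abstract.

```python
def binary_search(data, target):
    """Standard binary search over a sorted list.

    Repeatedly halves the search interval, so at most about
    log2(n) + 1 comparisons are needed; returns the index of
    target, or -1 if it is absent.
    """
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # midpoint of the current interval
        if data[mid] == target:
            return mid
        elif data[mid] < target:
            lo = mid + 1              # discard the left half
        else:
            hi = mid - 1              # discard the right half
    return -1
```

Compared with a sequential scan (up to n comparisons), the halving step is what gives binary search its speed advantage on sorted data.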
In this study, an analysis of re-using the JPEG lossy algorithm on the quality of satellite imagery is presented. The standard JPEG compression algorithm is adopted and applied using the IrfanView program, with JPEG quality factors ranging from 50 to 100. Based on the calculated variation in satellite image quality, the maximum number of re-uses of the JPEG lossy algorithm adopted in this study is 50. The degradation of image quality with respect to the JPEG quality factor and the number of re-uses of the JPEG algorithm to store the satellite image is analyzed.
The objective of this work is to study the influence of end-milling cutting process parameters, tool material, and tool geometry on multi-response outputs for 4032 Al-alloy. This is done by proposing an approach that combines the Taguchi method with grey relational analysis. Three cutting parameters were selected (spindle speed, feed rate, and depth of cut), with three levels for each parameter. Three tools with different materials and geometries were also used, and the experimental tests and runs were designed based on an L9 matrix. The end milling process with several output characteristics is solved using grey relational analysis. The results of the analysis of variance (ANOVA) showed that the major influencing parameters on the multi-objective response w
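The grey relational analysis step, which collapses several output characteristics into a single grade per experimental run, can be sketched as follows. This follows the commonly described formulation (normalize each response, compute grey relational coefficients against the ideal, average them with a distinguishing coefficient zeta = 0.5); the paper's exact normalization choices and responses are not stated in the abstract and are assumed here.

```python
def grey_relational_grades(responses, larger_better, zeta=0.5):
    """Grey relational grade for each experimental run.

    responses     : list of runs, each a list of measured response values
                    (each response column must not be constant).
    larger_better : per-response flag, True = maximize, False = minimize.
    """
    n_resp = len(responses[0])
    norm = [list(run) for run in responses]
    # Normalize each response column to [0, 1] so the ideal value is 1.
    for j in range(n_resp):
        col = [run[j] for run in responses]
        lo, hi = min(col), max(col)
        for run in norm:
            run[j] = ((run[j] - lo) if larger_better[j] else (hi - run[j])) / (hi - lo)
    # Deviation from the ideal is |1 - v|; after normalization the global
    # minimum and maximum deviations are 0 and 1, so the grey relational
    # coefficient reduces to zeta / (|1 - v| + zeta).
    grades = []
    for run in norm:
        coeffs = [zeta / (abs(1.0 - v) + zeta) for v in run]
        grades.append(sum(coeffs) / n_resp)   # grade = mean coefficient
    return grades
```

The run with the highest grade is the best compromise across all responses, and ANOVA on the grades then identifies which cutting parameters dominate the multi-objective response.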