Hazardous materials, heavy metals, and organic toxins released into the environment have caused considerable harm to microbes, plants, animals, and humans. Wastewater is one of the most contaminated ecosystems because of heavy metals emitted mostly by human activity. Bioremediation of wastewater is an ecologically acceptable and cost-effective method of removing heavy metals from sewage; the general purpose of this study is to assess the reliability of anaerobic sludge biomass in removing sulfur compounds and heavy metals from wastewater. The anaerobic sludge biomass evaluated in this work was taken from a wastewater treatment plant (WWTP) in Al-Rustumiya, Baghdad, and grown in a mineral medium for anaerobic growth. In serum bottles …
The hydrometallurgical method was used for platinum and palladium leaching with an aqua regia solution (3HCl:HNO3). The leaching experiments were designed to obtain the optimum conditions using the Taguchi method, with 16 experiments over three factors (time, temperature, and solid-to-liquid ratio), each at four levels. In this study, leaching of a powdered catalytic-converter sample containing platinum and palladium was carried out on the basis of the formation of platinum and palladium chloro complexes (PtCl₆²⁻, PdCl₄²⁻) at different concentrations in the acidic solution. The optimum conditions …
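As a rough illustration of the kind of design described above, the following Python sketch builds a 16-run, three-factor, four-level orthogonal array with a simple Latin-square construction. The factor names and level values (time, temperature, solid-to-liquid ratio) are placeholder assumptions, not the settings used in the study.

```python
"""Sketch of a Taguchi-style L16 design for three four-level factors.

For run (i, j), the third column is (i + j) mod 4, which yields an
orthogonal array of 16 runs in which every pair of levels of any two
factors appears exactly once.
"""

# Hypothetical level settings; replace with the experimental values.
LEVELS = {
    "time_min": [30, 60, 90, 120],
    "temp_C": [25, 40, 60, 80],
    "solid_liquid_ratio": [0.05, 0.10, 0.15, 0.20],
}

def l16_design(levels):
    """Return 16 runs, each a dict of factor name -> level value."""
    names = list(levels)
    runs = []
    for i in range(4):
        for j in range(4):
            idx = (i, j, (i + j) % 4)   # orthogonal three-column assignment
            runs.append({n: levels[n][k] for n, k in zip(names, idx)})
    return runs

if __name__ == "__main__":
    for run_no, run in enumerate(l16_design(LEVELS), start=1):
        print(run_no, run)
```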
Evolutionary algorithms (EAs), as global search methods, have proved more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters. These components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust EA …
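To make the canonical components concrete, here is a minimal Python sketch of a label-based EA over a toy PPI adjacency list, with tournament selection, uniform crossover, random-reset mutation, and a simple intra-cluster density fitness. The network, parameters, and fitness function are all illustrative assumptions, not the authors' algorithm.

```python
"""Toy canonical EA for grouping proteins in a small PPI network."""
import random

# Hypothetical PPI network as an edge list over 7 proteins.
EDGES = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5), (5, 6)]
N_PROTEINS = 7
MAX_CLUSTERS = 3

def fitness(labels):
    """Sum of intra-cluster edge densities (a simple topological score)."""
    score = 0.0
    for c in set(labels):
        members = [i for i, l in enumerate(labels) if l == c]
        if len(members) < 2:
            continue
        possible = len(members) * (len(members) - 1) / 2
        internal = sum(1 for u, v in EDGES if labels[u] == c and labels[v] == c)
        score += internal / possible
    return score

def tournament(pop, k=2):
    """Binary tournament selection."""
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    """Uniform crossover on the cluster-label vectors."""
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.1):
    """Random-reset mutation of individual labels."""
    return [random.randrange(MAX_CLUSTERS) if random.random() < rate else g
            for g in ind]

def evolve(pop_size=30, generations=50):
    pop = [[random.randrange(MAX_CLUSTERS) for _ in range(N_PROTEINS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(pop_size)]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best labelling:", best, "fitness:", round(fitness(best), 3))
```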
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that lie outside the behavior of any cluster are treated as noise or anomalies. In this way, DBSCAN can detect abnormal points that fall beyond a given distance threshold (extreme values). However, not all anomalies are of this kind, i.e., unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is still considered abnormal with respect to the known group. The analysis showed that DBSCAN using the …
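For reference, a minimal scikit-learn sketch of the baseline behavior described above: points that DBSCAN labels -1 are the ones treated as noise/anomalies. The synthetic data, eps, and min_samples values are assumptions chosen for illustration and do not reflect the CFG extension proposed in the abstract.

```python
"""Minimal DBSCAN-based anomaly flagging with scikit-learn."""
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# A dense cluster plus a few far-away points that should come out as noise.
cluster = rng.normal(loc=0.0, scale=0.5, size=(100, 2))
outliers = np.array([[5.0, 5.0], [-6.0, 4.0], [7.0, -5.0]])
X = np.vstack([cluster, outliers])

labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]          # -1 marks points DBSCAN calls noise

print(f"{len(anomalies)} points flagged as anomalies out of {len(X)}")
```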
Most recent studies have focused on using modern intelligent techniques, especially those developed in the Intruder Detection Module (IDS). Such techniques are built on modern artificial-intelligence-based modules that act like a human brain; thus, they should have the ability to learn and to recognize what they have learned. The importance of developing such systems arose from the requests of customers and establishments to protect their property and avoid damage by intruders. This is provided by an intelligent module that raises the correct alarm. Thus, an interior visual intruder detection module based on Multi-Connect Architecture Associative Memory (MCA) …
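The abstract does not detail the Multi-Connect Architecture itself, so the following is only a generic stand-in: a minimal bipolar Hebbian auto-associative memory in Python that stores a few patterns in a weight matrix and recovers a corrupted probe with one synchronous update. The patterns, sizes, and update rule are illustrative assumptions, not the MCA variant used in the paper.

```python
"""Generic bipolar Hebbian auto-associative memory (not MCA itself)."""
import numpy as np

def store(patterns):
    """Hebbian outer-product storage; patterns are +/-1 row vectors."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def recall(W, probe):
    """One synchronous update: sign of the weighted sum."""
    out = np.sign(W @ probe)
    out[out == 0] = 1
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    patterns = rng.choice([-1, 1], size=(3, 64))   # 3 stored "images"
    W = store(patterns)

    noisy = patterns[0].copy()
    flip = rng.choice(64, size=6, replace=False)   # corrupt 6 of 64 bits
    noisy[flip] *= -1

    recovered = recall(W, noisy)
    print("matches stored pattern:", np.array_equal(recovered, patterns[0]))
```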