<p>In combinatorial testing, constructing covering arrays is a key challenge because of the many factors that influence it. A wide range of combinatorial problems can be solved with metaheuristic and greedy techniques, and combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic methods handle the tuples that may remain after the greedy strategy removes redundancy, and the metaheuristic then ensures that the result is near-optimal. Consequently, combining greedy and HC algorithms in a single test generation system is a strong candidate if constructed correctly. This study presents a hybrid greedy hill climbing algorithm (HGHC) that ensures both effectiveness and near-optimal results while generating a small number of test data. To confirm that the proposed HGHC outperforms the most widely used techniques in terms of test size, it is compared against them. In contrast to recent practices for producing covering arrays (CAs) and mixed covering arrays (MCAs), this hybrid strategy is superior: it provides the best outcome while reducing the size and limiting the loss of unique pairings during CA/MCA generation.</p>
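The hill-climbing half of such a hybrid can be illustrated with a minimal pairwise-coverage sketch. This is an illustrative toy, not the paper's HGHC: the function names, the random-cell mutation scheme, and the accept-if-not-worse rule are all assumptions.

```python
import itertools
import random

def uncovered_pairs(tests, domains):
    """Count the value pairs (over all parameter pairs) missed by the test suite."""
    missing = 0
    for i, j in itertools.combinations(range(len(domains)), 2):
        needed = set(itertools.product(domains[i], domains[j]))
        seen = {(t[i], t[j]) for t in tests}
        missing += len(needed - seen)
    return missing

def hill_climb(tests, domains, iters=2000, seed=0):
    """Mutate one random cell at a time; keep the change if coverage does not worsen."""
    rng = random.Random(seed)
    tests = [list(t) for t in tests]          # work on a copy
    best = uncovered_pairs(tests, domains)
    for _ in range(iters):
        if best == 0:                         # full pairwise coverage reached
            break
        r = rng.randrange(len(tests))
        c = rng.randrange(len(domains))
        old = tests[r][c]
        tests[r][c] = rng.choice(domains[c])
        cost = uncovered_pairs(tests, domains)
        if cost <= best:                      # accept sideways moves to escape plateaus
            best = cost
        else:
            tests[r][c] = old                 # revert a worsening move
    return tests, best
```

A greedy stage would seed `tests` with rows chosen to cover many pairs at once; the climber then repairs whatever pairs the greedy pass left uncovered.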
Eichhornia, or water hyacinth, represents a serious threat to potable water basins. The problem manifests chiefly in the consumption of large amounts of water and of the dissolved oxygen necessary for aquatic life, and to a lesser extent in the obstruction of water streams. Although there have been limited trials to overcome this pest, none presents an economically and logically acceptable solution. Chlorine is a well-known biocide that is broadly used in the water industry, and it could offer a possible method to fight this weed. To investigate this, a concentration-time plot should be introduced, as for any other microorganism, especially bacteria in water. In this work, various doses of chlorine along various time
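Concentration-time behavior of a disinfectant is commonly summarized by the Chick-Watson model. The sketch below uses that generic model with illustrative parameter values; the rate constants are assumptions, not results from this study.

```python
import math

def survival_fraction(k, conc, t, n=1.0):
    """Chick-Watson model: surviving fraction N/N0 = exp(-k * C**n * t)."""
    return math.exp(-k * (conc ** n) * t)

def ct_for_log_reduction(k, log10_reduction):
    """Ct product (e.g. mg*min/L) giving a target log10 reduction when n = 1."""
    # ln(N/N0) = -k * C * t  =>  C * t = log10_reduction * ln(10) / k
    return log10_reduction * math.log(10) / k
```

Plotting `survival_fraction` against `t` for several chlorine doses `conc` yields the kind of concentration-time curves the abstract refers to.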
In this work, two metaheuristic algorithms were hybridized. The first is the Invasive Weed Optimization (IWO) algorithm, a numerical stochastic optimization algorithm; the second is the Whale Optimization Algorithm (WOA), an algorithm based on swarm and community intelligence. IWO is a nature-inspired algorithm, modeled specifically on the colonizing behavior of weeds, and was first proposed in 2006 by Mehrabian and Lucas. Due to their strength and adaptability, weeds pose a serious threat to cultivated plants, making them a threat to the cultivation process. The behavior of these weeds has been simulated and used in the Invasive Weed Optimization algorithm.
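The core IWO loop (reproduction proportional to fitness, normally distributed seed dispersal with a shrinking step size, and competitive exclusion) can be sketched for a one-dimensional minimization problem. This is a generic sketch of standard IWO, not the hybrid IWO-WOA of this work; all parameter values are assumptions.

```python
import random

def iwo_minimize(f, lo, hi, iters=100, pop0=5, pop_max=15,
                 smin=1, smax=5, sigma0=1.0, sigma_f=0.01, seed=0):
    """Minimal 1-D Invasive Weed Optimization sketch (minimization)."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop0)]
    for it in range(iters):
        fits = [f(x) for x in pop]
        fbest, fworst = min(fits), max(fits)
        # dispersal step size shrinks nonlinearly as iterations progress
        sigma = sigma_f + (sigma0 - sigma_f) * ((iters - it) / iters) ** 2
        seeds = []
        for x, fx in zip(pop, fits):
            # fitter weeds spawn more seeds (linear ranking between smin and smax)
            if fworst == fbest:
                n = smax
            else:
                n = smin + int((smax - smin) * (fworst - fx) / (fworst - fbest))
            for _ in range(n):
                seeds.append(min(hi, max(lo, rng.gauss(x, sigma))))
        # competitive exclusion: only the fittest pop_max individuals survive
        pop = sorted(pop + seeds, key=f)[:pop_max]
    return min(pop, key=f)
```

A hybrid along the lines of the abstract would interleave such IWO reproduction steps with WOA's encircling/spiral position updates.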
Nowadays, it is quite usual to transmit data through the internet, making safe online communication essential. Transmitting data over internet channels requires maintaining its confidentiality and ensuring its integrity against unauthorized individuals. The two most common techniques for supplying security are cryptography and steganography. Cryptography converts data from a readable format into an unreadable one. Steganography is the technique of hiding sensitive information in digital media, including image, audio, and video. In the proposed system, both encryption and hiding techniques are utilized. This study presents encryption using the S-DES algorithm, which generates a new key in each cycle
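The hiding half of such a pipeline is often least-significant-bit (LSB) embedding, which can be sketched over a flat list of pixel values. This is a generic sketch: the paper's S-DES key schedule is not reproduced here, and the ciphertext would simply be the `message` bytes passed in.

```python
def embed_lsb(pixels, message):
    """Hide message bytes in the least-significant bits of pixel values."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover too small for message")
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b      # overwrite only the lowest bit
    return out

def extract_lsb(pixels, n_bytes):
    """Recover n_bytes from the LSBs of pixel values (MSB-first per byte)."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (pixels[i * 8 + j] & 1)
        out.append(byte)
    return bytes(out)
```

Because each pixel changes by at most 1, the stego image is visually indistinguishable from the cover.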
Honeywords are fake passwords that accompany the real password, which is called a “sugarword.” The honeyword system is an effective password-cracking detection system designed to detect password cracking easily, in order to improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system and successfully cracks the passwords while attempting to log in to users’ accounts, the honeyword system will detect the attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords.
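The scheme can be sketched in a few lines: the login server stores only the shuffled hash list, while the honeychecker alone knows which index is the sugarword. Function names and the decoy-generation strategy here are illustrative assumptions, not the paper's design.

```python
import hashlib
import secrets

def make_honeywords(real_password, decoys):
    """Mix the real password in among decoys; return (words, hashes, secret index)."""
    words = list(decoys) + [real_password]
    secrets.SystemRandom().shuffle(words)
    # the login server keeps only these hashes; it cannot tell which is real
    hashes = [hashlib.sha256(w.encode()).hexdigest() for w in words]
    return words, hashes, words.index(real_password)

def honeychecker_check(submitted_index, real_index):
    """Auxiliary-server check: True = genuine login; False = honeyword, raise an alarm."""
    return submitted_index == real_index
```

Only the index, never the password itself, crosses to the honeychecker, so compromising the main password file reveals nothing about which entry is real.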
In this paper, the botnet detection problem is framed as a feature selection problem, and the genetic algorithm (GA) is used to search the entire feature space for the most significant combination of features. The Decision Tree (DT) classifier serves as the objective function, directing the proposed GA toward the combination of features that correctly classifies activities into normal traffic and botnet attacks. Two datasets, the UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features.
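A GA over feature bitmasks can be sketched as follows. The fitness argument stands in for the paper's Decision Tree classifier (which would score a mask by cross-validated accuracy on the selected columns); the selection, crossover, and mutation settings below are generic assumptions.

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, gens=40,
                      p_mut=0.05, seed=0):
    """Genetic algorithm over feature bitmasks; fitness scores a 0/1 mask."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        pop = scored[: pop_size // 2]                    # truncation selection (elitist)
        while len(pop) < pop_size:
            a, b = rng.sample(pop[: pop_size // 2], 2)   # two elite parents
            cut = rng.randrange(1, n_features)
            child = a[:cut] + b[cut:]                    # one-point crossover
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            pop.append(child)
    return max(pop, key=fitness)
```

In the paper's setting, `fitness` would train a DT on the columns where the mask is 1 and return its classification accuracy on the evaluation dataset.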
This work presents a detailed design of a three-jointed, tendon-driven robot finger with a cam/pulley transmission and joint Variable Stiffness Actuators (VSAs). The finger motion configuration is obtained by deriving the cam/pulley transmission profile as a mathematical solution, which is then implemented to achieve contact-force isotropy on the phalanges. Three VSAs are designed to act as muscles in joint space to provide firm grasping. Following a mechatronic approach, a suitable type and number of force sensors and actuators are selected to sense touch, actuate the finger, and tune the VSAs. The torque of the VSAs is controlled by a designed Multi-Input Multi-Output (MIMO) fuzzy controller
Choosing antimicrobials is a common dilemma when the expected rate of bacterial resistance is high. Observed resistance values in unequal groups of isolates tested against different antimicrobials can be misleading, which can affect the decision to recommend one antibiotic over another. We analyzed recalled data with statistical consideration of the unequal sample groups. Data were collected on children suspected of having typhoid fever at Al Alwyia Pediatric Teaching Hospital in Baghdad, Iraq, over the study period from September 2021 to September 2022. A novel algorithm was developed to compare drug sensitivity among unequal numbers of Salmonella typhi (S. Typhi) isolates tested with different antibacterials.
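The statistical issue of comparing proportions drawn from unequal group sizes is classically handled with a pooled two-proportion z statistic, whose standard error accounts for both group sizes. The sketch below illustrates that standard test, not the study's own novel algorithm; the numbers in the usage note are invented.

```python
import math

def two_proportion_z(r1, n1, r2, n2):
    """z statistic comparing resistance proportions r1/n1 vs r2/n2 (unequal groups)."""
    p1, p2 = r1 / n1, r2 / n2
    p = (r1 + r2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # SE reflects both sample sizes
    return (p1 - p2) / se
```

For example, 30 resistant isolates of 100 versus 10 of 50 gives the same 10-point difference as 3/10 versus 2/10, but a much larger |z|, which is exactly why raw percentages from unequal groups mislead.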
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points outside the clusters' behavior are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set threshold from any cluster (extremes). However, not all anomalies are of that kind, abnormal and unusual or far from a specific group: there is also a type of data that does not occur repeatedly yet is considered abnormal for the known group. The analysis showed DBSCAN using the
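The DBSCAN behavior the abstract builds on, where points reachable within `eps` from a dense core form clusters and everything else is labeled noise, can be sketched for 2-D points. This is plain DBSCAN, not the proposed CFG-strengthened variant.

```python
def dbscan(points, eps, min_pts):
    """Label each 2-D point with a cluster id, or -1 for noise (candidate anomaly)."""
    def neighbors(i):
        px, py = points[i]
        return [j for j, (qx, qy) in enumerate(points)
                if (px - qx) ** 2 + (py - qy) ** 2 <= eps * eps]

    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                  # not dense enough: noise / anomaly
            continue
        labels[i] = cid                     # i is a core point; grow its cluster
        queue = list(nbrs)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid             # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = neighbors(j)
            if len(jn) >= min_pts:
                queue.extend(jn)            # j is also core: keep expanding
        cid += 1
    return labels
```

Points labeled -1 are exactly the "far from any cluster" anomalies the abstract describes; the proposed graph conversion targets anomalies that this distance-based rule misses.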
The present study aimed to investigate the possible production of Thioflavin T and the effect of NaCl concentrations and growth phases on the growth rate, doubling time, and proline content of C. saipanensis N. Hanagata (Scenedesmaceae, Sphaeropleales). The alga was cultured in BG-11 medium, and six NaCl concentrations were used in the experiments during different growth phases. The results unveiled the presence of Thioflavin T in the alga. The growth rate decreased at all NaCl concentrations except the control treatment, while the highest doubling time (14 days) was recorded at an NaCl concentration of 0.08 M. The highest proline value (0.509 mg L⁻¹) was recorded at the 0.08 M NaCl treatment and recorded
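Growth rate and doubling time are linked by the standard exponential-growth relations, which can be stated in two lines; the example numbers below are illustrative, not the study's measurements.

```python
import math

def specific_growth_rate(n1, n2, t1, t2):
    """Specific growth rate mu = ln(N2/N1) / (t2 - t1) for cell counts N1, N2."""
    return math.log(n2 / n1) / (t2 - t1)

def doubling_time(mu):
    """Doubling time td = ln(2) / mu; slower growth (small mu) means longer td."""
    return math.log(2) / mu
```

This inverse relation is why the abstract's lowest growth rates and its longest doubling time (14 days) occur together at elevated NaCl.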
This research attempts to find robust estimators for the Hotelling T² test when the data come from a multivariate normal distribution and the multivariate sample contains outliers. It also provides easily computed, high-breakdown-point, robust, consistent estimators of multivariate location and dispersion for multivariate analysis, using two types of robust estimators: the minimum covariance determinant (MCD) estimator and the reweighted MCD estimator.
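The MCD idea, estimating location and scatter from the h-point subset with the smallest covariance determinant, can be sketched exactly in one dimension, where the determinant reduces to the variance and the optimal h-subset is always contiguous in sorted order. The full multivariate MCD used in the paper generalizes this via C-step iterations; this 1-D sketch is only an illustration.

```python
def mcd_1d(data, h=None):
    """1-D MCD: mean and variance of the h-subset with the smallest variance."""
    xs = sorted(data)
    n = len(xs)
    if h is None:
        h = (n + 2) // 2                  # default: roughly half the sample
    best = None
    # the minimizing h-subset is contiguous in sorted order, so scan windows
    for i in range(n - h + 1):
        win = xs[i:i + h]
        m = sum(win) / h
        v = sum((x - m) ** 2 for x in win) / h
        if best is None or v < best[1]:
            best = (m, v)
    return best                           # (robust location, robust scale^2)
```

Because the estimate depends only on the cleanest half of the data, up to almost half the sample can be outliers without dragging the location estimate away, which is the high-breakdown property the abstract refers to.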