Recent research has shown that Deoxyribonucleic Acid (DNA), whose sequences can be used to discover diseases in the human body, can likewise serve as the basis of an intrusion-detection system (IDS) that detects attacks against computer systems and network traffic. Three main factors influence the accuracy of a DNA-sequence-based IDS: the DNA encoding method, the STR keys, and the classification method used to evaluate the proposed approach. The pioneering attempt to apply DNA sequences to intrusion detection used a normal signature sequence with an alignment threshold value; a later attempt used cryptography-based DNA encoding, but its detection rate was very low. Since network traffic consists of 41 attributes, we propose the smallest practical character count (at the same DNA length): a four-character DNA encoding that represents all 41 attributes, called DEM4all. Experiments were conducted on the standard KDDCup 99 and NSL-KDD datasets. The Teiresias algorithm is used to extract Short Tandem Repeats (STRs), including both the keys and their positions in the network traffic, while a brute-force algorithm is used as the classification step to determine whether the traffic is an attack or normal. The experiment was run 30 times for each DNA encoding method. The results show that the proposed method achieves better accuracy (a 15% improvement) than previous and state-of-the-art DNA algorithms. It can therefore be concluded that the proposed DEM4all DNA encoding method is well suited for IDS use. More complex encodings that further shorten the DNA sequence may yield even higher detection accuracy.
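As a minimal sketch of what a four-character encoding in the spirit of DEM4all might look like (the quantization scheme, attribute bounds, and digit width below are illustrative assumptions, not the paper's actual mapping), each numeric traffic attribute can be written as base-4 digits over the alphabet {A, C, G, T}:

```python
ALPHABET = "ACGT"

def encode_attribute(value, lo, hi, width=4):
    """Quantize one numeric attribute into `width` DNA characters.

    The value is clamped to [lo, hi] and mapped onto 4**width levels,
    then written as base-4 digits over {A, C, G, T}. (Illustrative
    binning only, not the paper's actual DEM4all mapping.)
    """
    span = hi - lo
    level = 0 if span == 0 else int((min(max(value, lo), hi) - lo) / span * (4**width - 1))
    digits = []
    for _ in range(width):
        digits.append(ALPHABET[level % 4])
        level //= 4
    return "".join(reversed(digits))

def encode_record(values, bounds):
    """Concatenate the encodings of all attributes of one traffic record."""
    return "".join(encode_attribute(v, lo, hi) for v, (lo, hi) in zip(values, bounds))
```

Applied to all 41 attributes, this produces one fixed-length DNA string per traffic record, from which repeats can then be mined.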
The research aims to clarify the negative effects of pollution on the environment and the increase in hazardous polluted materials discharged from these factories, and to establish effective procedures to limit environmental pollution.
The research problem stems from the assumption that the studied factory suffers from a lack of application of the international standard (ISO 14004). The research problem is expressed in the following questions:
- What is the level of the organization's thinking about the environmental system according to ISO 14004?
- What are the requirements applied in the studied factory?
This research deals with increasing the hardness and insulation of petroleum pipes against erosion under different environmental conditions. An epoxy base material was mixed with a ceramic nano-zirconia reinforcement material (35 nm particle size) at percentages of (0, 1, 2, 3, 4, 5)%. Sections of broken petroleum pipe were used as the painting substrate; they were cut into pieces of (2 cm × 2 cm) and 0.3 cm height. After the paint mixtures were prepared at each percentage, the samples were immersed in the paint. The micro-hardness was then checked according to the Vickers method, the thermal properties of the paint (thermal conductivity, thermal flux, and thermal diffusivity) were inspected, and the density of the painted samples was calculated.
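The thermal quantities inspected are linked by the standard definition of thermal diffusivity, α = k / (ρ·c_p). A one-line helper makes the relation concrete; the example inputs are typical handbook values for neat epoxy, not measurements from this study:

```python
def thermal_diffusivity(k, rho, cp):
    """alpha = k / (rho * cp): thermal conductivity over volumetric heat
    capacity, giving m^2/s when k is W/(m*K), rho kg/m^3, cp J/(kg*K)."""
    return k / (rho * cp)

# Typical handbook values for neat epoxy (NOT measurements from this study):
alpha = thermal_diffusivity(k=0.2, rho=1200.0, cp=1000.0)
```

This is why measuring conductivity and density together (with the specific heat) fixes the diffusivity of each painted sample.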
The aim of this paper is to design a PID controller based on an on-line-tuning bat optimization algorithm for the step-down DC/DC buck converter used in the battery operation of mobile applications. The bat optimization algorithm is utilized to obtain the optimal parameters of the PID controller as a simple and fast on-line tuning technique that yields the best control action for the system. The simulation results (using the MATLAB package) show the robustness and effectiveness of the proposed control system in obtaining a suitable voltage control action, with a smooth and unsaturated state of the buck converter input voltage of ( ) volt, that stabilizes the buck converter system.
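A minimal discrete PID loop of the kind whose gains (Kp, Ki, Kd) the bat algorithm would tune on-line can be sketched as follows; the first-order plant and the gain values are illustrative stand-ins, not the paper's buck converter model or its tuned parameters:

```python
# Hedged sketch: a textbook discrete PID controller; the plant and gains
# below are illustrative assumptions, not the paper's converter or results.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt              # accumulate integral term
        deriv = (err - self.prev_err) / self.dt     # backward-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a toy first-order plant x' = -x + u toward a 1.0 V setpoint.
pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=0.01)
x = 0.0
for _ in range(2000):
    u = pid.step(1.0, x)
    x += (-x + u) * 0.01
```

In the paper's scheme, an optimizer would repeatedly evaluate such a closed loop and adjust (kp, ki, kd) to minimize a tracking-error cost.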
In order to select the optimal algorithm for tracking the fast time variation of a multipath Rayleigh fading channel, this paper focuses on the recursive least-squares (RLS) and extended recursive least-squares (E-RLS) algorithms. Based on the simulation program's comparison of tracking performance and mean square error over five fast time-varying Rayleigh fading channels, repeated up to 100 send/receive times to confirm the efficiency of the algorithms, it concludes that E-RLS is the more feasible.
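The core RLS recursion that both compared algorithms build on can be sketched as follows (the channel length, tap values, and forgetting factor in the demo are illustrative assumptions, not the paper's simulation setup):

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One recursive least-squares step.

    w: current tap estimate (n x 1); P: inverse correlation matrix (n x n);
    x: input/reference vector (n x 1); d: desired sample; lam: forgetting
    factor. Returns the updated (w, P)."""
    Px = P @ x
    k = Px / (lam + (x.T @ Px).item())   # gain vector
    e = d - (w.T @ x).item()             # a priori error
    w = w + k * e
    P = (P - k @ Px.T) / lam
    return w, P

# Demo: track a fixed 2-tap "channel" w* = [1.0, -0.5] from noiseless samples.
rng = np.random.default_rng(0)
w_true = np.array([[1.0], [-0.5]])
w, P = np.zeros((2, 1)), np.eye(2) * 100.0
for _ in range(200):
    x = rng.standard_normal((2, 1))
    d = (w_true.T @ x).item()
    w, P = rls_update(w, P, x, d)
```

The forgetting factor lam < 1 is what lets the estimate follow a channel whose taps change over time, at the cost of higher steady-state variance.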
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and the contaminated data cases.
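The Downhill Simplex (Nelder-Mead) estimation workflow can be illustrated with SciPy; since the compound exponential Weibull-Poisson pdf is not reproduced here, a plain two-parameter Weibull stands in purely to show the mechanics:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Illustrative stand-in: a plain Weibull, NOT the paper's compound
# exponential Weibull-Poisson distribution.
rng = np.random.default_rng(1)
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

def neg_log_lik(theta):
    shape, scale = theta
    if shape <= 0 or scale <= 0:
        return np.inf                 # keep the simplex in the valid region
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
```

Nelder-Mead needs no derivatives of the likelihood, which is one reason simplex-type methods remain attractive for compound distributions whose score equations are awkward.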
In this paper, an exact stiffness matrix and fixed-end load vector are derived for nonprismatic beams with parabolically varying depth. The principle of strain energy is used in the derivation of the stiffness matrix.
The effects of both shear deformation and the coupling between axial force and bending moment are considered in the derivation of the stiffness matrix. The fixed-end load vector for elements under uniformly distributed or concentrated loads is also derived. The correctness of the derived matrices is verified by numerical examples. It is found that the coupling effect between axial force and bending moment is significant for elements having axial end restraint. It was also found that the decrease in bending moment was in the
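For orientation, the exact matrix derived in the paper reduces, when the depth is constant and shear deformation and axial-bending coupling are neglected, to the standard prismatic Euler-Bernoulli element stiffness matrix, sketched here (this is the textbook matrix, not the paper's nonprismatic result):

```python
import numpy as np

def beam_stiffness(E, I, L):
    """Standard 4x4 Euler-Bernoulli stiffness matrix for a PRISMATIC beam
    element (DOF order: v1, theta1, v2, theta2). Shown only for orientation;
    the paper's exact matrix additionally accounts for the parabolic depth
    variation, shear deformation, and axial-bending coupling."""
    k = E * I / L**3
    return k * np.array([
        [ 12.0,    6*L,  -12.0,    6*L],
        [  6*L, 4*L**2,   -6*L, 2*L**2],
        [-12.0,   -6*L,   12.0,   -6*L],
        [  6*L, 2*L**2,   -6*L, 4*L**2],
    ])
```

Any correct element stiffness matrix must be symmetric and must produce zero forces under rigid-body translation and rotation, which gives a quick sanity check on a derived matrix.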
Cloud computing is a mass platform for serving high-volume data from multiple devices and numerous technologies. Cloud tenants demand fast access to their data without any disruption; therefore, cloud providers struggle to ensure that every piece of data is secured and always accessible. An appropriate replication strategy capable of selecting essential data is thus required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the enhancement in replication performance. The obtained an
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. This creates a need to store huge amounts of data with large storage and high computational capability, and cloud computing can be used to store such big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a higher-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual resource.
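One simple form of dynamic workload distribution can be sketched as a greedy least-loaded assignment: each incoming message (whether delivered over MQTT or HTTP) goes to the node with the smallest current load. The policy, task costs, and node count here are hypothetical illustrations, not the paper's specific balancing scheme:

```python
import heapq

def assign_tasks(task_costs, n_nodes):
    """Greedy least-loaded assignment: send each task to the node with the
    smallest current load. Returns the per-node total loads, sorted."""
    heap = [(0.0, i) for i in range(n_nodes)]  # (current load, node id)
    heapq.heapify(heap)
    for cost in task_costs:
        load, node = heapq.heappop(heap)       # least-loaded node
        heapq.heappush(heap, (load + cost, node))
    return sorted(load for load, _ in heap)
```

The heap keeps each assignment O(log n) in the number of nodes, which matters when billions of device messages must be routed continuously.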
This study deals with finding a linking principle between order and disorder in construction, tracing these principles and the intellectual and philosophical concepts of form and content in the interior design of the training halls of the Ministry of Labor and Social Affairs, which leads to a design that responds objectively and functionally and increases its effect on its users, the coaches and trainees. The study also informs the cognitive and developmental sides of designing interior spaces. The goal of the research is therefore to detect the nature of order and disorder, focusing on indicators, the types of regulations and their privacy, and what constitutes the framework of knowledge in order to explai