In this paper, RBF-based multistage auto-encoders are used to detect attacks for intrusion detection systems (IDS). RBF networks have numerous applications in real-life settings. The proposed technique consists of two parts: a multistage auto-encoder and an RBF network. The multistage auto-encoder is used to select the most sensitive features from the input data. The features selected by the multistage auto-encoder are fed as input to the RBF network, which is trained to classify the input data into two labels: attack or no attack. The experiment was carried out in MATLAB 2018 on a training set of 175,341 cases, each with 42 features, and validated on 82,332 cases. To the authors' knowledge, this is the first time the approach has been applied to detecting IDS attacks; it achieves 98.80% accuracy when validated on the UNSW-NB15 dataset. The experimental results show that the proposed method compares favourably with previous results in this field.
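For illustration, a minimal Python/NumPy sketch of an RBF-network classification stage of the kind described above is given here. It is not the authors' MATLAB 2018 implementation, the multistage auto-encoder feature selection is not reproduced, and the number of centres, the kernel width heuristic and the label convention (1 = attack, 0 = no attack) are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import KMeans

    def train_rbf_classifier(X, y, n_centers=20, gamma=None):
        """Fit a simple RBF network: K-means centres, Gaussian activations,
        and linear output weights obtained by least squares."""
        km = KMeans(n_clusters=n_centers, n_init=10, random_state=0).fit(X)
        centers = km.cluster_centers_
        if gamma is None:
            # heuristic kernel width from the average inter-centre distance
            d = np.linalg.norm(centers[:, None] - centers[None, :], axis=-1)
            gamma = 1.0 / (2.0 * d.mean() ** 2 + 1e-12)
        Phi = np.exp(-gamma * ((X[:, None] - centers[None, :]) ** 2).sum(-1))
        W, *_ = np.linalg.lstsq(Phi, y.astype(float), rcond=None)
        return centers, gamma, W

    def predict_rbf(X, centers, gamma, W, threshold=0.5):
        """Return 1 (attack) or 0 (no attack) for each row of X."""
        Phi = np.exp(-gamma * ((X[:, None] - centers[None, :]) ** 2).sum(-1))
        return (Phi @ W >= threshold).astype(int)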
Cold plasma is a relatively low-temperature gas, which makes it suitable for treating thermally sensitive materials, including polymers and biological tissues. In this research, a non-thermal argon plasma needle system with diameters of 3 mm and 10 mm was designed to operate at atmospheric pressure and to be suitable for medical and biotechnological applications.
The thermal behaviour of this system was studied. We observed the effect of the plasma needle diameter on the plasma: when the plasma needle slot is increased, the plasma temperature decreases. We also observed the effect of the applied voltage on the plasma temperature, where the temperature increases with increasing applied voltage. Results showed t
Most facilities suffer from the high cost of inventory, which raises the cost of the product and thus affects many administrative decisions; they also suffer from shortcomings in the systems developed for inventory control. This problem is exacerbated in the construction sector, where inventory is carried over from one year to the next in the form of construction work in progress, making it difficult to control its cost effectively. The research problem can be stated as follows: what are the implications of using a just-in-time inventory system within the accounting system of a contracting company for controlling the cost of inventory, and what is the optimal approach to inventory control? The research assumes
Various speech enhancement algorithms (SEA) have been developed over the last few decades. Each algorithm has its advantages and disadvantages because the speech signal is affected by different environmental conditions. Distortion of speech results in the loss of important features, making the signal difficult to understand. SEA aims to improve the intelligibility and quality of speech that has been degraded by different types of noise. In most applications, quality improvement is highly desirable as it can reduce listener fatigue, especially when the listener is exposed to high noise levels for extended periods (e.g., in manufacturing). SEA reduces or suppresses the background noise to some degree, and such methods are sometimes called noise suppression algorithms.
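As a concrete illustration of one simple noise-suppression approach, the sketch below applies basic spectral subtraction in Python. This is a generic textbook method, not an algorithm taken from the abstract above; the frame length, overlap, spectral floor, and the assumption that the first fraction of a second is speech-free are all illustrative choices.

    import numpy as np
    from scipy.signal import stft, istft

    def spectral_subtraction(x, fs, noise_seconds=0.25, floor=0.02):
        """Estimate the noise magnitude spectrum from the first `noise_seconds`
        of the signal (assumed speech-free), subtract it from every frame,
        and resynthesise using the noisy phase."""
        f, t, X = stft(x, fs=fs, nperseg=512, noverlap=384)
        mag, phase = np.abs(X), np.angle(X)
        hop = 512 - 384
        n_noise_frames = max(1, int(noise_seconds * fs / hop))
        noise_mag = mag[:, :n_noise_frames].mean(axis=1, keepdims=True)
        clean_mag = np.maximum(mag - noise_mag, floor * noise_mag)  # spectral floor
        _, y = istft(clean_mag * np.exp(1j * phase), fs=fs, nperseg=512, noverlap=384)
        return y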
Type 2 diabetes mellitus (T2DM) is a global concern, boosted by both population growth and ageing; the majority of affected people are aged between 40 and 59 years. The objective of this research was to estimate the impact of age and gender on glycaemic control parameters: fasting blood glucose (FBG), glycated haemoglobin (HbA1c), insulin, insulin resistance (IR) and insulin sensitivity (IS); renal function parameters: urea and creatinine; and oxidative stress parameters: total antioxidant capacity (TAC) and reactive oxygen species (ROS). Eighty-one random samples of T2DM patients (35 men and 46 women) were included in this study; their average age was 52.75 ± 9.63 years. The current study found that FBG, HbA1c and IR were highly significantly (P < 0.01) increased
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is a family of TCP techniques that use historical execution data to prioritize test cases. The assignment of equal priority to several test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To resolve such ties in regression testing, most researchers resort to random ordering of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
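For reference, the APFD metric mentioned above has a standard closed form, APFD = 1 - (TF1 + ... + TFm)/(n·m) + 1/(2n), where TFi is the 1-based position of the first test case that detects fault i, n is the number of test cases, and m is the number of faults. A small Python helper (the function and variable names are hypothetical, written here purely for illustration) could compute it as follows:

    def apfd(test_order, faults_detected_by):
        """APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n).
        test_order         -- list of test-case ids in prioritized order
        faults_detected_by -- dict: test-case id -> set of fault ids it detects"""
        n = len(test_order)
        all_faults = set().union(*faults_detected_by.values())
        m = len(all_faults)
        position = {tc: i + 1 for i, tc in enumerate(test_order)}
        tf_sum = sum(
            min(position[tc] for tc, fs in faults_detected_by.items() if fault in fs)
            for fault in all_faults
        )
        return 1 - tf_sum / (n * m) + 1 / (2 * n)

    # example: 5 test cases, 3 faults
    order = ["t3", "t1", "t5", "t2", "t4"]
    detects = {"t1": {"f1"}, "t2": {"f2"}, "t3": {"f1", "f3"}, "t4": set(), "t5": {"f2"}}
    print(apfd(order, detects))   # 0.766...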
Operating managements in contemporary business facilities are moving quickly toward redefining the processes and strategies needed to perform their tasks, so as to guarantee their continuity in a performance environment dominated by economic globalization and conditions of uncertainty, and toward creating a new structure that seeks to improve profitability and sustainable growth in performance, in a climate that focuses on developing institutional processes, reducing costs, and achieving customer satisfaction by meeting customers' constantly changing demands and expectations. The research presents a structural performance matrix that incorporates the Six Sigma methodology in order to improve customer satisfaction significantly bet
In this work a fragile watermarking scheme is presented. The scheme is applied to digital color images in the spatial domain. The image is divided into blocks, and each block has its own authentication mark embedded in it, so we are able to determine which parts of the image are authentic and which parts have been modified. This authentication is carried out without the need for the original image. The results show that the quality of the watermarked image remains very good and that the watermark survives some types of unintended modification, such as processing by familiar compression software like WINRAR and ZIP.
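Below is a minimal NumPy sketch of block-wise fragile watermarking in the spatial domain. It is a generic hash-in-LSB construction for a grayscale image, not the authors' exact embedding; the block size, the SHA-256 hash, and the single-channel assumption are all illustrative choices.

    import hashlib
    import numpy as np

    BLOCK = 8  # assumed block size

    def _block_mark(block):
        """Authentication mark: hash bits computed over the block
        with its least-significant bits cleared."""
        digest = hashlib.sha256((block & 0xFE).tobytes()).digest()
        bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
        return bits[: block.size]          # one bit per pixel in the block

    def embed(img):
        """Embed a fragile watermark into every BLOCK x BLOCK block (uint8 image)."""
        out = img.copy()
        h, w = img.shape
        for y in range(0, h - h % BLOCK, BLOCK):
            for x in range(0, w - w % BLOCK, BLOCK):
                blk = out[y:y + BLOCK, x:x + BLOCK]
                mark = _block_mark(blk).reshape(blk.shape)
                out[y:y + BLOCK, x:x + BLOCK] = (blk & 0xFE) | mark
        return out

    def verify(img):
        """Return a boolean map: True where a block's embedded mark still matches."""
        h, w = img.shape
        ok = np.zeros((h // BLOCK, w // BLOCK), dtype=bool)
        for y in range(0, h - h % BLOCK, BLOCK):
            for x in range(0, w - w % BLOCK, BLOCK):
                blk = img[y:y + BLOCK, x:x + BLOCK]
                expected = _block_mark(blk).reshape(blk.shape)
                ok[y // BLOCK, x // BLOCK] = np.array_equal(blk & 1, expected)
        return ok

Because lossless archivers such as RAR and ZIP do not change pixel values, a watermark embedded this way survives them, while any local pixel modification breaks the mark of the affected block only.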
Authentication of text document images is difficult because of their binary nature and the clear separation between background and foreground, yet it is in increasing demand for many applications. Most previous research in this field depends on inserting a watermark into the document; the drawback of these techniques is that changing pixel values in a binary document can introduce irregularities that are visually very noticeable. In this paper, a new method is proposed for object-based text document authentication, in which a text document is signed by shifting individual words slightly left or right from their original positions so that the center of gravity of each line falls in with the m
This article presents a comprehensive study of edge detection methods and algorithms in digital images, a basic process in the field of image processing and analysis. The purpose of edge detection is to discover the borders that separate different regions of an image, which contributes to a better understanding of the image content and to extracting structural information. The article starts by clarifying the idea of an edge and its importance in image analysis, then reviews the most prominent edge detection methods used in this field (e.g., the Sobel, Prewitt, and Canny filters), besides other schemes based on detecting abrupt changes in light intensity and color gradient. The research also discusses
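As a quick illustration of two of the filters named above, the sketch below applies Sobel and Canny edge detection with OpenCV; the input file name, kernel size, and Canny thresholds are illustrative assumptions.

    import cv2
    import numpy as np

    # load a grayscale image (the path is a placeholder)
    img = cv2.imread("input_image.png", cv2.IMREAD_GRAYSCALE)

    # Sobel: horizontal and vertical gradients combined into a magnitude map
    gx = cv2.Sobel(img, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(img, cv2.CV_64F, 0, 1, ksize=3)
    sobel_mag = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))

    # Canny: Gaussian smoothing, gradient, non-maximum suppression, hysteresis
    canny_edges = cv2.Canny(img, threshold1=100, threshold2=200)

    cv2.imwrite("edges_sobel.png", sobel_mag)
    cv2.imwrite("edges_canny.png", canny_edges)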
Cryptography is one way to transfer data securely from sender to receiver. To increase the level of data security, DNA coding was introduced into cryptography. DNA can easily be used to store and transfer data, and it has become an effective medium for such purposes and for implementing computation. A new cryptography system is proposed, consisting of two phases: the encryption phase and the decryption phase. The encryption phase includes six steps, starting by converting the plaintext to its equivalent ASCII values and converting them to binary values. After that, the binary values are converted to DNA characters and then converted to their equivalent complementary DNA
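A minimal Python sketch of the encoding steps described so far, i.e. plaintext → ASCII → binary → DNA characters → complementary DNA, is shown below. The 2-bit-to-nucleotide mapping is an assumption made for illustration, and the remaining steps of the proposed encryption phase are not reproduced.

    # assumed mapping of 2-bit groups to nucleotides
    BITS_TO_DNA = {"00": "A", "01": "C", "10": "G", "11": "T"}
    COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

    def text_to_complementary_dna(plaintext):
        """Plaintext -> ASCII -> 8-bit binary -> DNA -> complementary DNA."""
        binary = "".join(format(ord(ch), "08b") for ch in plaintext)        # ASCII to bits
        dna = "".join(BITS_TO_DNA[binary[i:i + 2]] for i in range(0, len(binary), 2))
        return "".join(COMPLEMENT[base] for base in dna)

    print(text_to_complementary_dna("Hi"))   # 'H' = 01001000, 'i' = 01101001 -> "GTCTGCCG"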