The Tor (The Onion Router) network was designed to let users browse the Internet anonymously. It is valued for its anonymity and privacy protection against the many agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the layered encryption and decryption of Tor traffic: a client's traffic must be encrypted and decrypted before sending and after receiving, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal role in establishing secure communication and ensuring the integrity of cryptographic procedures, yet this essential process is time-consuming and adds further delay and discontinuity to the data flow. To overcome these delay and interruption problems, we utilized Software-Defined Networking (SDN), Machine Learning (ML), and Blockchain (BC) techniques, which help the Tor network intelligently speed up public-key exchange through proactive processing of Tor security-management information. The resulting combined network (ITor-SDN) preserves data-flow continuity for the Tor client. We simulated and emulated the proposed network using Mininet and the Shadow simulator. The analysis shows that the proposed network architecture improves the overall performance metrics by around 55%, an enhancement achieved through the seamless execution of the ITor-SDN network combination approach.
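The abstract's core idea, proactively distributing key material so a client never waits on an on-demand exchange, can be sketched in miniature. Everything below (`KeyCache`, `prefetch`, `lookup`, the directory snapshot) is illustrative and hypothetical, not part of Tor, SDN, or any real API:

```python
# Hypothetical sketch: a controller-side cache that proactively pulls
# relay public keys, so a client building a circuit resolves each hop's
# key locally instead of waiting on a per-hop network exchange.

class KeyCache:
    def __init__(self):
        self._keys = {}  # relay id -> public key bytes

    def prefetch(self, directory):
        """Proactively load every relay's public key from a directory snapshot."""
        for relay_id, pub_key in directory.items():
            self._keys[relay_id] = pub_key

    def lookup(self, relay_id):
        """O(1) local lookup replaces a network round trip per hop."""
        return self._keys.get(relay_id)

# Usage: the controller prefetches once; clients then resolve keys locally.
directory_snapshot = {"relay-A": b"pkA", "relay-B": b"pkB", "relay-C": b"pkC"}
cache = KeyCache()
cache.prefetch(directory_snapshot)
circuit = ["relay-A", "relay-B", "relay-C"]
keys = [cache.lookup(r) for r in circuit]
```

The point of the sketch is only the shift from reactive (per-circuit) to proactive (ahead-of-time) key handling, which is where the abstract locates its latency savings.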
Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of sparse distributions, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled
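The second-stage test the abstract describes, checking whether haplotype groups are unevenly distributed between cases and controls, is commonly done with a contingency-table chi-square test. A minimal sketch with synthetic counts (not the paper's data or its exact procedure):

```python
# Illustrative sketch: a Pearson chi-square test on a 2 x K table of
# haplotype-group counts in cases (row 1) vs controls (row 2).

def chi_square(case_counts, control_counts):
    """Pearson chi-square statistic for a 2 x K contingency table."""
    row1, row2 = sum(case_counts), sum(control_counts)
    total = row1 + row2
    stat = 0.0
    for j in range(len(case_counts)):
        col = case_counts[j] + control_counts[j]
        for observed, row in ((case_counts[j], row1), (control_counts[j], row2)):
            expected = row * col / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Synthetic counts for two haplotype groups; the second is enriched in cases.
cases    = [40, 60]
controls = [60, 40]
stat = chi_square(cases, controls)
```

A large statistic (compared against a chi-square distribution with K-1 degrees of freedom) flags the haplotype group as unevenly distributed, which is exactly the labeling step the abstract breaks off on.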
Measuring the efficiency of postgraduate and undergraduate programs is one of the essential elements of the educational process. In this study, the colleges of Baghdad University and data for the academic year 2011-2012 were chosen to measure the relative efficiencies of postgraduate and undergraduate programs in terms of their inputs and outputs. A relevant method for analysing these data is Data Envelopment Analysis (DEA). The effect of academic staff on the numbers of enrolled and alumni students in the postgraduate and undergraduate programs is the main focus of the study.
In this paper, we discuss physical layer security techniques in downlink networks that include eavesdroppers. The main objective of physical layer security is to deliver a perfectly secure message from a transmitter to an intended receiver in the presence of passive or active eavesdroppers who try to wiretap the information or disturb the network's stability. In downlink networks, the random nature of the channels to the terminals means that opportunistic user scheduling can be exploited as an additional tool for enhancing physical layer security. We introduce user scheduling strategies and discuss the corresponding performance under different levels of channel state information (CSI) at the base station (BS). We show that the avai
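Opportunistic user scheduling for secrecy can be sketched in a few lines: with full CSI at the BS, schedule the user whose instantaneous secrecy rate, log2(1 + SNR_user) - log2(1 + SNR_eavesdropper), is largest. The SNR values and user names below are synthetic, not taken from the paper:

```python
import math

# Toy sketch of secrecy-rate-based opportunistic scheduling.
# The secrecy rate is the main-channel rate minus the eavesdropper-channel
# rate, floored at zero (no secure communication is possible below that).

def secrecy_rate(snr_user, snr_eve):
    return max(0.0, math.log2(1 + snr_user) - math.log2(1 + snr_eve))

snr_eve = 3.0                                  # eavesdropper's SNR
user_snrs = {"u1": 7.0, "u2": 15.0, "u3": 1.0} # instantaneous user SNRs

# Schedule the user with the best instantaneous secrecy rate.
best = max(user_snrs, key=lambda u: secrecy_rate(user_snrs[u], snr_eve))
```

With partial or statistical CSI the BS cannot evaluate this quantity exactly, which is why the abstract compares strategies across different CSI levels.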
With the spread of global markets for modern technical education and the diversity of programs serving the local and global market for information and communication technology, universities have begun to race among themselves to build their academic reputations. In addition, they seek to enhance their technological development by building integrated IMT systems that offer security, the fastest possible response, rapid delivery of the required service, and reliable information, linking these systems to the network and using social-networking applications over wireless networks, which in turn drive the emerging economies of technical education. All of these facilities opened the way to expand the number of students and s
Unconfined compressive strength (UCS) is considered the most important rock strength property affecting rock failure criteria. Various studies have developed rock strength correlations for specific lithologies to estimate UCS with high accuracy when no core is available. Previous analyses did not account for a formation's numerous lithologies and interbedded layers. The main aim of the present study is to select a suitable correlation for predicting UCS over the whole depth of a formation without separating the lithologies. The second aim is to identify an adequate input parameter among a set of wireline logs for determining UCS, using data from three wells across ten formations (Tanuma, Khasib, Mishrif, Rumaila, Ahmady, Maudud, Nahr Um
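UCS correlations of the kind the abstract evaluates are typically regressions of core-measured UCS against a wireline log reading, for example compressional sonic transit time. A minimal ordinary least-squares fit on synthetic pairs (the numbers are illustrative, not the study's data):

```python
# Hedged illustration: fitting UCS (MPa) against a single wireline log
# input (here, sonic transit time in us/ft) with ordinary least squares.

def ols(x, y):
    """Return (slope, intercept) of the least-squares line y = slope*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic calibration pairs: slower sonic travel (softer rock) -> lower UCS.
dt  = [60.0, 70.0, 80.0, 90.0]   # sonic transit time, us/ft
ucs = [120.0, 100.0, 80.0, 60.0] # core-measured UCS, MPa
slope, intercept = ols(dt, ucs)
```

Selecting "an adequate input parameter" then amounts to comparing how well each candidate log (sonic, density, porosity, etc.) predicts UCS under such a fit.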
Precision is one of the main elements that control the quality of a geodetic network; it is defined as a measure of the network's efficiency in propagating random errors. This research aims to solve the zero-order design (ZOD) and first-order design (FOD) problems for a geodetic network using the Rosenbrock method, implemented in the MATLAB programming language, to find the optimal design of a geodetic network with high precision. The ZOD problem was applied to a case-study network consisting of 19 points and 58 designed distances with an a priori deviation of 5 mm, to determine the best points in the network to use as control points. The results showed that P55 and P73 have the minimum ellipses of error and are considered control points. The FOD problem was applie
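The Rosenbrock method named in the abstract is a derivative-free direct search. A simplified sketch in its spirit (axis-direction search with step shrinking, omitting Rosenbrock's coordinate-rotation step) shows the general mechanism; the objective here is a stand-in quadratic, not the paper's network precision criterion:

```python
# Simplified direct search in the spirit of Rosenbrock's method:
# try each coordinate direction, keep any improving move, and halve the
# step size whenever no direction improves the objective.

def coordinate_search(f, x, step=1.0, shrink=0.5, tol=1e-6, max_iter=10_000):
    x = list(x)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink      # refine the search scale
            if step < tol:
                break
    return x, fx

# Stand-in objective with known minimum at (1, 2).
sol, val = coordinate_search(lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2,
                             [0.0, 0.0])
```

In the FOD setting, `f` would instead score a candidate observation design by the resulting error ellipses, with the search adjusting the design parameters.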
Agent-based modeling is currently used extensively to analyze complex systems. Its growth has been supported by its ability to convey distinct levels of interaction in a complex, detailed environment. Meanwhile, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In most of them, however, only a discrete representation of the environment and a single level of interaction are provided, while two or three levels are rarely considered in agent-based models. The key issue is that modellers' work in these areas is not assisted by simulation plat
Advances in digital technology and the World Wide Web have led to an increase in digital documents used for various purposes such as publishing and digital libraries. This phenomenon raises awareness of the need for effective techniques that can help during the search and retrieval of text. One of the most needed tasks is clustering, which categorizes documents automatically into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends tightly on the choice of text representation method. Traditional methods represent documents as bags of words weighted by term frequency-inverse document frequency (TF-IDF). This method ignores the relationship an
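The bag-of-words TF-IDF representation the abstract refers to can be computed in a few lines. The documents and whitespace tokenization below are simplified for illustration:

```python
import math
from collections import Counter

# Minimal TF-IDF sketch: each document becomes a {term: weight} vector,
# where weight = (term frequency in the document) * log(N / document frequency).
# Terms appearing in every document get weight log(N/N) = 0, which is how
# TF-IDF down-weights uninformative words.

def tfidf(docs):
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter()                       # in how many documents each term occurs
    for tokens in tokenized:
        df.update(set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({t: (tf[t] / len(tokens)) * math.log(n / df[t])
                        for t in tf})
    return vectors

docs = ["data mining and machine learning",
        "text clustering groups documents",
        "machine learning for text clustering"]
vecs = tfidf(docs)
```

Because each vector treats terms independently, two documents using related but distinct words share no weight, which is precisely the limitation the abstract begins to state.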