In the last two decades, networks have changed in response to rapidly changing requirements. Current Data Center Networks (DCNs) host large numbers of machines (tens of thousands) with demanding bandwidth needs as cloud networking and multimedia content computing grow. Conventional DCNs are strained by the increasing numbers of users and the bandwidth they require, which in turn exposes many implementation limitations. In current networking devices, the coupling of the control and forwarding planes results in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes. With the rapid increase in the number of applications, websites, and storage demands, some network resources are left underutilized by static routing mechanisms. To overcome these limitations, an SDN-based OpenFlow Data Center network architecture is used to obtain better performance parameters and to implement a traffic load-balancing function. Load balancing distributes traffic requests over the connected servers to diminish network congestion and reduce server underutilization. As a result, SDN affords more effective configuration, enhanced performance, and more flexibility for dealing with large network designs.
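To make the load-balancing idea concrete, the following minimal Python sketch (not the paper's implementation; the server addresses and request stream are hypothetical) shows a round-robin policy of the kind an OpenFlow controller could apply when spreading incoming requests over connected servers.

    from itertools import cycle

    # Hypothetical pool of servers behind one virtual service address.
    servers = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]

    def round_robin(server_pool):
        """Yield the next server for each incoming request, cycling evenly."""
        for server in cycle(server_pool):
            yield server

    picker = round_robin(servers)

    # Simulate ten incoming client requests and count how they are spread.
    assignments = {}
    for _ in range(10):
        chosen = next(picker)
        assignments[chosen] = assignments.get(chosen, 0) + 1

    print(assignments)  # e.g. {'10.0.0.11': 4, '10.0.0.12': 3, '10.0.0.13': 3}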
This paper delves into some significant performance measures (PMs) of a bulk arrival queueing system with constant batch size b, where the arrival rates and service rates are fuzzy parameters. The bulk arrival queueing system treats arrivals as entering the system in groups of constant size before individual customers are admitted to service. This leads to a new tool, obtained with the aid of generating function methods, that makes the corresponding traditional bulk queueing model more usable in an uncertain environment. The α-cut approach is applied with Zadeh's extension principle (ZEP) to transform the queues with triangular membership functions (Mem. Fs) into a family of conventional b
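To illustrate the standard α-cut construction on which such an analysis rests (the notation here is generic, not taken from the paper), triangular fuzzy arrival and service rates are cut at level α and the extension principle is applied to any crisp performance measure P:

\[
\tilde{\lambda}_\alpha = \big[\lambda_1 + \alpha(\lambda_2-\lambda_1),\; \lambda_3 - \alpha(\lambda_3-\lambda_2)\big],
\qquad
\tilde{\mu}_\alpha = \big[\mu_1 + \alpha(\mu_2-\mu_1),\; \mu_3 - \alpha(\mu_3-\mu_2)\big],
\qquad \alpha \in [0,1],
\]
\[
P_\alpha = \Big[\min_{\lambda\in\tilde{\lambda}_\alpha,\ \mu\in\tilde{\mu}_\alpha} P(\lambda,\mu),\;
            \max_{\lambda\in\tilde{\lambda}_\alpha,\ \mu\in\tilde{\mu}_\alpha} P(\lambda,\mu)\Big],
\]

so each fuzzy performance measure is recovered as a nested family of crisp intervals indexed by α.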
Shear wave velocity is an important feature in seismic exploration that can be utilized in reservoir development strategy and characterization. Its applications in petrophysics, seismic analysis, and geomechanics for predicting rock elastic and inelastic properties are essential for assessing stability and fracturing orientation and for identifying matrix minerals and gas-bearing formations. However, shear wave velocity is usually obtained from core analysis, which is an expensive and time-consuming process, and the dipole sonic imager tool is not commonly available in all wells. In this study, a statistical method is presented to predict shear wave velocity from wireline log data. The model concentrates on predicting shear wave velocity fr
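As a minimal sketch of the kind of statistical fit involved (the sample values and resulting coefficients below are hypothetical, not the correlation reported in the study), a least-squares line can predict shear wave velocity from compressional wave velocity taken from wireline logs:

    import numpy as np

    # Hypothetical paired log samples: compressional (Vp) and shear (Vs) velocity in km/s.
    vp = np.array([3.2, 3.6, 4.0, 4.4, 4.8, 5.2])
    vs = np.array([1.6, 1.9, 2.2, 2.5, 2.9, 3.2])

    # Ordinary least-squares fit of the linear model Vs = a * Vp + b.
    a, b = np.polyfit(vp, vs, deg=1)

    # Predict Vs where only Vp is logged, and report the in-sample fit quality.
    vs_hat = a * vp + b
    r2 = 1.0 - np.sum((vs - vs_hat) ** 2) / np.sum((vs - vs.mean()) ** 2)
    print(f"Vs ~ {a:.3f} * Vp + {b:.3f} (km/s), R^2 = {r2:.3f}")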
Today, large amounts of geospatial data are available on the web, such as Google Maps (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data-collection methods; therefore, data accuracy may not meet user requirements across organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad / Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was asses
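As an illustration of how positional accuracy can be assessed (the paper's tool was written in Visual Basic; this Python fragment only sketches the computation, and the coordinate pairs are hypothetical), the root-mean-square error of the offsets between GM points and the matching reference points can be computed as follows:

    import math

    # Hypothetical checkpoint pairs: (easting, northing) of the same feature in the
    # Google Maps data and in the reference Mayoralty of Baghdad data, in metres.
    gm_points  = [(445120.3, 3685210.7), (445300.9, 3685400.2), (445510.4, 3685605.8)]
    ref_points = [(445123.1, 3685209.5), (445298.4, 3685403.0), (445513.9, 3685602.6)]

    def positional_rmse(observed, reference):
        """Root-mean-square error of planimetric offsets between matched points."""
        sq_errors = [
            (xo - xr) ** 2 + (yo - yr) ** 2
            for (xo, yo), (xr, yr) in zip(observed, reference)
        ]
        return math.sqrt(sum(sq_errors) / len(sq_errors))

    print(f"Planimetric RMSE: {positional_rmse(gm_points, ref_points):.2f} m")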
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have only small or inadequate datasets for training DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data generally produces a better DL model, and performance is also application dependent. This issue is the main barrier for
The world is currently witnessing huge developments that give organizations and administrative units the opportunity to use information and communication technology and to adopt it in administrative work, owing to its importance in performing work with greater efficiency and speed and in easing communication with all individuals and companies through various Internet-based means of communication. The research therefore studied electronic systems designed and adopted for creating or building a database for archiving data, which is the main approach used in organizations and administrative units in developed countries. This system works to convert documents and manual processes and t
The research aimed at measuring the compatibility of Big Data with the organizational ambidexterity dimensions of the Asia Cell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the Big Data triple as an approach to achieving organizational ambidexterity.
The study adopted the descriptive analytical approach to collect and analyze data gathered with a questionnaire developed on the Likert scale. After a comprehensive review of the literature related to the two basic study dimensions, the data were subjected to many statistical treatments in accordance with res
There is an implicit but fundamental assumption in the theory behind regression on the time series used in estimation, namely that the time series is stationary or, in the language of Engle and Granger, integrated of order zero, denoted I(0). It is well known, for example, that the tables of the t-statistic are designed primarily to deal with the results of regressions that use stationary series. This assumption was treated as an axiom until the mid-seventies, when researchers conducted applied studies without taking into account the properties of the time series used prior to estimation, accepting the significance of these test results and the capabilities based on the applicability of the theo
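A minimal sketch of the kind of pre-estimation check the passage argues for (the series is simulated; the study's own data and tests are not reproduced here): an augmented Dickey-Fuller test for a unit root, where failing to reject indicates the series is not I(0) and should not be treated as stationary in a regression.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(0)

    # Simulated random walk: a textbook example of a non-stationary, I(1) series.
    random_walk = np.cumsum(rng.normal(size=500))

    adf_stat, p_value, *_ = adfuller(random_walk)
    print(f"ADF statistic = {adf_stat:.3f}, p-value = {p_value:.3f}")
    # A large p-value means the unit-root hypothesis is not rejected, so running a
    # levels regression on this series risks spurious results.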
A fast-moving infrared excess source (G2), which is widely interpreted as a core-less gas and dust cloud, approaches Sagittarius A* (Sgr A*) on a presumably elliptical orbit. VLT