A database is an organized and distributed collection of data that allows users to access the stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
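The abstract does not include the implementation; purely as a minimal sketch of the MapReduce pattern it evaluates, the following hypothetical Hadoop Streaming job shows how EEG samples distributed across a cluster could be mapped and reduced. The file name, the `channel,amplitude` input format, and the per-channel mean statistic are all assumptions, not the authors' code.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming sketch: per-channel mean EEG amplitude.

Assumed input (one sample per line): <channel>,<amplitude>
Illustrative invocation (paths are placeholders):
  hadoop jar hadoop-streaming.jar \
    -input /eeg/raw -output /eeg/means \
    -mapper "python3 eeg_mr.py map" -reducer "python3 eeg_mr.py reduce"
"""
import sys

def mapper():
    # Emit tab-separated (channel, amplitude) pairs, as Hadoop Streaming expects.
    for line in sys.stdin:
        try:
            channel, amplitude = line.strip().split(",")
            float(amplitude)  # validate the numeric field
            print(f"{channel}\t{amplitude}")
        except ValueError:
            continue  # skip malformed records

def reducer():
    # Hadoop delivers pairs sorted by key; keep a running sum per channel.
    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        channel, amplitude = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count:.4f}")
            current, total, count = channel, 0.0, 0
        total += float(amplitude)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count:.4f}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```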
The research aimed to achieve several objectives involving two variables: the impacting factors and the aggregate planning alternatives for the workforce at Al-Yarmouk Educational Hospital. The research started from a problem focused on finding solutions to fluctuations in demand and limitations in capacity, while the study's importance emerges from diagnosing the suitable strategy and adopting the suitable alternatives, given their importance in meeting the demand for the health service provided by the hospital. The study is based on assumptions about the connecting relationship and the impact among the mentioned variables in the surgery and internal diseases departments. The research is dependent on ch…
Cloud computing is a solid buzzword in the industry. It is a paradigm in which services can be leveraged over the internet, reducing the cost and complexity of service provision. Cloud computing promises to cut capital and operational costs and, more specifically, to let IT departments focus on strategic projects instead of keeping datacenters running; it is delivered entirely over the internet. There are several consequences of this arrangement. For instance, the hosting company, rather than the buyer, attends to the infrastructure. This implies that they take custody of the servers, they carry out software updates, and under the contract the user pays…
Disease diagnosis with computer-aided methods has been extensively studied and applied in the diagnosis and monitoring of several chronic diseases. Early detection and risk assessment of breast diseases based on clinical data help doctors make an early diagnosis and monitor disease progression. The purpose of this study is to exploit a Convolutional Neural Network (CNN) in discriminating breast MRI scans into pathological and healthy. The study proposes a fully automated and efficient deep feature extraction algorithm that exploits the spatial information obtained from both T2W-TSE and STIR MRI sequences to discriminate between pathological and healthy breast MRI scans. The breast MRI scans are preprocessed prior to the feature…
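The network itself is not detailed in the abstract; the PyTorch sketch below is only illustrative of the described idea. Stacking the T2W-TSE and STIR slices as two input channels, every layer size, and the binary head are assumptions rather than the authors' architecture.

```python
import torch
import torch.nn as nn

class BreastMRIFeatureNet(nn.Module):
    """Illustrative CNN: two input channels (assumed T2W-TSE + STIR slices),
    a convolutional feature extractor, and a binary pathological/healthy head."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # pool spatial maps to a 64-d vector
        )
        self.classifier = nn.Linear(64, 2)    # logits: healthy vs. pathological

    def forward(self, x):
        feats = self.features(x).flatten(1)   # (batch, 64) deep feature vector
        return self.classifier(feats)

# Usage on a dummy batch: 4 slices, 2 stacked sequences, 128x128 pixels.
model = BreastMRIFeatureNet()
logits = model(torch.randn(4, 2, 128, 128))
print(logits.shape)  # torch.Size([4, 2])
```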
Noor oil field is one of the smallest fields in Missan province. Twelve wells penetrate the Mishrif Formation in the Noor field, and eight of them were selected for this study. The Mishrif Formation is one of the most important reservoirs in the Noor field; it consists of one anticlinal dome and is bounded by the Khasib Formation at the top and the Rumaila Formation at the bottom. The reservoir was divided into eight units separated by isolating units, according to the partitioning adopted in surrounding fields.
In this paper, frequency-distribution histograms of porosity, permeability, and water saturation were plotted for the MA unit of the Mishrif Formation in the Noor field, and then transformed to a normal distribution by applying the Box-Cox transformation algorithm…
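As a short illustration of that transformation step (synthetic values, not the Noor field data), SciPy can fit the Box-Cox power parameter λ by maximum likelihood and return the normalized sample:

```python
import numpy as np
from scipy import stats

# Synthetic right-skewed "porosity" sample (illustrative only; real values
# would come from the MA-unit well logs).
rng = np.random.default_rng(42)
porosity = rng.lognormal(mean=-2.0, sigma=0.5, size=500)

# Box-Cox: y = (x**lam - 1)/lam for lam != 0, and ln(x) for lam == 0.
# stats.boxcox estimates lam by maximum likelihood; inputs must be positive.
transformed, lam = stats.boxcox(porosity)

print(f"fitted lambda = {lam:.3f}")
print(f"skewness before = {stats.skew(porosity):.3f}, "
      f"after = {stats.skew(transformed):.3f}")  # near 0 => closer to normal
```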
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have little or inadequate data with which to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically; ultimately, a larger amount of data generates a better DL model, and its performance is also application-dependent. This issue is the main barrier for…
This research tries to reveal how to manage and control the competitive edge of a business by building managerial skills at various organizational levels. The research aims to identify the nature of the technical, human, and intellectual skills of a new president whose superiority lies in his competitiveness in the application field at the General Company for Constructional Industries, and to test the surveyed minor and major changes through a questionnaire used to collect information from officials. The sample was composed of (45) directors. The data was analyzed using several statistical methods and programs, most prominently (SPSS), which was used to extract the arithmetic mean, standard deviation, and correlation coefficient…
The physical and elastic characteristics of rocks determine rock strength in general. Rock strength is frequently assessed using porosity well logs such as neutron and sonic logs. The essential criteria for estimating rock-mechanics parameters in petroleum engineering research are uniaxial compressive strength and the elastic modulus. Indirect estimation from well-log data is necessary to measure these variables. This study attempts to create a single regression model that can accurately forecast rock-mechanics characteristics for the Harth Carbonate Formation in the Fauqi oil field. According to the findings of this study, petrophysical parameters are reliable indexes for determining rock mechanical properties, having good performance p…
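The regression form is not specified in the abstract; as a hedged sketch of the indirect-estimation idea only, the snippet below fits a single least-squares model predicting uniaxial compressive strength (UCS) from sonic transit time. The choice of predictor and all numbers are synthetic assumptions, not data from the Fauqi field.

```python
import numpy as np

# Synthetic example: sonic transit time DT (us/ft) vs. measured UCS (MPa).
# In practice these would come from well logs and core tests.
dt  = np.array([55.0, 60.0, 65.0, 70.0, 75.0, 80.0, 85.0])
ucs = np.array([95.0, 82.0, 70.0, 61.0, 52.0, 45.0, 39.0])

# Single-variable least-squares fit: UCS ~ a*DT + b.
a, b = np.polyfit(dt, ucs, deg=1)
pred = a * dt + b

# Coefficient of determination as a quick performance check.
ss_res = np.sum((ucs - pred) ** 2)
ss_tot = np.sum((ucs - ucs.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"UCS = {a:.2f}*DT + {b:.2f}, R^2 = {r2:.3f}")
```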