A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
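The Map-Reduce pattern the abstract relies on can be sketched in a few lines: a map phase emits key–value pairs and a reduce phase aggregates each key's group independently, which is what lets Hadoop distribute the work. The channel names and amplitude records below are hypothetical stand-ins for EEG data, not values from the study; this is an in-memory sketch, not a Hadoop job.

```python
from collections import defaultdict

# Hypothetical EEG records: (channel, amplitude). On Hadoop these would be
# split across nodes; here they sit in one list for illustration.
records = [("Fp1", 0.42), ("Fp1", 0.38), ("Cz", 0.91), ("Cz", 0.87), ("O2", 0.15)]

def map_phase(record):
    channel, amplitude = record
    # Map: emit one (key, value) pair per input record.
    return (channel, amplitude)

def reduce_phase(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    # Reduce: each key's group is aggregated independently -- here a mean,
    # so the work parallelises naturally across reducers.
    return {k: sum(v) / len(v) for k, v in grouped.items()}

means = reduce_phase(map_phase(r) for r in records)
print(means)
```

The per-channel means come out as {'Fp1': 0.40, 'Cz': 0.89, 'O2': 0.15}; on a real cluster the framework handles the shuffle between the two phases.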
In this research, a set of gray-texture images from the Brodatz database was studied by building a feature database for the images using the gray-level co-occurrence matrix (GLCM), with a pixel distance of one unit and four angles (0°, 45°, 90°, 135°). A k-means classifier was used to partition the images into groups of classes, from two up to eight, for every angle used in the co-occurrence matrix. The distributions of images across classes were compared pairwise (projecting one class onto another); the distribution of images was uneven, with one class dominating. The classification results were studied for all cases using the confusion matrix between every …
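The GLCM construction at distance 1 and angle 0° can be sketched directly: count how often gray level i occurs with gray level j as its immediate horizontal neighbour, normalise to a joint probability table, and derive texture features from it. The 4×4 image below is a made-up example, not a Brodatz texture, and contrast is just one of the features a full pipeline would feed to k-means.

```python
import numpy as np

# Tiny illustrative image with 4 gray levels (not a Brodatz texture).
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 2, 2, 2],
                  [2, 2, 3, 3]])
levels = 4

# GLCM for distance 1, angle 0 degrees: pair each pixel with its
# right-hand neighbour and count the (i, j) co-occurrences.
glcm = np.zeros((levels, levels), dtype=int)
for row in image:
    for i, j in zip(row[:-1], row[1:]):
        glcm[i, j] += 1

# Normalise to joint probabilities, then compute one Haralick feature.
p = glcm / glcm.sum()
contrast = sum(p[i, j] * (i - j) ** 2 for i in range(levels) for j in range(levels))
print(glcm)
print(round(contrast, 4))
```

For the other three angles the neighbour offset changes (e.g. one row down for 90°); stacking the features from all four angles gives the vectors the k-means classifier operates on.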
In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image-processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach could be obtained. The compression tests were carried out in the strength-of-materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests.
A light-section measuring microscope (BK 70x50) was used to calculate the surface parameters of the texture profile, such as the standard deviation of asperity peak heights and the centre line …
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
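The recursive estimation idea behind the KF/DLM procedures can be shown with the simplest case: a scalar random-walk level model, where each new observation updates the posterior mean and variance in one predict–update cycle. The noise variances q and r and the observation values below are illustrative choices, not parameters from the study.

```python
# Scalar Kalman filter for a random-walk level model (the simplest DLM).
# q = state (evolution) noise variance, r = observation noise variance;
# both are illustrative values, as are the observations passed in.
def kalman_filter(observations, q=0.1, r=1.0, m0=0.0, c0=1.0):
    m, c = m0, c0                      # posterior mean and variance
    estimates = []
    for y in observations:
        # Predict: the level drifts, so prior variance grows by q.
        c_pred = c + q
        # Update: blend the prediction with the new observation.
        k = c_pred / (c_pred + r)      # Kalman gain in [0, 1]
        m = m + k * (y - m)
        c = (1 - k) * c_pred
        estimates.append(m)
    return estimates

est = kalman_filter([1.0, 1.2, 0.9, 1.1])
print([round(e, 3) for e in est])
```

Because only (m, c) are carried forward, the filter is fully recursive: each step costs O(1) regardless of how many observations have been processed, which is exactly what makes these procedures attractive for prediction with correlated data.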
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, which makes it worthwhile to develop algorithms that exploit network capacity as fully as possible. It is equally important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same data set, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
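The abstract's module embeds encryption inside the compressor itself; as a rough approximation of the idea, the sketch below merely chains the two stages, compressing with zlib and then XOR-ing with a SHA-256-derived keystream. The keystream construction is an illustrative toy, not a vetted cipher, and none of it is the paper's actual scheme. Note the order matters: compression must come first, because encrypted output looks random and no longer compresses.

```python
import zlib
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy counter-mode keystream from SHA-256 -- for illustration only.
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(plaintext)      # shrink first...
    ks = keystream(key, len(compressed))
    return bytes(c ^ k for c, k in zip(compressed, ks))   # ...then mask

def decrypt_decompress(ciphertext: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(ciphertext))
    return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, ks)))

msg = b"to be or not to be, that is the question " * 10
ct = compress_encrypt(msg, b"secret-key")
assert decrypt_decompress(ct, b"secret-key") == msg
print(len(msg), len(ct))   # ciphertext is shorter, thanks to compression
```

A genuinely combined scheme, as pursued in the paper, would instead make the entropy coder's internal state key-dependent so that a single pass produces compressed, unintelligible output.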
Developing and researching antenna designs is analogous to excavating an undiscovered mine. This paper proposes a multi-band antenna with a new hexagonal-ring shape, theoretically designed, developed, and analyzed using CST before being manufactured. The antenna underwent six revisions to achieve the best performance. The surface current distribution and the electric field distribution on the surface of the hexagonal patch were theoretically analyzed and studied. The sequential approach taken to determine the most effective design is logical and prevents deviation from the work direction. After comparing the six theoretical results, the fifth model proved to be the best for fabricating a prototype. Measured results rep…
This study examined fresh fish, namely Bigeye (Ilisha megaloptera), Nematalosa nasus, Suboor (Hilsa ilisha), and Carp (Cyprinus carpio), purchased from local markets in Basrah. Oil was extracted by a solvent-extraction method at low temperature, and the oil yields obtained were about 6.08%, 10.72%, 13.52%, and 5.61% for Bigeye, Jaffout, Suboor, and Carp, respectively. The crude oils were compared with a vegetable oil (olive oil) and an animal fat (mutton tail fat).
The oil extracted from the whole fresh fish, together with the comparison oils, was introduced into a pharmacological system by packing it into capsules with and without garlic extract; this system was analysed with chemical tests.
Results were analyzed statistically using the SPSS program with a completely randomized design (CRD).
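A CRD with one treatment factor is analysed by a one-way ANOVA, and the F statistic SPSS reports can be computed by hand as a check. The three groups of oil-yield replicates below are made-up illustrative numbers, not the study's measurements.

```python
# One-way ANOVA for a completely randomized design (CRD).
# Groups are treatments; values within a group are replicates.
def one_way_anova(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Treatment (between-group) sum of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Error (within-group) sum of squares.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Hypothetical oil-yield replicates (%) for three fish species.
f, df1, df2 = one_way_anova([[6.0, 6.2, 6.1],
                             [10.6, 10.8, 10.7],
                             [13.4, 13.6, 13.5]])
print(round(f, 1), df1, df2)
```

A large F relative to the F(df1, df2) critical value indicates that at least one treatment mean differs, which is the hypothesis the CRD analysis tests.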