Everyone is connected through social media (Facebook, Twitter, LinkedIn, Instagram, etc.), which generates a large quantity of data that traditional applications are inadequate to process. Social media are regarded as an important platform through which many subscribers share information, opinions, and knowledge. These attributes of big data also raise many issues, such as data collection, storage, movement, updating, review, posting, scanning, visualization, and data protection. To deal with all these problems, there is a need for an adequate system that not only prepares the data but also provides meaningful analysis, enabling advantage to be taken of difficult situations relevant to business, decision-making, health, social media, science, telecommunications, the environment, and so on. Through a review of previous studies, the authors note that various analyses, such as real-time sentiment analysis, have been performed with Hadoop and its tools. However, dealing with such big data is a challenging task, and this type of analysis is feasible and efficient only through the Hadoop ecosystem. The purpose of this paper is to survey the literature on analyzing big social media data with the Hadoop framework, covering most of the analysis tools available under the Hadoop umbrella, their orientations, the difficulties they face, and modern methods for overcoming the challenges of big data in both offline and real-time processing. Real-time analytics accelerates decision-making while providing access to business metrics and reporting. A comparison between Hadoop and Spark is also presented.
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyze, via a ciphertext-only attack, a nonlinear stream cipher cryptosystem based on Linear Feedback Shift Register (LFSR) units. A Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used to attack a nonlinear cryptosystem called the "shrinking generator", using different lengths of ciphertext and different lengths of the combined LFSRs. Both GA and ACO performed well in recovering the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points, which may be f
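The shrinking generator combines two LFSRs: a selector LFSR decides which output bits of the data LFSR are kept in the keystream. The following is a minimal sketch of that construction; the register lengths, tap positions, and initial states are arbitrary illustrative choices, not those attacked in the paper.

```python
from typing import List

def lfsr_step(state: List[int], taps: List[int]) -> int:
    """Advance the LFSR one step and return its output bit."""
    out = state[-1]                 # output is the last stage
    fb = 0
    for t in taps:                  # feedback = XOR of tapped stages
        fb ^= state[t]
    state.insert(0, fb)             # shift the feedback bit in
    state.pop()
    return out

def shrinking_generator(a_state, a_taps, s_state, s_taps, nbits):
    """Keystream: keep the data LFSR A's bit only when selector S outputs 1."""
    ks = []
    while len(ks) < nbits:
        a_bit = lfsr_step(a_state, a_taps)
        s_bit = lfsr_step(s_state, s_taps)
        if s_bit == 1:              # "shrink": discard a_bit when s_bit == 0
            ks.append(a_bit)
    return ks
```

A ciphertext-only attack such as the GA/ACO search described above would score candidate initial states by how well the resulting keystream decrypts the ciphertext into plausible plaintext.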
Capacity is the ability of an organization to create value, and this ability depends on a wide variety of resources; a lack of balance between available resources and production-capacity requirements leads to idle or excess capacity, or to a capacity deficit.
Accordingly, the research deals with different concepts and alternative ma
In this research, kernel estimation methods (nonparametric density estimators) were used to estimate the two-response logistic regression. A comparison was made between the Nadaraya-Watson method and the local scoring algorithm, and the optimal smoothing parameter λ was estimated by cross-validation and generalized cross-validation. The optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to adjust the observations so that estimators can be obtained with characteristics close to those of the real parameters. The study is based on medical data for patients with chro
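As a generic illustration of the Nadaraya-Watson estimator and leave-one-out cross-validation for choosing the bandwidth λ (a sketch with a Gaussian kernel and synthetic one-dimensional data, not the authors' medical data or exact procedure):

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """m(x) = sum_i K((x - x_i)/h) * y_i  /  sum_i K((x - x_i)/h)."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = gaussian_kernel(u)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

def loo_cv_score(x, y, bandwidth):
    """Leave-one-out cross-validation MSE, minimized over candidate bandwidths."""
    n = len(x)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i          # drop observation i
        pred = nadaraya_watson(x[mask], y[mask], x[i:i + 1], bandwidth)[0]
        errs.append((y[i] - pred) ** 2)
    return float(np.mean(errs))
```

In practice one evaluates `loo_cv_score` over a grid of bandwidths and keeps the minimizer, which is the cross-validation selection step the abstract refers to.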
A skip list data structure is essentially a randomized simulation of a balanced binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. The search procedure in a skip list is more involved than in a regular sorted linked list, but runs faster in expectation. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes with four pointers each. The search, insert, and delete operations each take expected time O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
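A minimal skip list sketch with probabilistic levels follows; the four-pointer two-dimensional node layout described above is simplified here to forward pointers only, and MAX_LEVEL and the level-promotion probability of 1/2 are conventional choices, not the paper's parameters.

```python
import random

MAX_LEVEL = 8

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    def __init__(self):
        self.head = Node(None, MAX_LEVEL)     # sentinel head spans all levels
        self.level = 0

    def _random_level(self):
        """Promote a node to the next level with probability 1/2."""
        lvl = 0
        while random.random() < 0.5 and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):   # descend level by level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):   # record rightmost node per level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):              # splice into each level's list
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Because each node appears on level i with probability 2^-i, a search skips roughly half the remaining list at each descent, giving the expected O(log n) time quoted above.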
Bacterial meningitis is a leading cause of illness and death worldwide. Prompt identification of the meningitis-causing agent is crucial for clinical and public health care as well as disease control. Between June 2021 and February 2022, a total of 100 cerebrospinal fluid (CSF) and blood samples were collected from suspected cases of meningitis admitted to Raparin Paediatric Teaching Hospital, Erbil, Iraq. Cytochemical, cultural, and biochemical tests were conducted and confirmed by molecular techniques. Bacterial cultures were positive in 7% of CSF samples and in just one blood sample. The most common pathogens identified by cultural characteristics and the VITEK 2 Compact System were Staphylococcus sciuri in two
Image compression is a form of data compression applied to digital images in order to reduce the high cost of their storage and/or transmission. Image compression algorithms can exploit the visual sensitivity and statistical properties of image data to deliver superior results compared with generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are first converted into a string and then encoded with a lossless entropy-coding algorithm known as arithmetic coding. Pixel values that occur more frequently are coded in fewer bits than pixel values that occur less frequently.
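As a toy illustration of arithmetic coding applied to one small block of pixel values, the sketch below uses exact fractions to sidestep precision issues; it is didactic, not the paper's implementation, and real coders use integer renormalization instead.

```python
from fractions import Fraction
from collections import Counter

def build_model(data):
    """Assign each symbol a sub-interval of [0, 1) proportional to its frequency."""
    freq = Counter(data)
    total = sum(freq.values())
    intervals, low = {}, Fraction(0)
    for sym in sorted(freq):
        width = Fraction(freq[sym], total)
        intervals[sym] = (low, low + width)
        low += width
    return intervals

def arithmetic_encode(data, intervals):
    """Narrow [low, high) by each symbol's sub-interval; return a point inside."""
    low, high = Fraction(0), Fraction(1)
    for sym in data:
        s_low, s_high = intervals[sym]
        span = high - low
        low, high = low + span * s_low, low + span * s_high
    return (low + high) / 2

def arithmetic_decode(code, intervals, length):
    """Invert the encoding by repeatedly locating and rescaling the interval."""
    out = []
    for _ in range(length):
        for sym, (s_low, s_high) in intervals.items():
            if s_low <= code < s_high:
                out.append(sym)
                code = (code - s_low) / (s_high - s_low)
                break
    return out
```

Frequent symbols get wide sub-intervals, so they shrink the working interval less and therefore cost fewer bits, which is exactly the property the abstract describes.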
Since its invention by the ancient Romans and its further development during the mid-18th century, concrete, as both structure and finish, has been considered the most powerful, practical, and economical construction material for meeting a building's architectural and aesthetic requirements. Pioneering architects used concrete widely to shape their innovative designs and buildings, creating unique architectural forms.
The pre-mixed ultra-high-performance concrete manufactured by Lafarge.
Transparent concrete and cement, which allow light beams to pass through them, introduce remarkably well-lit architectural spaces within the same structural criteria. This product is a recyclable, sustainab
Regression analysis is used to study and predict the surface response by means of design of experiments (DOE), as well as to calculate roughness through a developed mathematical model. In this study, response surface methodology and the particular solution technique are used. Design of experiments applies a structured statistical analytic approach to investigate the relationship between parameters and their responses. Surface roughness is one of the important response parameters. It was also found that cutting speed has only a small effect on surface roughness. This work focuses on all the considerations needed to model the interaction between the parameters (position of influenc
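A second-order polynomial fitted by least squares is the standard model in response surface methodology. The sketch below is generic; the factor names and synthetic data are illustrative, not the paper's experimental variables.

```python
import numpy as np

def design_matrix(x1, x2):
    """Second-order RSM model: 1, x1, x2, x1^2, x2^2, x1*x2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def fit_quadratic_surface(x1, x2, y):
    """Least-squares estimate of the six model coefficients."""
    coef, *_ = np.linalg.lstsq(design_matrix(x1, x2), y, rcond=None)
    return coef

def predict(coef, x1, x2):
    return design_matrix(x1, x2) @ coef
```

Given measured roughness values at DOE factor settings, the fitted coefficients quantify each parameter's main, quadratic, and interaction effects on the response.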
Economic analysis plays a pivotal role in managerial decision-making processes. This analysis is predicated on deeply understanding economic forces and market factors influencing corporate strategies and decisions. This paper delves into the role of economic data analysis in managing small and medium-sized enterprises (SMEs) to make strategic decisions and enhance performance. The study underscores the significance of this approach and its impact on corporate outcomes. The research analyzes annual reports from three companies: Al-Mahfaza for Mobile and Internet Financial Payment and Settlement Services Company Limited, Al-Arab for Electronic Payment Company, and Iraq Electronic Gateway for Financial Services Company. The paper concl
Purpose: This study aimed to compare the stability and marginal bone loss of implants inserted with flapped and flapless approaches 8 weeks after surgery and 3 months after loading. Material and Methods: Thirty SLActive implants were inserted in 11 patients and early loaded with the final restoration 8 weeks after the healing period. Stability values were determined by Osstell, and marginal bone loss was measured by CBCT at the initial time (1st), at 8 weeks of the healing period (2nd), and at 3 months after loading (3rd). Results: The overall survival rate was 100%. There was a significant increase in the 3rd implant stability value in patients aged < 40, and a significant decrease in the 2nd implant stability value in both genders and in the traumatic zone with the flapless app