Big data of many types, such as text and images, is rapidly generated by the internet and other applications. Handling such data with traditional methods is impractical because it varies widely in size, type, and processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, since it extracts only the meaningful information from large volumes of raw data. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. It also discusses how the revolution in data analytics driven by artificial intelligence algorithms may bring improvements to many applications. In addition, critical challenges and open research issues are identified from the limitations of the published papers, to help researchers distinguish between the various analytics techniques and develop highly consistent, logical, and information-rich analyses based on valuable features. Finally, the findings of this paper may be used to identify the best methods in each sector covered by these publications, assist future researchers in conducting more systematic and comprehensive analyses, and identify areas where a unique or hybrid data-analysis technique could be developed.
Cloud computing represents one of the most important shifts in computing and information technology (IT). However, security and privacy remain the main obstacles to its widespread adoption. In this research we review the security and privacy challenges that affect critical data in cloud computing and identify the solutions used to address these challenges. The questions that need answers concern: (a) user access management, (b) protecting the privacy of sensitive data, and (c) identity anonymity to protect the identity of the user and the data file. To answer these questions, a systematic literature review was conducted, together with structured interviews with several security experts working on cloud computing security, to investigate the main objectives of propo
Crime is an unlawful activity of all kinds and is punished by law. Crime affects a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data in order to bring down the crime rate. Such analysis encourages the police and the public to take the required measures and restrict crimes more effectively. The purpose of this research is to develop predictive models that can aid crime-pattern analysis and thus support the Boston police department's crime-prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether traveling to a specific area or livin
With the revolutionary expansion of the Internet, worldwide information growth increases the application of communication technology, and the rapid growth in data volume drives the requirement for secure, robust, and trustworthy techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with the circular-queue data structure. The two substitution techniques are the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher; they are merged in a single circular queue with four different keys for each of them, which produces eight different outputs for
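The abstract above is truncated, so the paper's exact scheme is not fully specified; as a minimal hedged sketch, the general idea of cycling substitution keys through a circular queue can be illustrated with a polyalphabetic shift cipher. The key values and alphabet below are illustrative assumptions, not the paper's actual parameters.

```python
from collections import deque

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encrypt(plaintext: str, keys: list) -> str:
    """Polyalphabetic substitution: each letter is shifted by the key at the
    front of a circular queue, and the used key is rotated to the back."""
    queue = deque(keys)
    cipher = []
    for ch in plaintext.upper():
        if ch in ALPHABET:
            shift = queue[0]
            cipher.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
            queue.rotate(-1)  # circular queue: move the used key to the back
        else:
            cipher.append(ch)  # pass non-letters through unchanged
    return "".join(cipher)

def decrypt(ciphertext: str, keys: list) -> str:
    """Inverse operation: subtract the same key sequence."""
    queue = deque(keys)
    plain = []
    for ch in ciphertext.upper():
        if ch in ALPHABET:
            shift = queue[0]
            plain.append(ALPHABET[(ALPHABET.index(ch) - shift) % 26])
            queue.rotate(-1)
        else:
            plain.append(ch)
    return "".join(plain)
```

A homophonic layer (mapping one plaintext letter to several ciphertext symbols) could be composed with this queue in the same way; it is omitted here for brevity.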
Since the Internet has become more widely used and more people have access to multimedia content, copyright hacking and piracy have risen. Watermarking techniques make security, asset protection, and authentication possible. In this paper, a comparison between fragile and robust watermarking techniques is presented to show how recent studies can use them to increase the security of critical media. A new technique is suggested in which an embedded value (129) is added to each pixel of the cover image and used as a key to thwart attackers, increase security, raise imperceptibility, and make the system faster at detecting tampering by unauthorized users. Using the two watermarking ty
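The embedding step described above can be sketched as follows. This is a hedged, minimal reading of the abstract, not the paper's full method: the fixed value 129 is added to each pixel modulo 256, and tampering is flagged wherever subtracting it back no longer recovers the original pixel. The function names and the 2-D list representation of the image are illustrative assumptions.

```python
EMBED_VALUE = 129  # the fixed embedded value named in the abstract

def embed(image):
    """Add the embedded value to every 8-bit pixel of the cover image."""
    return [[(p + EMBED_VALUE) % 256 for p in row] for row in image]

def detect_tamper(watermarked, original):
    """Return coordinates of pixels that no longer match the cover image
    once the embedded value is subtracted back out."""
    tampered = []
    for i, row in enumerate(watermarked):
        for j, p in enumerate(row):
            if (p - EMBED_VALUE) % 256 != original[i][j]:
                tampered.append((i, j))
    return tampered
```

Because the check is per-pixel arithmetic, localization of a tampered region is a single pass over the image, which is consistent with the speed claim in the abstract.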
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
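As a hedged illustration of the Downhill Simplex (Nelder-Mead) approach named above, the sketch below fits a plain two-parameter Weibull distribution by numerically maximizing its log-likelihood; the four-parameter exponential Weibull-Poisson model of the paper is substituted with this simpler stand-in, and the sample data are made up.

```python
import math
from scipy.optimize import minimize

def neg_log_likelihood(params, data):
    """Negative Weibull log-likelihood (shape k, scale lam) to be minimized."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return math.inf  # keep the simplex inside the feasible region
    return -sum(
        math.log(k / lam)
        + (k - 1) * math.log(x / lam)
        - (x / lam) ** k
        for x in data
    )

data = [0.8, 1.1, 1.9, 2.3, 0.5, 1.4, 2.0, 1.7]  # illustrative sample
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(data,),
                  method="Nelder-Mead")  # the Downhill Simplex algorithm
shape_hat, scale_hat = result.x
```

Nelder-Mead needs no derivatives, which is one reason it is attractive for likelihoods of compound distributions whose gradients are awkward to derive.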
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, taking rock samples from the reservoir and performing suitable laboratory studies is required to get these crucial reservoir properties. Despite the fact that kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat
Vegetation cover is one of the most significant elements influencing weather, climate, and the environment. The Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Built-up Index (NDBI) over the years 2019–2022 were estimated from four Landsat 8 TIRS images covering Duhok City. Using the radiative transfer model, the city's land surface temperature (LST) over these four years was calculated. The aim of this study is to compute the LST for the years 2019–2022, to understand the link between LST, NDVI, and NDBI, and to assess the mapping capability of Landsat 8 TIRS. The findings revealed that the NDBI and the NDVI had the strongest correlation with the
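The two indices named above have standard definitions; as a minimal sketch, for Landsat 8 OLI the red, near-infrared, and shortwave-infrared reflectances come from bands B4, B5, and B6 respectively. Scalar inputs are used here for brevity, where in practice each value is a full raster band.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def ndbi(swir1: float, nir: float) -> float:
    """Normalized Difference Built-up Index: (SWIR1 - NIR) / (SWIR1 + NIR)."""
    return (swir1 - nir) / (swir1 + nir)
```

Both indices fall in [-1, 1]; vegetated pixels push NDVI positive and NDBI negative, while built-up surfaces do the reverse, which is why the two are compared against LST.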
Hartha Formation is an overburden horizon in the X-oilfield that generates considerable Non-Productive Time (NPT) associated with drilling-mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.
The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia
The purpose of this paper is to apply different transportation models, in both their minimum and maximum forms, by finding a starting basic feasible solution and then the optimal solution. The requirements of transportation models are presented with one of their applications in the case of minimizing the objective function, using real data collected by the researcher over one month in 2015 at a poultry farm for egg production.
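One common way to obtain the starting basic feasible solution mentioned above is the north-west corner rule, sketched here on an illustrative balanced 3x3 supply/demand instance; the numbers are made up for demonstration and are not the paper's poultry-farm data, and the paper does not state which starting method it uses.

```python
def northwest_corner(supply, demand):
    """Build a starting basic feasible solution for a balanced transportation
    problem by allocating greedily from the top-left cell."""
    supply, demand = supply[:], demand[:]  # work on copies
    allocation = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])  # ship as much as both sides allow
        allocation[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1  # source exhausted: move down a row
        else:
            j += 1  # destination satisfied: move right a column
    return allocation

alloc = northwest_corner([20, 30, 25], [10, 35, 30])
```

The resulting allocation satisfies every supply and demand exactly; an optimality method such as MODI or stepping-stone would then improve it toward the minimum-cost solution.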