Data preprocessing is an important step in web usage mining because of the nature of log data, which is heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of the pattern-discovery algorithms, a preprocessing step must be applied first. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and meticulously examined, with an emphasis on the sub-phases of data cleansing, user identification, and session identification.
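To make the three sub-phases concrete, the following is a minimal sketch of a log-preprocessing pipeline, assuming Common Log Format entries, user identification by client IP, and a 30-minute session timeout; the regular expression, field names, and helper functions are illustrative and not taken from the study.

    # Minimal web-log preprocessing sketch (not the study's implementation):
    # data cleansing -> user identification -> session identification.
    import re
    from datetime import datetime, timedelta

    LOG_RE = re.compile(r'(\S+) \S+ \S+ \[(.*?)\] "(?:GET|POST) (\S+) \S+" (\d{3})')
    TIMEOUT = timedelta(minutes=30)

    def parse_line(line):
        m = LOG_RE.match(line)
        if not m:
            return None
        ip, ts, url, status = m.groups()
        return {"ip": ip, "time": datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z"),
                "url": url, "status": int(status)}

    def preprocess(lines):
        # Data cleansing: drop unparsable lines, failed requests, and static assets.
        hits = [h for h in map(parse_line, lines)
                if h and h["status"] == 200
                and not h["url"].endswith((".gif", ".jpg", ".png", ".css", ".js"))]
        # User identification: here simply by client IP (a common simplification).
        users = {}
        for h in sorted(hits, key=lambda h: (h["ip"], h["time"])):
            users.setdefault(h["ip"], []).append(h)
        # Session identification: split a user's clickstream on long idle gaps.
        sessions = []
        for ip, clicks in users.items():
            current = [clicks[0]]
            for prev, cur in zip(clicks, clicks[1:]):
                if cur["time"] - prev["time"] > TIMEOUT:
                    sessions.append(current)
                    current = []
                current.append(cur)
            sessions.append(current)
        return sessions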
The paradigm and domain of data security is a key concern in the current era, in which data is transmitted over multiple channels from multiple sources. Data leakage and security loopholes are enormous, and there is a need to enforce higher levels of security, privacy, and integrity. The affected sectors include e-governance, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and trustworthiness is very prominent in both network-based scenarios and private environments. This research manuscript presents the efficacious use of a security-based methodology implemented with blockchain.
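As a generic illustration of the integrity property that blockchain provides (not the scheme proposed in the manuscript), the sketch below chains records with cryptographic hashes so that tampering with any earlier record invalidates every later one.

    # Minimal hash-chained ledger illustrating tamper evidence (generic example).
    import hashlib, json, time

    def make_block(data, prev_hash):
        block = {"data": data, "prev_hash": prev_hash, "timestamp": time.time()}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    def verify(chain):
        # Each block must reference its predecessor's hash and still hash to the
        # value it recorded; editing any stored block breaks one of these checks.
        for prev, cur in zip(chain, chain[1:]):
            if cur["prev_hash"] != prev["hash"]:
                return False
            body = {k: v for k, v in cur.items() if k != "hash"}
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != cur["hash"]:
                return False
        return True

    chain = [make_block("genesis", "0")]
    chain.append(make_block({"record": "transfer A->B"}, chain[-1]["hash"]))
    print(verify(chain))  # True until any stored block is modified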
Government spending is the tool that the state uses to achieve its various goals. This research aims to identify the most important determinants of government spending in Iraq and to specify the type and nature of the relationship between government spending and its determinants, which will contribute to understanding the movement of government spending. The results of the co-integration test using the bounds-test methodology showed that population growth and oil prices have a long-run effect on government spending, while inflation is not significant in the long run, and that 47% of the equilibrium imbalance (the short-run disequilibrium) in government spending in the previous period (t-1) is corrected in the current period (t).
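For readers unfamiliar with the error-correction interpretation of that 47%, a generic error-correction specification of the kind used with the bounds test is sketched below; the symbols are illustrative stand-ins for the study's series, and the reported adjustment speed corresponds to an error-correction coefficient of roughly -0.47:

    \Delta G_t = \alpha + \sum_{i=1}^{p} \beta_i \,\Delta G_{t-i}
               + \sum_{j=0}^{q} \gamma_j \,\Delta X_{t-j}
               + \lambda \, ECT_{t-1} + \varepsilon_t,
    \qquad \lambda \approx -0.47

where G_t is government spending, X_t collects the determinants (population growth, oil prices, inflation), ECT_{t-1} is the lagged error-correction term from the long-run relationship, and the coefficient \lambda measures the share of the previous period's disequilibrium that is corrected in the current period.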
The process of transformation from a centrally planned economy to a free economy requires restructuring the economy according to a new economic philosophy that relies on activating the role of private economic activity, in which small and medium-sized enterprises occupy an essential position because of their active role in the economies of all countries, especially those countries that have turned towards the market mechanism and the leadership of the private sector in the process of economic development, and on the role that commercial banks can play in advancing the financing of these projects by establishing specialized business incubators for financing.
What encouraged countries to pay attention to these institutions is the ease of
In the fields of image processing and computer vision, it is important to represent an image by its information. Image information comes from the image's features, which are extracted using feature detection/extraction techniques and feature description. In computer vision, features define informative data. The human eye extracts information from a raw image with ease, but a computer cannot recognize image information directly. This is why various feature extraction techniques have been proposed and have progressed rapidly. This paper presents a general overview of the categories of feature extraction for images.
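As a concrete illustration of feature detection and description (one of the categories such an overview covers), the sketch below uses OpenCV's ORB detector; the image file name is a placeholder and the availability of OpenCV is assumed, so this is a generic example rather than a technique taken from the paper.

    # Illustrative feature detection + description with OpenCV's ORB.
    import cv2

    img = cv2.imread("example.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder file name
    orb = cv2.ORB_create(nfeatures=500)

    # detectAndCompute returns keypoints (where) and descriptors (what):
    # each ORB descriptor is a 32-byte binary vector summarising a local patch.
    keypoints, descriptors = orb.detectAndCompute(img, None)
    print(len(keypoints), descriptors.shape)  # e.g. 500 keypoints, (500, 32)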
For the most reliable and reproducible results when calibrating or testing two immiscible liquids, such as water in engine oil, good emulsification is vital. This study explores the impact of emulsion quality on the Fourier transform infrared (FT-IR) spectroscopy calibration standards for measuring water contamination in used or in-service engine oil, in an attempt to strengthen the specific sample-preparation guidelines of the ASTM International standards. Using different emulsification techniques and readily available laboratory equipment, this work seeks to establish the ideal sample preparation technique for reliability, repeatability, and reproducibility of FT-IR analysis while still considering t
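To make the calibration idea concrete, the sketch below fits a simple linear calibration curve from hypothetical absorbance readings of water-in-oil standards; the concentrations and absorbance values are invented for illustration and do not come from this study or from any ASTM standard.

    # Hypothetical FT-IR calibration-curve sketch: absorbance of the water band
    # versus known water content of prepared emulsion standards (invented values).
    import numpy as np

    water_ppm = np.array([0, 500, 1000, 2000, 4000])        # standards (illustrative)
    absorbance = np.array([0.02, 0.11, 0.21, 0.40, 0.79])   # peak readings (illustrative)

    # Least-squares line: absorbance = slope * concentration + intercept.
    slope, intercept = np.polyfit(water_ppm, absorbance, 1)

    def water_content(measured_absorbance):
        """Invert the calibration line to estimate the water content of an unknown sample."""
        return (measured_absorbance - intercept) / slope

    print(round(water_content(0.30)))  # ~1470 ppm with these illustrative numbers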
In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques; it considers historical data from past runs to prioritize test cases. Assigning equal priority to several test cases is a common problem for most TCP techniques, yet it has not been explored for history-based TCP. To handle this problem in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
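For reference, APFD can be computed from the position at which each fault is first detected in the prioritized suite, following the standard formula APFD = 1 - (TF1 + ... + TFm)/(n·m) + 1/(2n) for n tests and m faults. The sketch below implements that formula with made-up fault data for illustration; it is not the study's tool.

    # Standard APFD computation: rewards orderings that expose faults early.
    def apfd(order, faults_detected_by):
        """order: prioritized list of test-case ids.
        faults_detected_by: dict mapping fault id -> set of test ids that detect it."""
        n, m = len(order), len(faults_detected_by)
        position = {test: i + 1 for i, test in enumerate(order)}  # 1-based positions
        # TF_i = position of the first test in the ordering that detects fault i.
        tf_sum = sum(min(position[t] for t in tests)
                     for tests in faults_detected_by.values())
        return 1 - tf_sum / (n * m) + 1 / (2 * n)

    # Illustrative example: 5 tests, 3 faults (invented data).
    faults = {"f1": {"t3"}, "f2": {"t1", "t4"}, "f3": {"t2"}}
    print(apfd(["t3", "t1", "t2", "t4", "t5"], faults))  # 0.7
    print(apfd(["t5", "t4", "t2", "t3", "t1"], faults))  # 0.5, faults found later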
Image compression is a serious issue in computer storage and transmission; it makes efficient use of the redundancy embedded within an image itself and may also exploit the limitations of human vision or perception to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second stage incorporates the near lossless com
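To illustrate the model-plus-residual idea behind polynomial (predictive) coding, the sketch below predicts each pixel from its causal neighbours with a fixed linear model and stores only the prediction error; it is a generic toy example, not the paper's two-stage scheme.

    # Toy model + residual decomposition (generic predictive coding): predict each
    # pixel from causal neighbours and keep the prediction error, which clusters
    # near zero and is cheaper to encode than the raw pixels.
    import numpy as np

    def predict_residual(img):
        img = img.astype(np.int16)
        pred = np.zeros_like(img)
        h, w = img.shape
        for y in range(h):
            for x in range(w):
                left = img[y, x - 1] if x > 0 else 0
                up = img[y - 1, x] if y > 0 else 0
                pred[y, x] = (left + up) // 2        # simple fixed linear model
        residual = img - pred
        return pred, residual

    def reconstruct(residual):
        # The decoder repeats the same prediction, so model + residual restores the image.
        h, w = residual.shape
        out = np.zeros_like(residual)
        for y in range(h):
            for x in range(w):
                left = out[y, x - 1] if x > 0 else 0
                up = out[y - 1, x] if y > 0 else 0
                out[y, x] = (left + up) // 2 + residual[y, x]
        return out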
The distribution of the intensity of comet ISON C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, tail, coma, and nucleus. A one-dimensional temperature-distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the gradient of the comet image shows very clearly that the arrows point towards the maximum intensity of the comet.
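A generic sketch of this kind of analysis is shown below: an intensity histogram plus a quiver plot of the image gradient. The FITS file name and the use of astropy are assumptions, and the thresholds that separate the four regions would have to be read off the actual histogram.

    # Generic intensity-histogram and gradient-quiver sketch for a comet image
    # (illustrative only; the file name and sampling step are placeholders).
    import numpy as np
    import matplotlib.pyplot as plt
    from astropy.io import fits

    image = fits.getdata("comet_ison.fits").astype(float)  # placeholder file name

    # Histogram of pixel intensities: distinct peaks/shoulders correspond to the
    # background, tail, coma, and nucleus regions.
    plt.figure()
    plt.hist(image.ravel(), bins=256)
    plt.xlabel("Intensity"); plt.ylabel("Pixel count")

    # Gradient field: the arrows point towards increasing intensity, i.e. the nucleus.
    gy, gx = np.gradient(image)
    step = 16                                               # subsample for a readable quiver
    ys, xs = np.mgrid[0:image.shape[0]:step, 0:image.shape[1]:step]
    plt.figure()
    plt.quiver(xs, ys, gx[::step, ::step], gy[::step, ::step])
    plt.gca().invert_yaxis()                                # keep image row 0 at the top
    plt.show()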