Data preprocessing is a critical step in web usage mining because raw log data are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, preprocessing must be applied first. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and examined, with emphasis on the sub-phases of data cleansing, user identification, and session identification.
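The paper does not give its implementation, so the following is only a minimal sketch of the three sub-phases on hypothetical Common Log Format records: cleansing drops non-page resources and failed requests, users are identified here simply by client IP, and sessions are split on a 30-minute inactivity gap (all of these choices are illustrative assumptions, not the authors' method).

```python
from datetime import datetime, timedelta

# Hypothetical raw log lines in Common Log Format.
RAW_LOGS = [
    '10.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '10.0.0.1 - - [10/Oct/2023:13:56:10 +0000] "GET /style.css HTTP/1.1" 200 120',
    '10.0.0.1 - - [10/Oct/2023:14:40:00 +0000] "GET /about.html HTTP/1.1" 200 900',
    '10.0.0.2 - - [10/Oct/2023:13:57:00 +0000] "GET /index.html HTTP/1.1" 404 0',
]

def parse(line):
    ip = line.split()[0]
    ts = datetime.strptime(line.split('[')[1].split(']')[0],
                           '%d/%b/%Y:%H:%M:%S %z')
    url = line.split('"')[1].split()[1]
    return ip, ts, url

# 1) Data cleansing: keep only successful page requests.
def clean(lines):
    for line in lines:
        ip, ts, url = parse(line)
        status = int(line.rsplit(' ', 2)[1])
        if status == 200 and not url.endswith(('.css', '.js', '.png')):
            yield ip, ts, url

# 2) User identification: group records by client IP (a simplification).
# 3) Session identification: start a new session after a 30-minute gap.
def sessionize(records, timeout=timedelta(minutes=30)):
    sessions = {}
    for ip, ts, url in sorted(records, key=lambda r: (r[0], r[1])):
        user = sessions.setdefault(ip, [[]])
        if user[-1] and ts - user[-1][-1][0] > timeout:
            user.append([])
        user[-1].append((ts, url))
    return sessions

sessions = sessionize(clean(RAW_LOGS))
print({ip: [[u for _, u in s] for s in ses] for ip, ses in sessions.items()})
```

In this toy input, the stylesheet request and the 404 are cleansed away, and the 44-minute gap splits the remaining requests of `10.0.0.1` into two sessions.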
This work analyzes a three-dimensional discrete-time biological system: a prey-predator model with a constant harvesting amount, in which the predator species has a stage structure. The analysis proceeds by finding all possible equilibria and investigating their stability. To obtain an optimal harvesting strategy, the harvesting rate is then assumed to be non-constant. Finally, numerical simulations are given to confirm the outcomes of the mathematical analysis.
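The abstract does not state the authors' equations; a generic form of such a system, written only to fix ideas, takes the prey $x_t$, juvenile predators $y_t$, and adult predators $z_t$ (the stage structure), with a constant harvest $h$ removed from the prey:

```latex
\begin{aligned}
x_{t+1} &= x_t + r\,x_t\!\left(1-\frac{x_t}{K}\right) - a\,x_t z_t - h,\\
y_{t+1} &= y_t + b\,a\,x_t z_t - (d_1 + m)\,y_t,\\
z_{t+1} &= z_t + m\,y_t - d_2\,z_t,
\end{aligned}
```

where $r, K, a, b, d_1, d_2, m$ are assumed growth, carrying-capacity, predation, conversion, mortality, and maturation parameters. Equilibria are found by setting $x_{t+1}=x_t$, $y_{t+1}=y_t$, $z_{t+1}=z_t$, and local stability follows from the eigenvalues of the Jacobian at each equilibrium lying inside the unit circle.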
This paper analyzes Terry Bisson's short story Bears Discover Fire stylistically, following both Gérard Genette's theory of narratology (1980) and Short and Leech's (1981) strategy for analyzing fictional works. It also examines the extent to which these models are applicable to the selected story. Stylistic analysis procedures help readers and researchers identify specific linguistic features in order to support literary interpretation and appreciation of literary texts. Style in fiction concentrates not on what is written but on how a text is written; each writer has his own style and techniques, which distinguish him from other writers.
The use of composite materials has increased vastly in recent years, and great interest has therefore developed in damage detection for composites using non-destructive test methods. Several approaches have been applied to obtain information about the existence and location of faults. This paper uses the vibration response of a composite plate to detect and localize delamination defects based on modal analysis. A two-dimensional finite element model for multi-layered composites with internal delamination is established, and an FEM program is built for plates under different boundary conditions; experiments are conducted to validate the developed model. Natural frequencies and modal displacements of the intact and damaged plates are compared.
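A common damage indicator in such modal-analysis studies is the relative drop in each natural frequency between the intact and the delaminated plate. The sketch below uses purely illustrative frequency values (not the paper's data) to show the indicator:

```python
# Hypothetical natural frequencies (Hz) of the first four modes;
# the values are illustrative, not measured or computed in the paper.
intact_hz  = [112.4, 301.8, 356.2, 590.1]
damaged_hz = [111.9, 290.5, 355.8, 561.3]

# Relative frequency drop per mode: (f_intact - f_damaged) / f_intact.
shifts = [(fi - fd) / fi for fi, fd in zip(intact_hz, damaged_hz)]
suspect = max(range(len(shifts)), key=lambda i: shifts[i])

print([round(s * 100, 2) for s in shifts])       # percentage drop per mode
print(f"mode {suspect + 1} shows the largest shift")
```

Modes whose strain energy concentrates near the delamination shift the most, which is what makes the pattern of shifts useful for localization.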
Wellbore instability is a significant problem faced during drilling operations, causing loss of circulation, caving, stuck pipe, and well kicks or blowouts. These problems take extra time to treat and increase Nonproductive Time (NPT). This paper reviews the factors that influence wellbore stability and the methods developed to mitigate them. Based on the current survey, the factors that affect wellbore stability are far-field stress, rock mechanical properties, natural fractures, pore pressure, wellbore trajectory, drilling fluid chemicals, mobile formations, naturally over-pressured shale collapse, mud weight, temperature, and time. The most suitable ways to reduce wellbore instability are also discussed.
III-V zinc-blende AlP and AlAs semiconductors and their ternary alloy, aluminum arsenide phosphide (AlAsxP1-x), nanocrystals have been investigated using ab-initio density functional theory (ab-initio DFT) at the generalized-gradient approximation (GGA) level with the STO-3G basis set, coupled with the large unit cell (LUC) method. The crystal dimension is found to be around 1.56–2.24 nm as a function of increasing core size (8, 16, 54, and 64 atoms) with different arsenide concentrations (x = 0, 0.25, 0.5, 0.75, and 1), respectively. The Gaussian 03 code was used throughout this study to calculate physical properties such as the electronic energy gap, lattice constant, valence and conduction bands, and density of states.
Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly unrecognized attack attempts and raise an early alarm to inform the system about a suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially of malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection.
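The paper's classifiers are not specified here, so the following is only a hypothetical sketch of the hybrid idea: misuse detection matches known attack signatures, anomaly detection flags deviation from a learned normal profile, and the hybrid raises an alarm if either fires. All names, thresholds, and fields are invented for illustration.

```python
# Hypothetical known-attack signatures: (protocol, port, payload tag).
KNOWN_SIGNATURES = {("tcp", 4444, "exe-download")}

NORMAL_PACKET_RATE = 120.0   # assumed per-host baseline (packets/sec)
ANOMALY_THRESHOLD = 3.0      # allowed multiple of the baseline

def misuse_alert(packet):
    """Misuse detection: exact match against known signatures."""
    return (packet["proto"], packet["port"], packet["payload_tag"]) in KNOWN_SIGNATURES

def anomaly_alert(host_packet_rate):
    """Anomaly detection: host feature deviates far from the normal profile."""
    return host_packet_rate > ANOMALY_THRESHOLD * NORMAL_PACKET_RATE

def hybrid_ids(packet, host_packet_rate):
    """OR-fusion: alarm if either detector fires, combining packet and host views."""
    return misuse_alert(packet) or anomaly_alert(host_packet_rate)

pkt = {"proto": "tcp", "port": 80, "payload_tag": "html"}
print(hybrid_ids(pkt, 100.0))   # benign traffic at a normal rate
print(hybrid_ids(pkt, 900.0))   # rate anomaly triggers the anomaly branch
```

The appeal of the hybrid design is complementary coverage: misuse detection gives low false positives on known attacks, while anomaly detection can flag previously unseen ones.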
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features will be missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed the classification performance of three different classifiers: support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes (NB).
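The core idea of wrapper-style imputation like this is to score candidate fill-in values by the classification accuracy they yield. The toy sketch below (stdlib only, with invented data) shows that fitness loop with a leave-one-out 1-NN classifier; a population-based optimizer such as SSA would evolve the candidate values rather than enumerate a fixed list:

```python
import statistics

# Toy dataset of (features, label); None marks a missing value.
data = [([5.0, 1.0], 0), ([4.5, None], 0), ([9.0, 8.0], 1), ([8.5, 7.5], 1)]

def impute(rows, value):
    """Fill every missing entry with a single candidate value."""
    return [([v if v is not None else value for v in x], y) for x, y in rows]

def loo_accuracy(rows):
    """Leave-one-out 1-NN accuracy, used as the fitness of an imputation."""
    hits = 0
    for i, (x, y) in enumerate(rows):
        rest = rows[:i] + rows[i + 1:]
        nearest = min(rest, key=lambda r: sum((a - b) ** 2 for a, b in zip(r[0], x)))
        hits += nearest[1] == y
    return hits / len(rows)

# An optimizer like SSA would iteratively update these candidates;
# here we just score a few fixed ones to show the evaluation step.
candidates = [0.0, statistics.mean([1.0, 8.0, 7.5]), 1.2]
best = max(candidates, key=lambda c: loo_accuracy(impute(data, c)))
print(best, loo_accuracy(impute(data, best)))
```

On this toy data, a candidate near the values of the same class keeps the missing-value row on the correct side of the decision boundary, while the global mean pulls it toward the wrong class.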
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide resources on demand, and it has changed the way organizations manage resources due to its robustness, low cost, and pervasive nature. Data security is usually realized using methods such as encryption. However, data privacy is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private keys to decrypt data in a system, share it with others, and cause system information leakage. Security policies are also considered to be integrated into the proposed method.
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit of the sent information has high priority, especially information such as the receiver's address. The ability to detect every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data over a noisy medium. These methods are the 2D-Checksum method
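The single-parity limitation described above is easy to demonstrate: flipping an odd number of bits changes the parity, while flipping an even number leaves it unchanged. The sketch below (illustrative data, not the paper's scheme) also shows how a two-dimensional parity arrangement can still catch a two-bit error that single parity misses:

```python
def parity_bit(bits):
    """Even parity: 1 iff the number of 1s in the word is odd."""
    return sum(bits) % 2

def single_parity_detects(sent, received):
    """True if the ordinary single-parity check flags the corruption."""
    return parity_bit(sent) != parity_bit(received)

data = [1, 0, 1, 1, 0, 0, 1, 0]

one_err = data.copy(); one_err[2] ^= 1                   # odd number of flips
two_err = data.copy(); two_err[2] ^= 1; two_err[5] ^= 1  # even number of flips

print(single_parity_detects(data, one_err))   # odd error count: detected
print(single_parity_detects(data, two_err))   # even error count: missed

# Two-dimensional parity (row parity plus column parity over a 2x4
# arrangement) catches the two-bit error above because the flips land
# in different rows and different columns.
def parity_2d(bits, width=4):
    rows = [bits[i:i + width] for i in range(0, len(bits), width)]
    row_par = [parity_bit(r) for r in rows]
    col_par = [parity_bit(col) for col in zip(*rows)]
    return row_par, col_par

print(parity_2d(data) != parity_2d(two_err))  # detected by 2D parity
```

Note that 2D parity is not a cure-all either: four flips at the corners of a rectangle in the grid leave every row and column parity unchanged, which is the kind of limitation the proposed methods aim to address.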