Data preprocessing is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of the algorithms used in pattern discovery, a preprocessing step must be applied first. In this study, the sequential methodologies used to preprocess data from web server logs are comprehensively evaluated and examined, with an emphasis on sub-phases such as data cleansing, user identification, and session identification.
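The three sub-phases can be illustrated with a minimal Python sketch. It assumes Apache-style combined-format log lines and a simple heuristic pipeline (drop static resources and robots, identify users by IP plus user agent, split sessions on a 30-minute inactivity timeout); the function names and thresholds are illustrative, not taken from the study.

```python
import re
from collections import defaultdict
from datetime import datetime, timedelta

# Apache "combined" log format: ip ident user [timestamp] "request" status size "referer" "agent"
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)
STATIC = ('.css', '.js', '.png', '.jpg', '.gif', '.ico')

def clean(lines):
    """Data cleansing: keep only parseable page requests, dropping
    static-resource hits and obvious robots."""
    records = []
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        parts = m.group('req').split()
        url = parts[1] if len(parts) > 1 else ''
        if url.lower().endswith(STATIC) or 'bot' in m.group('agent').lower():
            continue
        ts = datetime.strptime(m.group('ts').split()[0], '%d/%b/%Y:%H:%M:%S')
        records.append((m.group('ip'), m.group('agent'), ts, url))
    return records

def sessionize(records, timeout=timedelta(minutes=30)):
    """User identification by (IP, user agent); session identification
    by a 30-minute inactivity timeout."""
    by_user = defaultdict(list)
    for ip, agent, ts, url in records:
        by_user[(ip, agent)].append((ts, url))
    sessions = []
    for visits in by_user.values():
        visits.sort()
        current = [visits[0][1]]
        for (prev, _), (ts, url) in zip(visits, visits[1:]):
            if ts - prev > timeout:        # gap too long: start a new session
                sessions.append(current)
                current = []
            current.append(url)
        sessions.append(current)
    return sessions

sample_log = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [10/Oct/2023:13:56:00 +0000] "GET /style.css HTTP/1.1" 200 100 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [10/Oct/2023:15:00:00 +0000] "GET /about.html HTTP/1.1" 200 256 "-" "Mozilla/5.0"',
]
records = clean(sample_log)
sessions = sessionize(records)
```

On the sample above, the stylesheet hit is cleansed away, and the 64-minute gap splits the remaining visits into two sessions.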
The utilization of recycled brick tile powder as a replacement for conventional filler in asphalt concrete mixes is studied in this research. The research evaluates the effectiveness of recycled brick tile powder and determines its optimum replacement level. Using recycled brick tile powder is significant from an environmental standpoint, as it is a waste product of construction activities. Sixteen asphalt concrete samples were produced, eight of which were soaked for a day. The samples contained 5% bitumen, 2% to 5% brick tile powder, and conventional stone dust filler. The properties of the samples were evaluated using the Marshall test. It was observed that the stiffness and the resistance to deformation of the asphalt concrete …
Roller-Compacted Concrete is a no-slump concrete that uses no reinforcing steel, no forms, and no finishing, yet is wet enough to support compaction by vibratory rollers. Because curing affects the properties and durability of concrete, the main purpose of this research is to study the effect of various curing methods (air curing, 7 days of water curing, and permanent water curing) and of porcelanite (a local material used as an internal curing agent) at different replacement percentages of fine aggregate (volumetric replacement) on some properties of Roller-Compacted Concrete, and to explore the possibility of producing practical Roller-Compacted Concrete for road pavement with a minimum requirement of curing. Specimens were sawed fro…
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, from which the values of the dependent variable are predicted. In this paper, a multiple linear regression model with three covariates is studied in the presence of autocorrelated errors, when the random errors follow an exponential distribution. Three methods are compared: general least squares, the robust M method, and the robust Laplace method. We employed simulation studies and calculated the mean squared error for sample sizes of 15, 30, 60, and 100. We then applied the best method to real experimental data representing the varieties of …
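A Monte Carlo comparison of this kind can be sketched as follows. This is a simplified illustration, not the paper's exact design: it assumes AR(1) errors driven by centred exponential innovations, uses ordinary least squares as a baseline, and implements the Laplace (least absolute deviations) fit by iteratively reweighted least squares; the general least squares and M-estimation variants would follow the same pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, beta, rho=0.5):
    """Three covariates; autocorrelated AR(1) errors whose innovations
    are exponential, centred to mean zero (an assumed design)."""
    X = np.column_stack([np.ones(n)] + [rng.normal(size=n) for _ in range(3)])
    innov = rng.exponential(1.0, size=n) - 1.0
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = rho * e[t - 1] + innov[t]
    return X, X @ beta + e

def ols(X, y):
    """Least-squares baseline fit."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def lad(X, y, iters=50):
    """Laplace (least absolute deviations) fit via iteratively
    reweighted least squares -- one standard way to compute it."""
    b = ols(X, y)
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - X @ b), 1e-8)
        W = np.sqrt(w)[:, None]
        b = np.linalg.lstsq(W * X, W[:, 0] * y, rcond=None)[0]
    return b

beta = np.array([1.0, 2.0, -1.0, 0.5])
reps = 200
mse = {"OLS": 0.0, "LAD": 0.0}
for _ in range(reps):
    X, y = simulate(60, beta)          # one of the paper's sample sizes
    mse["OLS"] += np.mean((ols(X, y) - beta) ** 2) / reps
    mse["LAD"] += np.mean((lad(X, y) - beta) ** 2) / reps
```

The same loop, repeated over the four sample sizes, yields the MSE table on which the methods are ranked.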
This paper delves into some significant performance measures (PMs) of a bulk arrival queueing system with a constant batch size b, in which the arrival rates and service rates are fuzzy parameters. In a bulk arrival queueing system, customers arrive at the system in groups of a constant size before individual customers enter service; this leads to a new tool obtained with the aid of generating-function methods. The resulting model is more convenient than the corresponding traditional bulk queueing model under an uncertain environment. The α-cut approach is applied with the conventional Zadeh's extension principle (ZEP) to transform the triangular membership functions (Mem. Fs) of the fuzzy queues into a family of conventional b…
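The α-cut of a triangular membership function has a closed form, which makes the transformation into a family of crisp intervals easy to illustrate. The numbers below (fuzzy arrival and service rates, batch size 2) are hypothetical, and the traffic-intensity formula ρ = bλ/μ is used only as a simple stand-in for the paper's performance measures.

```python
def alpha_cut(tri, alpha):
    """α-cut of a triangular fuzzy number (a, b, c): the crisp interval
    of values whose membership is at least α."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def interval_div(num, den):
    """Interval division, assuming a strictly positive denominator interval."""
    return (num[0] / den[1], num[1] / den[0])

# Hypothetical triangular fuzzy arrival and service rates.
lam = (2.0, 3.0, 4.0)
mu = (8.0, 10.0, 12.0)
batch = 2  # constant batch size b

results = {}
for alpha in (0.0, 0.5, 1.0):
    L = alpha_cut(lam, alpha)
    M = alpha_cut(mu, alpha)
    # Fuzzy traffic intensity rho = b*lambda/mu, as a crisp interval per α.
    results[alpha] = interval_div((batch * L[0], batch * L[1]), M)
```

At α = 1 the interval collapses to the crisp value 0.6; lower α values give progressively wider intervals, which is exactly the "family of conventional models" indexed by α.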
In combinatorial testing development, the construction of covering arrays is the key challenge because of the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left over after applying greedy strategies; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly. …
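The combined greedy-plus-hill-climbing idea can be sketched for pairwise (2-way) coverage as follows. This is a toy illustration, not the paper's generator: each test row is greedily seeded from a still-uncovered pair, then refined by hill climbing so that it covers as many uncovered pairs as possible before being added to the array.

```python
import itertools
import random

random.seed(1)

def all_pairs(domains):
    """Every (parameter, value, parameter, value) pair a pairwise
    covering array must cover."""
    pairs = set()
    for (i, di), (j, dj) in itertools.combinations(enumerate(domains), 2):
        for v, w in itertools.product(di, dj):
            pairs.add((i, v, j, w))
    return pairs

def covered(row):
    """Pairs covered by a single test row."""
    return {(i, row[i], j, row[j])
            for i, j in itertools.combinations(range(len(row)), 2)}

def hill_climb(row, uncovered, domains, steps=100):
    """Local search: mutate one parameter at a time, accepting moves
    that do not decrease the number of uncovered pairs hit."""
    best = len(covered(row) & uncovered)
    for _ in range(steps):
        cand = row[:]
        k = random.randrange(len(row))
        cand[k] = random.choice(domains[k])
        gain = len(covered(cand) & uncovered)
        if gain >= best:
            row, best = cand, gain
    return row

def build_ca(domains):
    """Greedy outer loop: seed each row from an uncovered pair, refine
    it with HC, and repeat until every pair is covered."""
    uncovered = all_pairs(domains)
    rows = []
    while uncovered:
        i, v, j, w = next(iter(uncovered))
        row = [random.choice(d) for d in domains]
        row[i], row[j] = v, w            # guarantees progress each round
        row = hill_climb(row, uncovered, domains)
        rows.append(row)
        uncovered -= covered(row)
    return rows

domains = [[0, 1], [0, 1], [0, 1, 2]]    # three parameters, mixed levels
ca = build_ca(domains)
```

Seeding each row from an uncovered tuple is what makes the greedy phase terminate; the HC phase only improves how many leftover tuples each row absorbs.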
We study in this paper the composition operator C_φ induced by the function φ(z) = sz + t, where … . We characterize the normal composition operator C_φ on the Hardy space H² and other related classes of operators. In addition, we study the essential normality of C_φ and give some other partial results which, to the best of our knowledge, are new.
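As a simple illustrative special case (not the paper's full characterization): when t = 0 and |s| ≤ 1, the operator acts diagonally on the standard orthogonal basis of H², which makes its normality immediate.

```latex
% Special case: \varphi(z) = sz with t = 0 and |s| \le 1.
% On the orthogonal basis \{z^n\}_{n \ge 0} of the Hardy space H^2,
(C_\varphi z^n)(z) = (sz)^n = s^n z^n ,
% so C_\varphi is diagonal with eigenvalues s^n. A diagonal operator
% commutes with its adjoint, hence C_\varphi is normal in this case.
```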