Data preprocessing is a crucial step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied first. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and examined, with an emphasis on sub-phases such as data cleansing, user identification, and session identification.
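The sub-phases named above can be sketched as a minimal pipeline. This is an illustrative sketch only: the log fields, the (IP, user agent) heuristic for user identification, and the 30-minute session timeout are common conventions assumed here, not details taken from the study.

```python
# Minimal sketch of the three preprocessing sub-phases: data cleansing,
# user identification, and session identification. Log entries, field
# layout, and the 30-minute timeout are hypothetical conventions.
from datetime import datetime, timedelta

raw_log = [
    # (ip, timestamp, requested URL, user agent) -- hypothetical entries
    ("10.0.0.1", "2023-05-01 10:00:00", "/index.html", "Mozilla"),
    ("10.0.0.1", "2023-05-01 10:05:00", "/style.css",  "Mozilla"),
    ("10.0.0.1", "2023-05-01 10:10:00", "/about.html", "Mozilla"),
    ("10.0.0.1", "2023-05-01 11:30:00", "/news.html",  "Mozilla"),
    ("10.0.0.2", "2023-05-01 10:02:00", "/index.html", "Googlebot"),
]

def cleanse(entries):
    """Drop non-page resources (CSS, images) and known robot agents."""
    skip_ext = (".css", ".js", ".png", ".jpg", ".gif")
    return [e for e in entries
            if not e[2].endswith(skip_ext) and "bot" not in e[3].lower()]

def identify_users(entries):
    """Group requests by (IP, user agent) as a simple user heuristic."""
    users = {}
    for ip, ts, url, agent in entries:
        users.setdefault((ip, agent), []).append((datetime.fromisoformat(ts), url))
    return users

def identify_sessions(requests, timeout=timedelta(minutes=30)):
    """Split one user's time-ordered requests on gaps longer than the timeout."""
    sessions, current = [], []
    for ts, url in sorted(requests):
        if current and ts - current[-1][0] > timeout:
            sessions.append(current)
            current = []
        current.append((ts, url))
    if current:
        sessions.append(current)
    return sessions

clean = cleanse(raw_log)
for user, reqs in identify_users(clean).items():
    print(user, [len(s) for s in identify_sessions(reqs)])
```

In this toy log, the stylesheet request and the robot entry are removed during cleansing, the remaining requests collapse to a single user, and the 80-minute gap splits that user's activity into two sessions.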
Duodenal and gastric ulcers remain the two most common perforations of the gastrointestinal tract, and their incidence might be reduced by early detection of predictive factors, which have been little researched. This study was conducted to examine the predictive factors for the development of gastroduodenal ulcers among patients attending Gastrointestinal Teaching Hospitals in Baghdad, Iraq.
A cross-sectional survey was carried out in which a total of 100 patients with gastric and duodenal ulcers were recruited using a nonprobability (purposive) sampling technique.
The educational sector is one of the most important sectors in the world and is considered a means of community development. It is also a means of driving a country's renaissance and development, because it represents the factory of the thinking minds that make change. There is no doubt that this sector, like any other, has suffered from a prolonged deficit in sound scientific planning, which has led to its deterioration; the problems of education remain diverse and are inherited from earlier periods. Hierarchical cluster analysis was applied to postgraduate students at universities in Iraq, excluding the Kurdistan region, and the number of universities included in the study was
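The hierarchical cluster analysis mentioned above can be illustrated with a minimal agglomerative sketch using single linkage. The one-dimensional scores below are purely hypothetical and are not data from the study.

```python
# Minimal sketch of agglomerative (hierarchical) clustering with single
# linkage on hypothetical 1-D scores; the values are illustrative only.
def single_linkage(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between the closest members
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters.pop(j))
    return clusters

scores = [2.0, 2.1, 2.3, 7.8, 8.0, 15.0]
print(single_linkage(scores, 3))  # three well-separated groups
```

In practice, library routines such as SciPy's `scipy.cluster.hierarchy.linkage` would be used instead of this quadratic-time sketch; the point here is only to show the merge-closest-clusters loop that hierarchical clustering performs.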
This study included 50 blood samples collected from patients aged 35-65 years. Thirty samples were collected from patients with Type 2 Diabetes Mellitus (T2DM), while 20 blood samples were collected from healthy individuals as a control group. The polymorphism results for the TGF-β1 gene at codon 10 (the +869*C/T position), obtained using the amplification refractory mutation system (ARMS-PCR), suggested that the T allele has a protective effect, while the C allele was associated with an increased risk of T2DM. The TT and CT genotypes were suggested to have a protective effect, while the CC genotype was associated with an increased risk of T2DM. The polymorphism results for the TGF-β1 gene at codon 25 (the +915*G/C position) in samples
In this paper, an efficient new procedure is proposed to modify the third-order iterative method obtained by Rostom and Fuad [Saeed, R. K. and Khthr, F. W. New third-order iterative method for solving nonlinear equations. J. Appl. Sci. 7 (2011): 916-921], using three steps based on the Newton equation, the finite difference method, and linear interpolation. A convergence analysis is given to show the efficiency and performance of the new method for solving nonlinear equations. The efficiency of the new method is also demonstrated by numerical examples.
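The building blocks named above can be illustrated with a generic sketch: a Newton-type step in which the derivative is replaced by a forward finite difference. This is an illustration of the ingredients only, not the authors' exact three-step scheme.

```python
# Generic illustration of a Newton step combined with a finite-difference
# derivative, two of the ingredients named in the abstract. This is NOT
# the paper's method; it is a standard derivative-free Newton iteration.
def newton_fd(f, x0, h=1e-7, tol=1e-12, max_iter=50):
    """Solve f(x) = 0 with Newton's method, derivative via finite difference."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx = (f(x + h) - fx) / h  # forward-difference slope
        x = x - fx / dfx           # Newton update
    return x

root = newton_fd(lambda x: x**2 - 2.0, 1.0)
print(root)  # ≈ sqrt(2)
```

Higher-order schemes such as the paper's combine several such steps per iteration (e.g. a Newton predictor followed by interpolation-based corrections) to raise the convergence order above two.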
The study of cultural identity and its expressions in the designer's work highlights that identity and links it to the national culture of the designer's country and society. It reflects and translates the society's culture, traditions, social and economic dimensions, natural environment, and scientific phenomena through meanings rendered in a variety of methods, including the expressive and the realistic, which define the designer's relationship with his society, the national culture of his country, and his connection with the country's civilization.
The research problem was framed by the following question: What are the expressions of cultural identity in the work of the designer David Gent
This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method included five enhancement techniques: Normalization, Histogram Equalization, Binarization, Skeletonization, and Fusion. The Normalization process standardized the pixel intensities, which facilitated the subsequent image enhancement stages. The Histogram Equalization technique then increased the contrast of the images. Furthermore, the Binarization and Skeletonization techniques were implemented to differentiate between the ridge and valley structures and to obtain one
Human Interactive Proofs (HIPs) are automated inverse Turing tests intended to differentiate between people and malicious computer programs. Building a good HIP system is a challenging task, since the resulting HIP must be secure against attacks while at the same time remaining practical for humans. Text-based HIPs are one of the most popular HIP types: they exploit the ability of humans to read text images better than Optical Character Recognition (OCR) software. However, current text-based HIPs are not well matched with the rapid development of computer vision techniques, since they are either very easily passed or very hard to solve, and this motivates