Data preprocessing is a critical step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied first. This study comprehensively evaluates and examines the sequential methodologies used to preprocess web server log data, with emphasis on the sub-phases of data cleansing, user identification, and session identification.
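The sub-phases named above can be sketched in a few lines. This is a minimal illustration, not the study's pipeline: the log entries, the noise suffixes, and the 30-minute inactivity timeout are all assumptions chosen for the example.

```python
from datetime import datetime, timedelta

# Hypothetical simplified log entries: (ip, timestamp, requested path).
RAW_LOG = [
    ("10.0.0.1", "2024-01-01 10:00:00", "/index.html"),
    ("10.0.0.1", "2024-01-01 10:00:02", "/logo.png"),   # embedded resource -> noise
    ("10.0.0.1", "2024-01-01 10:05:00", "/about.html"),
    ("10.0.0.1", "2024-01-01 11:00:00", "/index.html"), # > 30 min gap -> new session
    ("10.0.0.2", "2024-01-01 10:01:00", "/index.html"),
]

NOISE_SUFFIXES = (".png", ".jpg", ".gif", ".css", ".js")
SESSION_TIMEOUT = timedelta(minutes=30)

def clean(entries):
    """Data cleansing: drop embedded-resource requests that do not reflect user intent."""
    return [e for e in entries if not e[2].endswith(NOISE_SUFFIXES)]

def sessionize(entries):
    """User identification by IP; session identification by an inactivity heuristic."""
    sessions, last_seen = {}, {}
    for ip, ts, path in sorted(entries, key=lambda e: (e[0], e[1])):
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if ip not in last_seen or t - last_seen[ip] > SESSION_TIMEOUT:
            sessions.setdefault(ip, []).append([])  # start a new session
        sessions[ip][-1].append(path)
        last_seen[ip] = t
    return sessions

sessions = sessionize(clean(RAW_LOG))
print(sessions["10.0.0.1"])  # [['/index.html', '/about.html'], ['/index.html']]
```

Real pipelines also merge multiple log files, resolve proxies and shared IPs via user agents, and reconstruct paths from the referrer field, but the cleanse-identify-sessionize order is the same.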
International organizations have shown interest in marketing-oriented public relations; through their activities, means, and strategies, public relations play a crucial role in marketing an institution's products, services, and ideas. They are considered the link between the company and its public, responsible for presenting the institution honestly and transmitting information truthfully. This creates a good impression of the institution, so that the institution and its products become consistent with the needs and interests of the public. Accordingly, the research aims to identify the strategies used internationally by marketing-oriented public relations. The r
Abstract
The purpose of this research is to develop a proposed framework for integrating the Target Costing and Resource Consumption Accounting techniques and to show the role they play in reducing product costs and supporting competitive advantage in the face of contemporary changes. To achieve this goal, the researchers followed the analytical method, using a statistical questionnaire as the means of collecting data from the research sample, which included accounting, administrative, technical, engineering, and other staff. The sample consists of (56) individuals, and for the purpose of conducting statistical analysis of the data and testing the hypotheses, the statistical program (SPSS) wa
Cryptography is the process of transforming a message to prevent unauthorized access to the data. One of the main problems, and an important component, of secret-key cryptography is the key itself: for a higher level of secure communication, the key plays an essential role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the 3DES algorithm. This paper proposes a combination of two efficient encryption algorithms to
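To illustrate why 3DES key generation matters, here is a small sketch, not the paper's method: in EDE mode, C = E_K3(D_K2(E_K1(P))), so if K1 = K2 the first two stages cancel and the cipher degenerates to single DES under K3 (and similarly for K2 = K3). The classifier below only inspects key relationships; the security estimates in the strings are the standard textbook figures.

```python
def keying_strength(k1: bytes, k2: bytes, k3: bytes) -> str:
    """Classify a 3DES (EDE) key bundle by its effective strength. Illustrative only."""
    if k1 == k2 == k3:
        return "degenerate: equivalent to single DES (~56-bit security)"
    if k1 == k2 or k2 == k3:
        # Adjacent encrypt/decrypt stages cancel out, leaving one DES pass.
        return "weak: two stages cancel, effective single DES"
    if k1 == k3:
        return "keying option 2: two independent keys (~80-bit security)"
    return "keying option 1: three independent keys (~112-bit security)"

print(keying_strength(b"A" * 8, b"A" * 8, b"A" * 8))  # degenerate
print(keying_strength(b"A" * 8, b"B" * 8, b"A" * 8))  # keying option 2
print(keying_strength(b"A" * 8, b"B" * 8, b"C" * 8))  # keying option 1
```

A hardened key-generation step would also reject the 64 known DES weak and semi-weak keys and enforce odd parity on each key byte.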
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike information criterion (AIC). The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was
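The estimate-then-mutate idea can be sketched on a toy problem. This is an assumed single-variable two-lag model with synthetic data, not the paper's multi-site model: start from rough parameter values, apply Gaussian mutations, and keep a mutant only when it lowers the AIC.

```python
import math
import random

random.seed(0)

# Toy series generated by x_t = 0.6*x_{t-1} + 0.3*x_{t-2} + noise (assumption).
series = [1.0, 0.8]
for _ in range(200):
    series.append(0.6 * series[-1] + 0.3 * series[-2] + random.gauss(0, 0.1))

def aic(params):
    """AIC = n*ln(RSS/n) + 2k for a two-lag linear model (two time steps)."""
    a1, a2 = params
    resid = [series[t] - (a1 * series[t - 1] + a2 * series[t - 2])
             for t in range(2, len(series))]
    n = len(resid)
    rss = sum(r * r for r in resid)
    return n * math.log(rss / n) + 2 * len(params)

# GA-style mutation: Gaussian perturbation, accepted only on improvement.
best = [0.5, 0.5]          # rough initial estimate
best_aic = aic(best)
for _ in range(2000):
    mutant = [p + random.gauss(0, 0.05) for p in best]
    score = aic(mutant)
    if score < best_aic:
        best, best_aic = mutant, score

print([round(p, 2) for p in best])  # should land near the true lag weights 0.6, 0.3
```

A full genetic algorithm would maintain a population with crossover and selection; the mutation-and-accept loop above isolates just the refinement step the abstract describes.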
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informational and technological infrastructure, making it more appropriate for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data-management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impa
In data transmission, a change in a single bit of the received data may lead to misunderstanding or to disaster. Every bit of the transmitted information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results yet still fail to cope with an increasing number of errors.
Two novel methods are suggested to detect bit-change errors when transmitting data over a noisy medium. Those methods are: 2D-Checksum me
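The strengths and limits described above can be demonstrated with classic two-dimensional parity (this sketch is the textbook scheme, not the paper's proposed 2D-Checksum method): data bits are laid out in a grid with one parity bit per row and per column, so any single-bit error flips exactly one row parity and one column parity, while four errors on the corners of a rectangle cancel out and escape detection.

```python
def parities(grid):
    """Row and column parity bits for a grid of 0/1 data bits."""
    rows = [sum(r) % 2 for r in grid]
    cols = [sum(c) % 2 for c in zip(*grid)]
    return rows, cols

def detect(grid, rows, cols):
    """True if the received grid disagrees with the transmitted parities."""
    r2, c2 = parities(grid)
    return r2 != rows or c2 != cols

data = [[1, 0, 1, 1],
        [0, 1, 0, 0],
        [1, 1, 1, 0]]
rows, cols = parities(data)

# Single-bit error: always detected (one row and one column parity both flip).
bad = [row[:] for row in data]
bad[1][2] ^= 1
print(detect(bad, rows, cols))   # True

# Four errors on the corners of a rectangle: every parity still matches,
# so 2D parity silently fails -- the motivation for stronger schemes.
bad4 = [row[:] for row in data]
for i, j in [(0, 0), (0, 3), (2, 0), (2, 3)]:
    bad4[i][j] ^= 1
print(detect(bad4, rows, cols))  # False
```

This undetected rectangular pattern is exactly the kind of even-error case that motivates combining parity with checksums.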
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points outside the behavior of any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie farther than a certain threshold (extreme values). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly yet is considered abnormal for a known group. The analysis showed DBSCAN using the
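The baseline behavior the abstract builds on can be shown with a minimal pure-Python DBSCAN (a standard sketch, not the paper's CFG variant): points with fewer than `min_pts` neighbors within radius `eps` that are not density-reachable from any cluster are labeled noise (-1), i.e., anomalies.

```python
def region(points, i, eps):
    """Indices of all points within distance eps of points[i] (including itself)."""
    xi, yi = points[i]
    return [j for j, (x, y) in enumerate(points)
            if (x - xi) ** 2 + (y - yi) ** 2 <= eps ** 2]

def dbscan(points, eps, min_pts):
    labels = [None] * len(points)   # None = unvisited, -1 = noise/anomaly
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1          # too isolated to seed a cluster
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(neighbors)
        while queue:                # expand density-reachable points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point, reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nj = region(points, j, eps)
            if len(nj) >= min_pts:
                queue.extend(nj)
    return labels

pts = [(0, 0), (0.2, 0), (0, 0.2), (0.1, 0.1),   # dense cluster
       (5, 5)]                                   # isolated point -> anomaly
print(dbscan(pts, eps=0.5, min_pts=3))           # [0, 0, 0, 0, -1]
```

Note the limitation the abstract points at: a point near a known cluster but with an abnormal repetition pattern would still be absorbed into the cluster by distance alone, which is what the graph-based strengthening aims to catch.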
Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities, which involves using Internet of Things (IoT) devices and cloud computing to access medical processes remotely. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting the confidential and private information of patients presents several challenges in thwarting illegal intrusion by hackers. It is therefore essential to prioritize the protection of patient medical data that is stored, accessed, and shared on
1. Objective: Polyphenols are biochemical compounds with antioxidant activity against diseases related to lipid peroxidation, such as diabetes mellitus. Polyphenols are widely distributed in medicinal plants. The aim of the study is to extract and analyze some polyphenolic compounds from grape seeds and examine their effects on (STZ)-induced diabetic mice. 2. Methods: In the present study, a group of polyphenols was extracted from Iraq
Contemporary art culture is considered the outcome of preceding human civilizations from their earliest days, while at the same time remaining close to local particularity: the contemporary Iraqi potter worked hard to create a new art, refusing to base his work on mere imitation of early pieces, producing works that carry the properties of the inherited art and its formal significance in order to create a new identity of his own, empowering his deep-rooted civilization with the goal of forming ideas of national identity on the one hand, and of entity and locality on the other. Europeans such as (Picasso, Henry Moore, Barbara Hepworth, Brancusi, Hans Arp, etc.) also drew on the direction of our civilization. From this point of view comes the importance o