Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms suffer from degraded computational efficiency because they mine a large number of association rules that are not appropriate for a given user. Recent research in ARM investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of our algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
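As a minimal sketch of how metaheuristic ARM methods typically score a candidate rule, the snippet below evaluates a rule by support and confidence; the weighting and the helper names are illustrative assumptions, not the exact DCS-ARM objective from the paper.

```python
# Sketch of a rule-quality fitness as commonly used in metaheuristic ARM.
# The weighted sum of support and confidence is an assumption for illustration.
def rule_fitness(antecedent, consequent, transactions, w_sup=0.5, w_conf=0.5):
    antecedent, consequent = set(antecedent), set(consequent)
    n = len(transactions)
    n_ante = sum(1 for t in transactions if antecedent <= set(t))
    n_both = sum(1 for t in transactions if (antecedent | consequent) <= set(t))
    support = n_both / n if n else 0.0
    confidence = n_both / n_ante if n_ante else 0.0
    return w_sup * support + w_conf * confidence

transactions = [["bread", "milk"], ["bread", "butter", "milk"], ["milk", "eggs"]]
print(rule_fitness(["bread"], ["milk"], transactions))
```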
The concept of training is no longer understood as being limited to organizing traditional training courses; it has become a strategic choice in investing in and developing the human resources system. The study tries to answer the core research problem, which is the extent to which the training process, in its traditional form, meets the company's needs for developing intellectual capital. This research aims to determine the impact of the dimensions of the training process (the role of training, top management support, training programs, and modern technology) on the components of intellectual capital (human capital, structural capital, and customer capital) and to provide the company's top management with a basis for the development of sci…
The research problem can be summarized as the absence of a system or model for evaluating the financial performance of municipal departments, so that neither an individual nor the institution itself can determine the extent of its success or failure in financial terms. Financial performance assessment is a necessity, and without it the evaluation of the work remains incomplete. Hence, this study sets out to examine, diagnose, and analyze the financial data of a sample of municipal departments in order to develop a model for assessing their financial performance.
There is a great number of image processing systems being used and developed on a daily basis. Those systems require the deployment of some basic operations, such as detecting regions of interest, matching those regions, and describing their properties. These operations play a significant role in the decision making that drives subsequent operations, depending on the assigned task. Various algorithms have been introduced over the years to accomplish these tasks. One of the most popular is the Scale Invariant Feature Transform (SIFT). The efficiency of this algorithm lies in its performance in the detection and property-description processes, which is due to the fact that …
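For illustration, a minimal sketch of SIFT detection, description, and matching using OpenCV is given below; the image file names are placeholders and the ratio-test threshold is a common default, not a value taken from the paper.

```python
# Sketch of SIFT keypoint detection, description, and matching with OpenCV.
import cv2

img1 = cv2.imread("scene_a.png", cv2.IMREAD_GRAYSCALE)  # placeholder paths
img2 = cv2.imread("scene_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)  # keypoints + 128-d descriptors
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep those passing Lowe's ratio test.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
print(f"{len(kp1)} / {len(kp2)} keypoints, {len(good)} good matches")
```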
Hospitals are service organizations of particular importance at the level of individuals because they are tied to people's health and daily lives. Nursing is one of the important services provided by hospitals, and nurses are the human resource that delivers this service. From this standpoint, the idea of the research is to prepare a work schedule for nurses in a scientific way, in order to improve the operational performance of their services and to provide an efficient service available 24 hours a day. The research uses one of the modern, scientific scheduling rules, namely the "schedule of …
Energy efficiency is a significant consideration in designing robust routing protocols for wireless sensor networks (WSNs). A reliable routing protocol has to be energy efficient and adaptive to the network size. To achieve high energy conservation and data aggregation, there are two major techniques: clusters and chains. In the clustering technique, sensor networks are divided into non-overlapping subsets called clusters. In the chain technique, each sensor node is connected to its two closest neighbors, starting with the node farthest from the base station and ending with the node closest to it. Each technique has its own advantages and disadvantages, which has motivated some researchers to come up with a hybrid routing algorithm.
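A minimal sketch of the greedy chain construction described above (as in PEGASIS-style protocols) is shown below; the node coordinates are made-up illustrative values, not data from the paper.

```python
# Greedily build a chain starting from the node farthest from the base station,
# repeatedly appending the closest remaining node.
import math

def build_chain(nodes, base_station):
    remaining = list(nodes)
    start = max(remaining, key=lambda n: math.dist(n, base_station))
    chain = [start]
    remaining.remove(start)
    while remaining:
        nxt = min(remaining, key=lambda n: math.dist(n, chain[-1]))  # closest neighbor
        chain.append(nxt)
        remaining.remove(nxt)
    return chain

nodes = [(2, 9), (8, 1), (5, 5), (1, 2), (9, 8)]  # illustrative sensor positions
print(build_chain(nodes, base_station=(0, 0)))
```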
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of the traditional approaches is that they are strictly constrained (especially for Arabic) by the accuracy of sentence feature functions, weighting schemes, and similarity calculations. On the other hand, meta-heuristic search approaches have a feature that …
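To make the notion of sentence feature functions and weighting schemes concrete, the sketch below scores sentences by a weighted sum of simple features; the features, weights, and sample text are illustrative assumptions, not the scheme used in the paper.

```python
# Sketch of weighted sentence-feature scoring for extractive summarization.
def score_sentence(sentence, position, doc_len, title_words, weights=(0.4, 0.3, 0.3)):
    words = set(sentence.lower().split())
    f_title = len(words & title_words) / (len(title_words) or 1)  # title overlap
    f_position = 1.0 - position / doc_len                         # earlier sentences score higher
    f_length = min(len(words) / 20.0, 1.0)                        # favor reasonably long sentences
    return sum(w * f for w, f in zip(weights, (f_title, f_position, f_length)))

sentences = ["Arabic summarization is hard.", "This system picks key sentences.", "The end."]
title = {"arabic", "summarization"}
ranked = sorted(range(len(sentences)),
                key=lambda i: score_sentence(sentences[i], i, len(sentences), title),
                reverse=True)
print([sentences[i] for i in ranked[:2]])  # top-2 sentence extract
```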
Scheduling course timetables for large departments in universities is a very hard problem that has often been tackled in previous work, although the results are only partially optimal. This work applies the principles of an evolutionary algorithm, using genetic operators, to solve the timetabling problem and obtain a fully optimized timetable, with the ability to generate multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the space of constraints, so as to obtain an optimal and flexible schedule with no redundancy through changes to a feasible course timetable. The main contribution of this work is increasing the flexibility of generating opti…
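As a minimal sketch of one genetic-algorithm iteration for timetabling, the code below uses a one-timeslot-per-course chromosome and a clash-count fitness; this encoding, the constraint set, and the operator choices are illustrative assumptions rather than the exact scheme of this work.

```python
# Toy genetic algorithm for course timetabling: minimize resource clashes.
import random

COURSES = ["C1", "C2", "C3", "C4"]
SLOTS = list(range(10))                      # available timeslots
SHARED_ROOM = {("C1", "C2"), ("C3", "C4")}   # course pairs that must not clash

def fitness(chromosome):
    clashes = sum(1 for a, b in SHARED_ROOM if chromosome[a] == chromosome[b])
    return -clashes                           # fewer clashes -> higher fitness

def crossover(p1, p2):
    return {c: random.choice((p1[c], p2[c])) for c in COURSES}

def mutate(ch, rate=0.1):
    return {c: (random.choice(SLOTS) if random.random() < rate else s) for c, s in ch.items()}

population = [{c: random.choice(SLOTS) for c in COURSES} for _ in range(20)]
for _ in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                 # elitist selection
    population = parents + [mutate(crossover(*random.sample(parents, 2))) for _ in range(10)]
print(population[0], fitness(population[0]))
```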
Regression testing is expensive and therefore calls for optimization. Typically, test case optimization means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many former studies have relied on heuristic mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures for handling the issue of tied test cases. Moreover, evolutionary algorithms such as genetic algorithms often help reduce the number of test cases while concurrently decreasing computational runtime. However, when the fault detection capability along with other parameters must be examined, the method falls short …
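For context, a minimal sketch of greedy test-case prioritization by additional coverage with a deterministic tie-break is shown below; the coverage data and tie-breaking rule are illustrative assumptions, not taken from the studies discussed.

```python
# Order tests so each next test covers the most still-uncovered items;
# ties are broken deterministically by test id.
def prioritize(coverage):
    """coverage: dict test_id -> set of covered requirements/faults."""
    remaining = dict(coverage)
    uncovered = set().union(*coverage.values())
    order = []
    while remaining:
        best = max(sorted(remaining), key=lambda t: len(remaining[t] & uncovered))
        order.append(best)
        uncovered -= remaining.pop(best)
    return order

coverage = {"t1": {1, 2}, "t2": {2, 3, 4}, "t3": {1, 4}, "t4": {5}}
print(prioritize(coverage))  # ['t2', 't1', 't4', 't3']
```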
Krawtchouk polynomials (KPs) and their moments are promising tools for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge for existing KP recurrence algorithms is numerical error, which occurs when computing the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 towards 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs at high orders. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the …
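For context, the classical three-term recurrence in the order n for the (unnormalized) Krawtchouk polynomials is stated below; this is the standard relation, not the new recurrence proposed in the paper.

```latex
% Classical three-term recurrence for Krawtchouk polynomials K_n(x;p,N)
% (standard relation shown for context only).
\[
  -x\,K_n(x;p,N) \;=\; p(N-n)\,K_{n+1}(x;p,N)
  \;-\;\bigl[p(N-n) + n(1-p)\bigr]K_n(x;p,N)
  \;+\; n(1-p)\,K_{n-1}(x;p,N),
\]
\[
  K_0(x;p,N) = 1, \qquad K_1(x;p,N) = 1 - \frac{x}{pN}.
\]
```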