Document clustering is the process of organizing an electronic corpus of documents into subgroups with similar text features. A number of conventional algorithms have been applied to perform document clustering, and there are ongoing endeavors to enhance clustering performance by employing evolutionary algorithms, making this an emerging topic that has gained increasing attention in recent years. The aim of this paper is to present an up-to-date and self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. It then presents and analyzes the principal research work on this topic. Finally, it compiles and classifies the various objective functions, the core of the evolutionary algorithms, from the related collection of research papers. The paper ends by addressing some important issues and challenges that could be the subject of future work.
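The abstract above does not name specific objective functions; as a minimal illustration of the kind of fitness function evolutionary document-clustering algorithms commonly optimize, the sketch below scores a candidate cluster assignment by the average cosine similarity of documents to their cluster centroid. All function names and the tiny term-frequency documents are hypothetical, not drawn from the paper:

```python
from collections import Counter
import math

def cosine(a, b):
    # cosine similarity over sparse term-frequency dicts
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cohesion_fitness(docs, labels, k):
    # fitness of a cluster assignment: mean similarity of each
    # document to the centroid of its assigned cluster
    total, count = 0.0, 0
    for c in range(k):
        members = [docs[i] for i in range(len(docs)) if labels[i] == c]
        if not members:
            continue
        centroid = Counter()
        for d in members:
            centroid.update(d)
        for t in centroid:
            centroid[t] /= len(members)
        for d in members:
            total += cosine(d, centroid)
            count += 1
    return total / count if count else 0.0

docs = [Counter("data mining text".split()),
        Counter("text clustering data".split()),
        Counter("stock market returns".split())]
print(round(cohesion_fitness(docs, [0, 0, 1], 2), 3))
```

An evolutionary algorithm would evolve the `labels` vector, keeping assignments with higher fitness.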
The aim of this paper is to use a single-index model in developing and adjusting the Fama-MacBeth model. The adjustment was estimated with the penalized smoothing spline regression technique (SIMPLS). Two generalized cross-validation techniques, Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV), were used to select the smoothing parameter for this technique. Owing to the two-step nature of the Fama-MacBeth model, this estimation generated four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and their implication for excess stock returns and portfolio return
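The classical two-step Fama-MacBeth procedure that the paper adjusts can be sketched with plain OLS. This is the textbook estimator, not the paper's SIMPLS variant, and the asset returns below are hypothetical:

```python
import statistics

def ols_slope_intercept(x, y):
    # one-regressor OLS: returns (intercept, slope)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    return my - slope * mx, slope

def fama_macbeth(excess_returns, market_premium):
    # Step 1: time-series regression per asset -> beta estimates
    betas = {a: ols_slope_intercept(market_premium, r)[1]
             for a, r in excess_returns.items()}
    # Step 2: cross-sectional regression each period -> per-period premia
    assets = list(excess_returns)
    xs = [betas[a] for a in assets]
    gammas = [ols_slope_intercept(xs, [excess_returns[a][t] for a in assets])[1]
              for t in range(len(market_premium))]
    # the Fama-MacBeth premium estimate is the time average of the gammas
    return betas, statistics.fmean(gammas)

market = [0.01, -0.02, 0.03, 0.005]        # hypothetical market premia
rets = {"A": [1.5 * m for m in market],    # asset constructed with beta 1.5
        "B": [0.5 * m for m in market],    # beta 0.5
        "C": [1.0 * m for m in market]}    # beta 1.0
betas, premium = fama_macbeth(rets, market)
```

The two-step structure is what yields the paper's four estimator combinations: each step can independently use FGCV- or GGCV-selected smoothing.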
The researchers of the present study have conducted a genre analysis of two political debates between American presidential nominees in the 2016 and 2020 elections. The study seeks to analyze the cognitive construction of political debates in order to evaluate the typical moves and strategies politicians use to express their communicative intentions and to reveal the language manifestations of those moves and strategies. To achieve the study's aims, the researchers adopt Bhatia's (1993) framework of cognitive construction, supported by van Eemeren's (2010) pragma-dialectical framework. The study demonstrates that both presidents adhere to this genre structuring to further their political agendas. For a positive and promising image
The present study discusses problem-based learning in Iraqi classrooms. This learner-centered method aims to involve all learners in collaborative activities. The study verifies the hypothesis, which reads as follows: "It is hypothesized that there are no statistically significant differences between the achievements of the experimental group and the control group." Thirty learners were selected as the sample of the present study. The Mann-Whitney test for two independent samples was used to analyze the results. The analysis shows that the experimental group's members, who were taught according to problem-based learning, obtained higher scores than the control group's members, who were taught according to the traditional method. This
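The Mann-Whitney U statistic used in the study can be computed by pairwise comparison of the two independent samples, with ties counting one half. The scores below are hypothetical, not the study's data:

```python
def mann_whitney_u(a, b):
    # U for sample a: count pairwise wins over sample b (ties count 0.5)
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

exp = [78, 85, 90, 88, 76]   # hypothetical experimental-group scores
ctl = [70, 65, 80, 72, 60]   # hypothetical control-group scores
u_exp = mann_whitney_u(exp, ctl)
u_ctl = mann_whitney_u(ctl, exp)
# the two U values always sum to len(exp) * len(ctl)
print(u_exp, u_ctl)
```

A U value far from n1*n2/2 (here 12.5) indicates one group stochastically dominates the other; significance would be read from a U table or a normal approximation.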
The development of Web 2.0 has improved people's ability to share their opinions, and these opinions serve as an important source of knowledge for other reviewers. To figure out what the opinions are about, an automatic system of analysis is needed. Aspect-based sentiment analysis is the research topic most concerned with extracting reviewers' opinions about a certain attribute, i.e., an opinion target (aspect). In aspect-based tasks, the identification of implicit aspects, i.e., aspects implicitly implied in a review, is the most challenging task to accomplish. This paper strives to identify implicit aspects using a hierarchical algorithm incorporating common-sense knowledge by means of dimensionality reduction.
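Implicit-aspect identification maps clue words that never name the aspect (e.g. "expensive" implies the price aspect) to an aspect category. A toy sketch using a hand-built common-sense lexicon, far simpler than the paper's hierarchical algorithm, and with an entirely hypothetical lexicon, might look like:

```python
# hypothetical seed lexicon linking common-sense clue words to aspects
ASPECT_CLUES = {
    "price":   {"expensive", "cheap", "overpriced", "affordable"},
    "battery": {"lasts", "charge", "drains", "hours"},
    "screen":  {"bright", "resolution", "glare", "crisp"},
}

def implicit_aspect(review):
    # score each aspect by clue-word overlap with the review tokens
    tokens = set(review.lower().split())
    scores = {a: len(tokens & clues) for a, clues in ASPECT_CLUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(implicit_aspect("way too expensive for what you get"))  # price
```

Real systems replace the exact-overlap score with similarity in a reduced semantic space, which is where the paper's dimensionality reduction comes in.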
The Digital Elevation Model (DEM) is one of the established techniques for relief representation: a DEM is a model of the earth's surface constructed from existing data. DEMs serve as one of the fundamental information requirements generally utilized in GIS data structures. The main aim of this research is to present a methodology for assessing DEM generation methods. The DEM data will be extracted from open-source data, e.g., Google Earth, and the tested data will be compared with data produced by formal institutions such as the General Directorate of Surveying. The study area was chosen in the south of Iraq (Al-Gharraf / Dhi Qar governorate). The DEM creation methods include kriging and IDW (inverse distance weighting)
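Of the interpolation methods named, IDW is the simplest to illustrate: each query point's elevation is a distance-weighted average of the sampled elevations, with weights 1/d^p. The sample coordinates below are hypothetical, not the study's survey data:

```python
import math

def idw(points, q, power=2):
    # points: list of (x, y, z) samples; q: (x, y) query location
    num = den = 0.0
    for x, y, z in points:
        d = math.hypot(x - q[0], y - q[1])
        if d == 0:
            return z  # query coincides with a sample point
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

pts = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 30.0)]
print(round(idw(pts, (5, 0)), 2))
```

Note that IDW output is always bounded by the sample extremes, whereas kriging additionally models spatial correlation through a variogram.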
Researchers employ behavior-based malware detection models that depend on tracking and analyzing API features to identify suspected PE applications. These malware behavior models are more efficient than signature-based malware detection systems for detecting unknown malware, because a simple polymorphic or metamorphic malware can easily defeat signature-based detection systems. The growing number of computer malwares and their detection have long been a concern for security researchers. The use of logic formulae to model malware behaviors is one of the most encouraging recent developments in malware research, providing an alternative to classic virus detection methods. To address the l
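A minimal sketch of behavior-based detection via API tracking: check whether a recorded API-call trace contains a known-suspicious ordered subsequence. The remote-thread-injection pattern below is a widely cited illustrative signature, not one taken from this paper, and the trace is hypothetical:

```python
def contains_subsequence(trace, pattern):
    # does the API-call trace contain pattern as an ordered subsequence?
    it = iter(trace)
    return all(call in it for call in pattern)

# illustrative behavior signature: code injection via a remote thread
INJECTION_PATTERN = ["OpenProcess", "VirtualAllocEx",
                     "WriteProcessMemory", "CreateRemoteThread"]

trace = ["LoadLibrary", "OpenProcess", "VirtualAllocEx", "ReadFile",
         "WriteProcessMemory", "CreateRemoteThread", "CloseHandle"]
print(contains_subsequence(trace, INJECTION_PATTERN))  # True
```

Because the match is on behavior order rather than on byte signatures, simple polymorphic rewriting of the binary does not evade it; the logic-formula approaches the abstract mentions generalize this idea to richer temporal properties.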