Recommendation systems are now used to address information overload in sectors such as entertainment, social networking, and e-commerce. Although conventional approaches to recommendation have achieved significant success in providing item suggestions, they still face many challenges, including the cold-start problem and data sparsity. Numerous recommendation models have been created to address these difficulties; nevertheless, incorporating user- or item-specific information can further enhance recommendation performance. The present work introduces ConvFM, a novel hybrid deep factorization machine (FM) model: a convolutional neural network (CNN) architecture that combines the feature-extraction capabilities of deep learning with the effectiveness of factorization machines for recommendation tasks. ConvFM uses CNNs to extract features from both users and items (here, movies), and then feeds these features into a factorization machine. Focusing the CNN on feature extraction yields a notable improvement in performance. To enhance prediction accuracy and address the challenges posed by sparsity, the proposed model incorporates both the extracted features and explicit user-item interactions. This paper presents the experimental procedures and outcomes on the MovieLens dataset, analyzes the research findings, and provides recommendations for further work.
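The factorization machine component mentioned above can be sketched as follows. This is a minimal, generic second-order FM prediction (Rendle's O(k·n) reformulation of the pairwise term), not the paper's exact ConvFM formulation; all dimensions and values are illustrative.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine prediction:
    y = w0 + sum_i w_i x_i + sum_{i<j} <v_i, v_j> x_i x_j,
    with the pairwise term computed in O(k*n) as
    0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ]."""
    linear = w0 + w @ x
    s = V.T @ x                                  # per-factor weighted sums, shape (k,)
    pairwise = 0.5 * np.sum(s**2 - (V**2).T @ (x**2))
    return linear + pairwise

rng = np.random.default_rng(0)
n, k = 6, 3                                      # illustrative sizes
x = rng.random(n)                                # feature vector (e.g. user/item features)
w0, w, V = 0.1, rng.random(n), rng.random((n, k))

# Sanity check against the naive O(n^2) pairwise sum.
naive = w0 + w @ x + sum(
    (V[i] @ V[j]) * x[i] * x[j] for i in range(n) for j in range(i + 1, n)
)
print(abs(fm_predict(x, w0, w, V) - naive) < 1e-9)
```

In ConvFM-style models, the vector `x` would be built from CNN-extracted user and item features rather than raw inputs.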
The present study investigates problem-based learning in Iraqi classrooms, a learner-centered method that aims to involve all learners in collaborative activities. To fulfill the aims and verify the hypothesis, which reads as follows: "It is hypothesized that there is no statistically significant difference between the achievement of the experimental group and the control group", thirty learners were selected as the sample of the present study. The Mann-Whitney test for two independent samples was used to analyze the results. The analysis shows that members of the experimental group, who were taught according to problem-based learning, achieved higher scores than members of the control group, who were taught according to the traditional method. This
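The Mann-Whitney U statistic used above can be computed directly from pairwise comparisons. The sketch below uses hypothetical scores (the study's actual data are not reproduced here) for two independent groups of fifteen learners each.

```python
def mann_whitney_u(a, b):
    # U statistic for sample `a`: over all pairs (x in a, y in b), count how
    # often x exceeds y, with ties counting one half. A U close to
    # len(a) * len(b) indicates `a` tends to score higher than `b`.
    return sum((x > y) + 0.5 * (x == y) for x in a for y in b)

# Hypothetical achievement scores (illustrative only).
experimental = [78, 85, 90, 72, 88, 95, 81, 77, 84, 92, 79, 86, 91, 83, 87]
control      = [65, 70, 58, 74, 62, 69, 71, 60, 66, 73, 59, 68, 64, 61, 67]

u = mann_whitney_u(experimental, control)
print(u, "out of", len(experimental) * len(control))  # 223 out of 225
```

The observed U is then compared against a critical value (or a normal approximation for larger samples) to decide whether to reject the null hypothesis of no difference.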
The aim of this paper is to present a new methodology for finding the RSA private key. A new initial value, generated from a new equation, is selected to speed up the process; once this value is found, a brute-force attack is used to discover the private key. In addition, in the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is set to 1, which implies that the equation estimating the initial value is suitable for this small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key
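The idea of brute-forcing the private exponent outward from an initial estimate can be sketched on a toy modulus. The parameters and the estimate below are illustrative, not the paper's equation: with multiplier k = 1 the private exponent satisfies e·d = φ(n) + 1, so an attacker who approximates φ(n) by n gets a starting point and walks outward until e·d ≡ 1 (mod φ(n)).

```python
# Toy RSA parameters (far too small for real security).
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)            # n = 3233, phi = 3120

# Initial estimate of d, assuming phi(n) ~ n and multiplier 1:
# d ~ (phi + 1) / e ~ (n + 1) / e.
start = (n + 1) // e

# Brute force: test candidates at increasing distance from the estimate.
d = None
for offset in range(n):
    for cand in (start - offset, start + offset):
        if cand > 0 and (e * cand) % phi == 1:
            d = cand
            break
    if d is not None:
        break

print(d)  # 2753, the private exponent for this toy key
```

The number of candidates tested equals the distance between the initial value and the true private key, which is exactly the quantity the abstract reports on.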
Crime is an unlawful activity of any kind and is punished by law. Crime affects a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate, which helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston police department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether traveling to a specific area or living
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is a family of TCP techniques that considers past execution data to prioritize test cases. Allocating equal priority to multiple test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve it in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
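The APFD metric mentioned above has a standard closed form, sketched below on a hypothetical four-test, three-fault suite (test and fault names are illustrative, not from the study).

```python
def apfd(order, fault_matrix):
    """Average Percentage of Fault Detection for a test-case ordering.

    order        : list of test-case ids, in execution order
    fault_matrix : dict mapping test id -> set of faults that test detects

    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n),
    where TF_i is the 1-based position of the first test revealing fault i,
    n is the number of tests, and m the number of faults.
    """
    n = len(order)
    faults = set().union(*fault_matrix.values())
    m = len(faults)
    first_pos = {}
    for pos, tid in enumerate(order, start=1):
        for f in fault_matrix.get(tid, ()):
            first_pos.setdefault(f, pos)
    tf_sum = sum(first_pos[f] for f in faults)
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

# Hypothetical suite: which faults each test detects.
detects = {"t1": {"f1"}, "t2": {"f2", "f3"}, "t3": set(), "t4": {"f1"}}

good = apfd(["t2", "t1", "t3", "t4"], detects)   # fault-revealing tests first
poor = apfd(["t3", "t4", "t2", "t1"], detects)   # fault-revealing tests last
print(good > poor)
```

A prioritization that surfaces fault-revealing tests earlier yields a higher APFD, which is how competing TCP techniques are compared.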
Metaheuristics in the swarm intelligence (SI) class have proven efficient and have become popular methods for solving different optimization problems. Based on their usage of memory, metaheuristics can be classified into algorithms with memory and without memory (memory-less). The absence of memory causes some metaheuristics to lose the information gained in previous iterations; such metaheuristics tend to drift away from promising areas of the solution search space, which leads to non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. The investigation covers SI metaheuristics, memory usage in memory-based and memory-less metaheuristics, memory char
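A concrete example of memory in an SI metaheuristic is particle swarm optimization, where each particle remembers its personal best and the swarm its global best; a memory-less variant would discard these between iterations. The sketch below is a minimal textbook PSO on the sphere function, with illustrative coefficient choices.

```python
import random

def pso(f, dim, n_particles=20, iters=100, seed=0):
    """Minimal PSO; pbest/gbest are the algorithm's memory structures."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # per-particle memory
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-level memory
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:                   # update personal memory
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:                  # update global memory
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

sphere = lambda x: sum(c * c for c in x)
best, val = pso(sphere, dim=3)
print(val)  # close to the optimum 0.0
```

Removing the `pbest`/`gbest` bookkeeping would make the search memory-less: particles would no longer be pulled back toward previously discovered promising regions, illustrating the performance effect the review examines.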
In this research, a recently developed practical modeling technique is applied to identify the glucose regulation system. Using this technique, a set of mathematical models is obtained instead of a single one, to compensate for the loss of information caused by the optimization step in curve-fitting algorithms; the diversity of members within the set is interpreted in terms of a restricted range of its parameters. A diagnostic criterion is also developed for detecting disorders of the glucose regulation system by investigating the influence of parameter variations on the system's response. The technique is applied in this research to 20 practical cases in association with the National Center for
Consistent with tax thought, a unified tax is a natural evolution: its application extends taxation to all branches and sources of income and takes part of that income through ascending (progressive) rates, which amounts to tax reform. Taxes on total income give a clear picture of the taxpayer's total income, financial situation, and family burden, which allows granting exemptions and deductions and applying rates that fit each case. This requires reconsidering the structure of the tax system in force and moving from a system of specific taxes to a tax system on total income, integrating income from the rental of re
Abstract
The research examines the impact of applying some just-in-time production system tools, which can be adapted to the service sector (the banking sector), on improving and increasing the quality of banking services. The research problem lies in the low quality of banking services provided to customers because of the reliance on traditional banking systems in service provision and the failure to keep pace with global developments in the banking industry. The goal of the research is to clarify the applicability of the just-in-time production system in the service sector and th
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease, because an abundance of unrelated and redundant features increases computational complexity and hurts accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy through an Extra Tree feature selection based technique. The study assesses the efficac
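Extra-Trees-based feature selection of the kind described can be sketched with scikit-learn. The snippet below uses a synthetic stand-in for a clinical dataset (the study's actual heart-disease data are not reproduced here), and the threshold choice is illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel

# Synthetic dataset: 200 samples, 20 features, only 5 of them informative.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           n_redundant=5, random_state=0)

# Fit an Extra Trees ensemble and keep features whose impurity-based
# importance exceeds the mean importance (SelectFromModel's default).
trees = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y)
selector = SelectFromModel(trees, prefit=True)
X_reduced = selector.transform(X)

print(X.shape, "->", X_reduced.shape)  # fewer, more discriminative columns
```

A downstream classifier trained on `X_reduced` then works with a smaller, more discriminative feature set, which is the complexity/accuracy trade-off the study targets.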