Nowadays, it is convenient to use a search engine to obtain the information we need, but results are sometimes misleading because different media outlets report the same information differently. Recommender Systems (RS) are popular in business because they provide users with information that attracts more revenue for companies; however, such systems sometimes recommend information users do not need. For this reason, this paper proposes a recommender-system architecture based on user-oriented preference, called UOP-RS. To make the UOP-RS meaningful, this paper focuses on movie-theatre information and collects its movie database from the IMDb website, which provides information related to movies, television programs, home videos, video games, and streaming content, together with many ratings and reviews from users. This paper also analyzes individual user data to extract each user's features. Based on user characteristics, movie ratings/scores, and movie results, a UOP-RS model was built. In our experiment, 5000 IMDb movie records were used and 5 movies were recommended to each user. The results show that the system returns results in 3.86 s with a 14% error on recommended items when training data as . The paper concludes that the system can quickly recommend the goods users need. In the future, the proposed system will be extended to connect with a Chatbot system so that users can make queries faster and more easily from their phones.
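The abstract does not specify how the UOP-RS scores candidate movies. As a minimal sketch of one common approach, the following uses top-N recommendation from user-based cosine similarity over a toy ratings matrix; all values are illustrative and are not the paper's IMDb data or its actual model:

```python
import numpy as np

# Toy ratings matrix (rows = users, columns = movies, 0 = unrated).
# These values are illustrative, not from the paper's IMDb dataset.
ratings = np.array([
    [5, 4, 0, 1, 0, 3],
    [4, 5, 1, 0, 0, 2],
    [1, 0, 5, 4, 4, 0],
    [0, 1, 4, 5, 3, 1],
], dtype=float)

def recommend(user, k=5):
    """Return up to k movie indices the user has not rated,
    ranked by a similarity-weighted average of other users' scores."""
    norms = np.linalg.norm(ratings, axis=1) + 1e-12
    sims = ratings @ ratings[user] / (norms * norms[user])
    sims[user] = 0.0                      # ignore self-similarity
    pred = sims @ ratings / (sims.sum() + 1e-12)
    unrated = np.flatnonzero(ratings[user] == 0)
    return [int(m) for m in sorted(unrated, key=lambda m: -pred[m])[:k]]

print(recommend(0))  # -> [2, 4]
```

With a real dataset, `k=5` matches the five recommendations per user reported in the experiment.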
In Australia, most existing buildings were designed before the release of the Australian standard for earthquake actions in 2007. Therefore, many existing buildings in Australia lack adequate seismic design, and their seismic performance must be assessed. The recent earthquake that struck Mansfield, Victoria, near Melbourne elevated the need to produce fragility curves for existing reinforced concrete (RC) buildings in Australia. Fragility curves are frequently utilized to assess buildings' seismic performance; a fragility curve is defined as the probability that demand surpasses capacity at a given intensity level. Numerous factors can influence the results of the fragility assessment of RC buildings. Among the most important factors that can affe
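Fragility curves of this kind are commonly parameterized as lognormal cumulative distribution functions. A minimal sketch, assuming a median capacity `theta` and dispersion `beta` (illustrative values, not fitted to any real building):

```python
from math import erf, log, sqrt

def fragility(im, theta, beta):
    """Lognormal fragility curve: probability that demand exceeds
    capacity at intensity measure `im` (e.g., spectral acceleration
    in g).  theta = median capacity, beta = lognormal dispersion.
    The parameter values used below are illustrative only."""
    z = log(im / theta) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF of z

# At the median intensity the exceedance probability is 0.5 by definition.
print(round(fragility(0.3, 0.3, 0.5), 2))  # -> 0.5
```

The curve is monotonic in `im`, so higher intensities always give a higher probability of exceedance.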
This research applies quantitative analysis to assess the efficiency of the transport network in Sadr City. The study area suffers from heavy traffic movement owing to the variability of traffic flow and intensity at peak hours, generated by traffic both inside the area and outside of it, especially in populated neighborhoods with concentrated economic activity.
Abstract
A hexapod robot is a flexible mechanical robot with six legs that has the ability to walk over terrain. Because the hexapod robot resembles an insect, it uses the same gaits: the tripod, wave, and ripple gaits. The hexapod robot needs to remain statically stable at all times during each gait so as not to fall, with three or more legs continuously in contact with the ground. The margin of safe static stability during walking is called the stability margin. In this paper, the forward and inverse kinematics are derived for each of the hexapod's legs in order to simulate the walking hexapod robot model in MATLAB R2010a for all gaits, and the geometry is used to derive the equations of the sub-constraint workspaces for each
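As a rough illustration of the leg kinematics described above, the sketch below solves the inverse kinematics of a generic 3-DOF (coxa, femur, tibia) leg with hypothetical link lengths and checks the solution with forward kinematics. It is not the paper's MATLAB model:

```python
from math import atan2, acos, cos, sin, hypot, pi

# Illustrative link lengths (coxa, femur, tibia) in cm, not the
# actual dimensions of the paper's robot.
L1, L2, L3 = 5.0, 10.0, 12.0

def leg_ik(x, y, z):
    """Inverse kinematics for one 3-DOF hexapod leg: foot target
    (x, y, z) in the leg's base frame -> joint angles in radians."""
    theta1 = atan2(y, x)                      # coxa yaw
    r = hypot(x, y) - L1                      # radial reach past the coxa
    d = hypot(r, z)                           # femur-joint-to-foot distance
    # Law of cosines on the femur/tibia/d triangle.
    a = acos((L2**2 + d**2 - L3**2) / (2 * L2 * d))
    b = acos((L2**2 + L3**2 - d**2) / (2 * L2 * L3))
    theta2 = atan2(z, r) + a                  # femur pitch
    theta3 = b - pi                           # tibia pitch (0 = straight leg)
    return theta1, theta2, theta3

def leg_fk(t1, t2, t3):
    """Forward kinematics, used here to check the IK solution."""
    r = L1 + L2 * cos(t2) + L3 * cos(t2 + t3)
    z = L2 * sin(t2) + L3 * sin(t2 + t3)
    return r * cos(t1), r * sin(t1), z

t = leg_ik(15.0, 5.0, -8.0)
print([round(v, 3) for v in leg_fk(*t)])  # -> [15.0, 5.0, -8.0]
```

Reachability requires |L2 - L3| < d < L2 + L3; outside that range `acos` raises a `ValueError`, which corresponds to a foot target outside the leg's workspace.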
Cloud computing is a mass platform serving high volumes of data to multiple devices and numerous technologies. Cloud tenants demand fast access to their data without any disruptions. Therefore, cloud providers are struggling to ensure that every individual piece of data is secured and always accessible. Hence, an appropriate replication strategy capable of selecting essential data is required in cloud replication environments as the solution. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the enhancement in replication performance. The obtained an
Solid waste is a major issue in today's world and a contributing factor to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is utilized to determine the levels of discarded and reclaimed solid waste. SSA is a recent optimization technique that finds the ideal solution of a mathematical relationship based on leaders and followers. It takes many random solutions, as well as their outward or inward fluctuations, t
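A minimal sketch of the leader/follower mechanism the abstract describes, applied to a toy stand-in cost function (the paper's waste-management model is not reproduced here):

```python
import random
import math

def ssa(obj, lb, ub, n=30, iters=200, seed=1):
    """Salp swarm algorithm sketch: a leader explores around the
    best-known food source, and each follower moves to the midpoint
    between itself and its predecessor in the chain."""
    random.seed(seed)
    dim = len(lb)
    salps = [[random.uniform(lb[j], ub[j]) for j in range(dim)]
             for _ in range(n)]
    food, food_val = None, float("inf")
    for l in range(1, iters + 1):
        for s in salps:                       # track the best solution seen
            v = obj(s)
            if v < food_val:
                food, food_val = s[:], v
        c1 = 2 * math.exp(-(4 * l / iters) ** 2)  # exploration -> exploitation
        for i, s in enumerate(salps):
            for j in range(dim):
                if i == 0:                    # leader update around the food
                    step = c1 * ((ub[j] - lb[j]) * random.random() + lb[j])
                    s[j] = food[j] + step if random.random() < 0.5 else food[j] - step
                else:                         # follower chain update
                    s[j] = (s[j] + salps[i - 1][j]) / 2
                s[j] = min(max(s[j], lb[j]), ub[j])
    return food, food_val

# Toy stand-in for a waste-management cost function (not the paper's model).
best, cost = ssa(lambda x: sum(v * v for v in x), lb=[-10, -10], ub=[10, 10])
print(best, cost)
```

The decaying coefficient `c1` is what produces the "outward or inward fluctuations" the abstract mentions: large early steps explore, small late steps exploit.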
The aim of this paper is to approximate multidimensional functions using the type of feedforward neural network (FFNN) called greedy radial basis function neural networks (GRBFNNs). We also introduce a modification to the greedy algorithm used to train the greedy radial basis function neural networks. An error bound is introduced in Sobolev space. Finally, a comparison is made between three algorithms: the modified greedy algorithm, the backpropagation algorithm, and the result published in [16].
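The paper's algorithm is not reproduced here, but the greedy idea, repeatedly adding the center that best explains the current residual and then refitting all weights, can be sketched on a toy 1-D problem as follows (Gaussian basis; widths and sizes illustrative):

```python
import numpy as np

def gaussian(r, width=1.0):
    return np.exp(-(r / width) ** 2)

def greedy_rbf_fit(X, y, n_centers=5, width=0.5):
    """Greedy RBF sketch: at each step, add the training point where
    the current residual is largest as a new center, then refit every
    weight by least squares."""
    centers = []
    residual = y.copy()
    weights = np.zeros(0)
    for _ in range(n_centers):
        centers.append(X[np.argmax(np.abs(residual))])
        Phi = gaussian(np.abs(X[:, None] - np.array(centers)[None, :]), width)
        weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        residual = y - Phi @ weights
    return np.array(centers), weights

# Toy 1-D target (illustrative, not the paper's multidimensional setup).
X = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * X)
centers, w = greedy_rbf_fit(X, y, n_centers=8)
Phi = gaussian(np.abs(X[:, None] - centers[None, :]), 0.5)
print(float(np.max(np.abs(Phi @ w - y))))   # maximum pointwise error
```

The greedy selection makes the network size grow one basis function at a time, which is what allows error bounds in terms of the number of centers.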
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying the Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. Then an appropriate cluster head (CH) is elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet-loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le
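The clustering step can be illustrated with a plain fuzzy C-means iteration (membership update followed by centroid update); the points below are illustrative, not the paper's sensor deployment:

```python
import numpy as np

def fcm(points, c=2, m=2.0, iters=100, seed=0):
    """Fuzzy C-means sketch: alternate centroid and membership updates.
    `m` is the fuzzifier (m > 1); memberships in each row sum to 1."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ points) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :],
                           axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))        # closer -> higher membership
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Two illustrative sensor groups (not the paper's deployment data).
pts = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
centers, u = fcm(pts, c=2)
print(np.round(centers, 1))
```

In the VFCA setting, `pts` would be the Voronoi cells' sensor positions, and the soft memberships allow a sensor near a boundary to be weighted between neighboring clusters rather than hard-assigned.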
This work presents the simulation of a Low-Density Parity-Check (LDPC) coding scheme with a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over Additive White Gaussian Noise (AWGN) and multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency within ten iterations. The modulation schemes used are Phase Shift Keying (BPSK, QPSK, and 16-PSK), along with Orthogonal Frequency Division Multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for channel effects. The channel model used is the Long Term Evolution (LTE) channel with Technical Specification TS 25.101 v2.10 and 5 MHz bandwidth, including the chan
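As a baseline against which coded MC-CDMA results of this kind are usually compared, a Monte-Carlo bit-error-rate simulation of uncoded BPSK over AWGN can be sketched as follows (parameters illustrative; this is not the paper's simulator):

```python
import random
import math

def bpsk_awgn_ber(ebn0_db, n_bits=200_000, seed=7):
    """Monte-Carlo BER of uncoded BPSK over an AWGN channel,
    the usual reference curve for coded-system comparisons."""
    random.seed(seed)
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))        # noise std deviation for Eb = 1
    errors = 0
    for _ in range(n_bits):
        bit = random.choice((0, 1))
        x = 1.0 if bit else -1.0             # BPSK mapping: 1 -> +1, 0 -> -1
        y = x + random.gauss(0.0, sigma)     # AWGN channel
        errors += (y > 0) != bool(bit)       # hard decision at zero
    return errors / n_bits

# Closed-form BER for BPSK/AWGN: 0.5 * erfc(sqrt(Eb/N0)), here at 6 dB.
theory = 0.5 * math.erfc(math.sqrt(10 ** (6 / 10)))
print(bpsk_awgn_ber(6.0), round(theory, 5))
```

The simulated rate should scatter around the closed-form value; an LDPC-coded curve would sit well below it at the same Eb/N0.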
In this paper, a decoder for a binary BCH code of length n = 127 bits with multiple-error-correction capability is implemented on a PIC microcontroller; results are presented for correcting up to 13 errors. The Berlekamp-Massey decoding algorithm was chosen for its efficiency. The PIC18F45K22 microcontroller was chosen for the implementation and programmed in assembly language to achieve the highest performance. This makes the BCH decoder implementable as a low-cost module that can be used as part of larger systems. The performance evaluation is presented in terms of the total number of instructions and the bit rate.
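The heart of such a decoder is the Berlekamp-Massey step. The sketch below shows the algorithm over GF(2), finding the shortest LFSR that generates a bit sequence; the paper's BCH decoder runs the same recurrence over GF(2^7) syndromes for n = 127:

```python
def berlekamp_massey_gf2(s):
    """Berlekamp-Massey over GF(2): return (C, L), where C is the
    feedback polynomial (c0 = 1) of the shortest LFSR generating the
    bit sequence s, and L is the LFSR length."""
    C, B = [1], [1]          # current and previous connection polynomials
    L, m = 0, 1              # current LFSR length, shift since last update
    for n in range(len(s)):
        # Discrepancy: does the current LFSR predict s[n]?
        d = s[n]
        for i in range(1, L + 1):
            d ^= C[i] & s[n - i]
        if d == 0:
            m += 1
        elif 2 * L <= n:     # length must grow: save C before updating
            T = C[:]
            C = C + [0] * (len(B) + m - len(C))
            for i, b in enumerate(B):
                C[i + m] ^= b
            L, B, m = n + 1 - L, T, 1
        else:                # correct C without growing L
            C = C + [0] * max(0, len(B) + m - len(C))
            for i, b in enumerate(B):
                C[i + m] ^= b
            m += 1
    return C, L

# The sequence satisfies s[n] = s[n-1] ^ s[n-2], so L = 2.
print(berlekamp_massey_gf2([1, 1, 0, 1, 1, 0, 1, 1]))  # -> ([1, 1, 1], 2)
```

In the BCH setting, the resulting polynomial is the error-locator polynomial, whose roots (found by Chien search) give the error positions.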
Coronavirus disease (COVID-19) has threatened human life, so it has become necessary to study this disease from many aspects. This study aims to identify the nature of the interdependence among Middle Eastern countries and their impact on one another by designating these countries as vertices of the proposed graph and measuring the distances between them using the ultrametric spanning tree. In this paper, a network of countries in the Middle East is described using the tools of graph theory.
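A spanning-tree computation of the kind described can be sketched with Kruskal's algorithm; the country names and edge weights below are illustrative distances, not the paper's COVID-19 data:

```python
def kruskal_mst(nodes, edges):
    """Kruskal's algorithm: build a minimum spanning tree over a
    weighted graph, using union-find with path halving."""
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    tree = []
    for w, a, b in sorted(edges):           # lightest edges first
        ra, rb = find(a), find(b)
        if ra != rb:                        # keep only cycle-free edges
            parent[ra] = rb
            tree.append((a, b, w))
    return tree

# Illustrative country graph (hypothetical weights, not the paper's data).
nodes = ["Iraq", "Iran", "Turkey", "Jordan"]
edges = [(3, "Iraq", "Iran"), (5, "Iraq", "Turkey"), (2, "Iraq", "Jordan"),
         (6, "Iran", "Turkey"), (7, "Turkey", "Jordan")]
print(kruskal_mst(nodes, edges))
# -> [('Iraq', 'Jordan', 2), ('Iraq', 'Iran', 3), ('Iraq', 'Turkey', 5)]
```

On a minimum spanning tree, the largest edge weight along the unique path between two vertices gives their minimax distance, which is the connection to the ultrametric distances the study measures.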