Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, meaning that recurrent strokes can be averted by controlling risk factors that are mainly behavioral and metabolic in nature. Previous works have shown promising results in predicting first-time strokes with machine learning approaches, which suggests that a recurrent stroke prediction model could help minimize the likelihood of recurrence. However, there is limited work on recurrent stroke prediction using machine learning methods. Hence, this work performs an empirical analysis of machine learning algorithms applied to recurrent stroke prediction models. The research investigates and compares the performance of machine learning algorithms on public clinical recurrent stroke datasets. In this study, Artificial Neural Network (ANN), Support Vector Machine (SVM), and Bayesian Rule List (BRL) models are built and their performance compared in the domain of recurrent stroke prediction. The empirical results show that the ANN scores the highest accuracy at 80.00%, followed by BRL at 75.91% and SVM at 60.45%.
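A minimal sketch of the kind of comparison the abstract describes, using scikit-learn. The dataset here is synthetic (the clinical dataset is not reproduced), and BRL is omitted because scikit-learn ships no Bayesian Rule List estimator; only the ANN and SVM comparison is illustrated.

```python
# Hedged sketch: compare an ANN (MLP) and an SVM by test-set accuracy.
# Synthetic data stands in for the clinical recurrent-stroke dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "SVM": SVC(kernel="rbf", random_state=0),
}
# Fit each model and record its accuracy on the held-out test split.
accuracies = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
              for name, m in models.items()}
for name, acc in accuracies.items():
    print(f"{name}: {acc:.2%}")
```

The reported 80.00% / 75.91% / 60.45% figures come from the paper's own dataset and tuning, so the numbers printed here will differ.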
Optimization of gas lift plays a substantial role in production and in maximizing the net present value of oil field projects. However, applying optimization techniques to gas lift projects is complex because many decision variables, objective functions, and constraints are involved in the gas lift optimization problem. In addition, many computational approaches, both traditional and modern, have been employed to optimize gas lift processes. This research aims to present the development of the optimization techniques applied to gas lift. Accordingly, the research classifies the applied optimization techniques and presents the limitations and range of applications of each one needed to reach an acceptable level of accuracy.
In this paper, some basic notions and facts in the b-modular space, similar to those in modular spaces, are given as a type of generalization; for example, the concepts of convergence, best approximation, uniform convexity, etc. Then, two results about the relation between semi-compactness and approximation are proved, and they are used to prove a theorem on the existence of a best approximation for a semi-compact subset of a b-modular space.
Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work aims to conduct a comparative study of various research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as selection of an appropriate clustering algorithm may yield positive results in data aggregation.
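An illustrative sketch of cluster-based aggregation as described above: nodes are grouped (here with k-means, one of many possible clustering algorithms), and each cluster head forwards a single aggregate reading instead of every raw reading. Node positions and sensor readings are synthetic placeholders, not from any surveyed scheme.

```python
# Hedged sketch: k-means clustering reduces per-round messages from one per
# node to one per cluster head, which forwards an aggregated (mean) value.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
positions = rng.uniform(0, 100, size=(60, 2))  # 60 nodes in a 100x100 field
readings = rng.normal(25.0, 2.0, size=60)      # e.g. temperature samples

k = 5  # number of clusters (one cluster head each)
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(positions)

# Each cluster head reports one aggregated value for its members.
aggregated = {c: readings[labels == c].mean() for c in range(k)}
print(f"messages reduced from {len(readings)} to {len(aggregated)}")
```

The energy saving comes from the message reduction: radio transmissions, the dominant energy cost, drop from 60 to 5 per round in this toy setup.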
The present study aims at examining quantitatively the morphometric characteristics of the Iziana Valley basin, located in the northern part of Iraq, particularly south of Erbil Governorate. This basin is considered one of the small sub-basins, where its valleys run over formations of the Tertiary and Quaternary ages, represented by the Bay Hassan formations and by the slope and mixed sediments, respectively. The area of the Iziana basin amounts to (36.39 km2), whereas its circularity ratio reaches (0.17), a low value indicating that the basin diverges from a circular toward a rectangular shape. The value of the elongation ratio of the basin reaches (0.38), while the terrain ratio…
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consume considerable network throughput and cause bottlenecks at the manager side. All these problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (off, on) of the client computers and which user is working on each. A file system watcher mechanism is used to detect any change in files. The results were presented…
This paper introduces a relationship between the independence of the polynomials associated with the links of a network and the Jacobian determinant of these polynomials. It also presents a way to simplify a given communication network through an algorithm that splits the network into subnets and reintegrates them into a network that is a general representation, or model, of the studied network. This model is likewise represented by a system of polynomial equations, and Groebner bases are used to reach a new simplified network equivalent to the given one, which may make studying the solvability of the network coding problem less expensive and much easier.
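A small sketch of the two algebraic tools the abstract names, using SymPy. The polynomials `f1` and `f2` are hypothetical placeholders for link polynomials of a toy network, not the paper's actual system: a nonzero Jacobian determinant witnesses their independence, and a Groebner basis gives an equivalent simplified system.

```python
# Hedged sketch: Jacobian-determinant independence check and Groebner-basis
# simplification for a hypothetical two-polynomial system.
import sympy as sp

x, y = sp.symbols("x y")
f1 = x**2 + y - 1   # hypothetical link polynomial
f2 = x - y**2       # hypothetical link polynomial

# A generically nonzero Jacobian determinant indicates the polynomials
# are algebraically independent.
J = sp.Matrix([f1, f2]).jacobian([x, y])
det = sp.simplify(J.det())

# The Groebner basis is an equivalent, simplified system of equations.
G = sp.groebner([f1, f2], x, y, order="lex")
print("Jacobian determinant:", det)
print("Groebner basis:", list(G.exprs))
```

Here the determinant is -4xy - 1, which vanishes only on a curve, so the two polynomials are independent almost everywhere; the lex-order basis eliminates x in favor of a univariate polynomial in y, mirroring the "simplified equivalent network" idea.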
Abstract:
The aim of this research is to describe the concept of risk, its types, and methods of measurement, and to clarify the impact of these risks on the expected cash flow statement and on the preparation of a target cash flow statement that takes these risks into consideration. Because the local economic environment is exposed to many risks, this statement will be predictive, which helps the economic unit make administrative decisions, especially decisions related to operational, investment, and financing activities. The research problem is therefore based on the fact that most local economic units prepare the statement of flows according to the actual basis and not according to the discretionary basis (budget…
Discrimination between groups is one of the common procedures because of its ability to analyze many practical phenomena, and several methods can be used for this purpose, such as linear and quadratic discriminant functions. Recently, neural networks have been used as a tool to distinguish between groups.
In this paper, simulation is used to compare neural networks and the classical method for classifying observations into the group they belong to, in the case where some variables do not follow the normal distribution. We use the proportion of misclassified observations to all observations as the criterion of comparison.
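A hedged sketch of the comparison described above: a classical linear discriminant function versus a small neural network, scored by the proportion of misclassified observations. The simulated non-normal (exponential) features are an assumption standing in for the paper's simulation design, which is not reproduced here.

```python
# Hedged sketch: LDA vs. a small neural network on non-normal data,
# compared by misclassification proportion on a held-out set.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Two groups whose features follow (shifted) exponential distributions,
# deliberately violating the normality assumption behind LDA.
X = np.vstack([rng.exponential(1.0, size=(n // 2, 3)),
               rng.exponential(1.0, size=(n // 2, 3)) + 1.0])
y = np.array([0] * (n // 2) + [1] * (n // 2))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

results = {}
for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("ANN", MLPClassifier(hidden_layer_sizes=(16,),
                                          max_iter=1000, random_state=0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    results[name] = float(np.mean(pred != y_te))  # misclassification proportion

print(results)
```

Lower is better for this criterion; averaging the proportion over many simulated replications, as simulation studies typically do, would give a more stable comparison than this single run.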
The digital revolution that emerged in the twentieth century brought about radical changes that touched all aspects of life, especially the economic field, taking three forms: Artificial Intelligence (AI), the Internet of Things, and Big Data. As for artificial intelligence, it was discovered in the mid-1950s, and its true birth is considered to be the conference organized in the United States of America by…
Psychological damage is one of the damages that can be compensated under the tort of negligence in English law, which contains an enumeration of civil wrongs on the basis of which liability can be determined, each of which aims to protect a specific interest (for example, defamation protects against damage to reputation, and nuisance protects rights attached to land), and the same is true for the other wrongs. Compensation for psychological damage resulting from negligence has raised problems in cases where the psychological injury is "pure", that is, not accompanied by a physical injury, which required subjecting such claims to special requirements by the…