In data mining, classification is a form of data analysis used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient: it is time-consuming to train and difficult to analyze due to its black-box nature.
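As a rough illustration of such a comparison (not the paper's actual experimental setup), here is a minimal sketch using scikit-learn; the file path, encoder, and hyperparameters are assumptions:

```python
# Minimal sketch: comparing a backpropagation neural network (MLP) with
# Naive Bayes on the Car Evaluation dataset. The CSV path, encoding, and
# hyperparameters are illustrative assumptions, not the paper's setup.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import CategoricalNB
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)  # UCI Car Evaluation file (assumed local)

X = OrdinalEncoder().fit_transform(df[cols[:-1]]).astype(int)  # all features are categorical
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

bnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
nb = CategoricalNB().fit(X_tr, y_tr)

print("BNN accuracy:", accuracy_score(y_te, bnn.predict(X_te)))
print("NB  accuracy:", accuracy_score(y_te, nb.predict(X_te)))
```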
In today's world, most businesses, regardless of size, believe that access to the Internet is imperative if they are going to compete effectively. Yet connecting a private computer (or a network) to the Internet can expose critical or confidential data to malicious attack from anywhere in the world, since unprotected connections to the Internet (or any network topology) leave the user's computer vulnerable to hacker attacks and other Internet threats. Therefore, to provide a high degree of protection to the network and its users, a firewall needs to be used.
A firewall provides a barrier between the user's computer and the Internet (i.e., it prevents unauthorized access) …
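As a toy illustration of that barrier idea (the rules, addresses, and ports below are hypothetical, not a real firewall configuration):

```python
# Toy illustration of a firewall's core idea: match each packet against an
# ordered rule set and drop anything not explicitly allowed. Rules, addresses,
# and ports here are hypothetical examples, not a production configuration.
from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    dst_port: int

# (predicate, action) pairs checked in order; default is to deny.
RULES = [
    (lambda p: p.dst_port == 80, "ALLOW"),            # permit web traffic
    (lambda p: p.src_ip.startswith("10."), "ALLOW"),  # permit the private LAN
]

def filter_packet(p: Packet) -> str:
    for predicate, action in RULES:
        if predicate(p):
            return action
    return "DENY"  # default-deny: unsolicited traffic is blocked

print(filter_packet(Packet("203.0.113.5", 22)))  # DENY
print(filter_packet(Packet("10.0.0.7", 4444)))   # ALLOW (LAN)
```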
In this paper, the human robotic leg, which can be represented mathematically by a single-input single-output (SISO) nonlinear differential model with one degree of freedom, is analyzed, and then a simple hybrid neural fuzzy controller is designed to improve the performance of this human robotic leg model. This controller consists of a SISO fuzzy proportional-derivative (FPD) controller with nine rules summed with a single-node neural integral-derivative (NID) controller with a nonlinear function. The Matlab simulation results for the nonlinear robotic leg model with the suggested controller showed the efficiency of this controller when compared with the results of the leg model controlled by PI+2D, PD+NID, and F…
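A minimal sketch of the fuzzy PD half of such a controller, assuming a 3x3 (nine-rule) base over error and error-rate with triangular membership functions and singleton consequents; the sets, rule table, and gains are illustrative, not the paper's tuned design:

```python
# Sketch of a SISO fuzzy PD controller with a nine-rule base. Membership
# functions, rule table, and gains are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

# Linguistic sets for error e and error-rate de: Negative, Zero, Positive.
SETS = {"N": (-2, -1, 0), "Z": (-1, 0, 1), "P": (0, 1, 2)}
# Rule table: consequent singleton for each (e-set, de-set) pair.
RULES = {("N","N"): -1.0, ("N","Z"): -0.7, ("N","P"): 0.0,
         ("Z","N"): -0.7, ("Z","Z"):  0.0, ("Z","P"): 0.7,
         ("P","N"):  0.0, ("P","Z"):  0.7, ("P","P"): 1.0}

def fpd(e, de, ke=1.0, kde=0.5, ku=1.0):
    """Fuzzy PD control signal: product inference, weighted-average defuzzification."""
    num = den = 0.0
    for (se, sde), u in RULES.items():
        w = tri(ke * e, *SETS[se]) * tri(kde * de, *SETS[sde])
        num += w * u
        den += w
    return ku * num / den if den else 0.0

print(fpd(e=0.4, de=-0.1))
```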
In-situ gelation is a process of gel formation at the site of application, in which a drug product formulation that exists as a liquid is transformed into a gel upon contact with body fluids. As a drug delivery agent, the in-situ gel has the advantage of providing sustained release of the drug. In-situ gelling liquid suppositories using poloxamer 188 (26-30% W/W) as a suppository base with 10% W/W naproxen were prepared; the gelation temperatures of these preparations were measured and were all above physiological temperature. Additives such as polyvinylpyrrolidone (PVP), hydroxypropyl methylcellulose (HPMC), sodium alginate, and sodium chloride were used in concentrations ranging from 0.25-1…
The parameters and system reliability of a stress-strength model are estimated in this paper for a system containing several parallel components whose strengths are subject to a common stress, in the case where the stress and strengths follow the Generalized Inverse Rayleigh distribution, using different Bayesian estimation methods. A Monte Carlo simulation is introduced to compare the proposed methods based on the mean squared error (MSE) criterion.
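A sketch of how Monte Carlo simulation can estimate the reliability R = P(max(X1, ..., Xk) > Y) of k parallel components under a common stress; the GIRD parametrization below is one form used in the literature and, like all parameter values, is an assumption here:

```python
# Monte Carlo sketch of stress-strength reliability for a parallel system.
# Assumed CDF: F(x) = 1 - (1 - exp(-(1/(sigma*x))^2))^alpha; swap in the
# paper's parametrization if it differs. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def rgird(size, alpha, sigma):
    """Sample GIRD via inverse-transform under the assumed CDF above."""
    u = rng.uniform(size=size)
    t = -np.log(1.0 - (1.0 - u) ** (1.0 / alpha))  # t = (1/(sigma*x))^2
    return 1.0 / (sigma * np.sqrt(t))

def reliability(k=3, n=200_000, a_x=2.0, a_y=1.0, sigma=1.0):
    X = rgird((n, k), a_x, sigma)        # strengths of k parallel components
    Y = rgird(n, a_y, sigma)             # common stress
    return np.mean(X.max(axis=1) > Y)    # system survives if any X_i > Y

print("Estimated R:", reliability())
```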
Permeability determination in carbonate reservoirs is a complex problem because such reservoirs tend to be tight and heterogeneous, and core samples are usually available for only a few wells. Predicting permeability with low cost and reliable accuracy is therefore an important issue, and permeability predictive models are very desirable.
This paper develops a permeability predictive model for one Iraqi carbonate reservoir from core and well log data using the principle of Hydraulic Flow Units (HFUs). An HFU is a function of the Flow Zone Indicator (FZI), which is a good parameter for determining HFUs.
Histogram analysis, probability analysis, and a log-log plot of the Reservoir Quality Index (RQI) …
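For reference, the standard HFU quantities computed from core data (the sample values below are hypothetical, not the paper's data):

```python
# Standard Hydraulic Flow Unit quantities from core measurements:
# RQI = 0.0314 * sqrt(k / phi)  (k in mD, phi as a fraction),
# phi_z = phi / (1 - phi),  FZI = RQI / phi_z.
import numpy as np

k = np.array([12.0, 150.0, 0.8])     # core permeability, mD (hypothetical)
phi = np.array([0.14, 0.22, 0.09])   # core porosity, fraction (hypothetical)

rqi = 0.0314 * np.sqrt(k / phi)      # Reservoir Quality Index, microns
phi_z = phi / (1.0 - phi)            # normalized (pore-to-grain) porosity
fzi = rqi / phi_z                    # Flow Zone Indicator, microns

for ki, f in zip(k, fzi):
    print(f"k = {ki:7.1f} mD -> FZI = {f:.3f}")
# Samples with similar FZI belong to the same HFU; on a log-log plot of RQI
# versus phi_z, each HFU falls on a unit-slope line whose intercept is FZI.
```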
OpenStreetMap (OSM) is the most prominent example of an online volunteered mapping application. Most of these platforms consist of open-source spatial data collected by non-expert volunteers using different data collection methods. The OSM project aims to provide a free digital map of the whole world. The heterogeneity in data collection methods makes the accuracy of OSM project databases unreliable, so they must be treated with caution in any engineering application. This study aims to assess the horizontal positional accuracy of three spatial data sources, the OSM road network database, a high-resolution Satellite Image (SI), and a high-resolution Aerial Photo (AP) of Baghdad city, with respect to an analogue formal road network dataset obtained …
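A minimal sketch of the kind of horizontal positional accuracy computation involved, using hypothetical checkpoint coordinates:

```python
# Horizontal positional accuracy check: RMSE of displacements between
# checkpoints from a test dataset (e.g., OSM) and the same points in a
# reference dataset. All coordinates below are hypothetical.
import numpy as np

# (E, N) coordinates in metres for matched checkpoints (hypothetical values)
test = np.array([[445010.2, 3685120.5], [445230.9, 3685410.1], [445555.0, 3685790.7]])
ref  = np.array([[445012.0, 3685118.9], [445228.5, 3685412.6], [445557.3, 3685789.0]])

d = test - ref
rmse_e = np.sqrt(np.mean(d[:, 0] ** 2))        # easting RMSE
rmse_n = np.sqrt(np.mean(d[:, 1] ** 2))        # northing RMSE
rmse_h = np.sqrt(rmse_e ** 2 + rmse_n ** 2)    # horizontal (radial) RMSE

print(f"RMSE_E = {rmse_e:.2f} m, RMSE_N = {rmse_n:.2f} m, RMSE_H = {rmse_h:.2f} m")
```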
In this research, we study the Non-Homogeneous Poisson Process, one of the most important statistical topics with a role in scientific development, as it relates to events that occur in reality and are modeled as Poisson processes, their occurrence being tied to time, whether time varies or is held fixed. Our research clarifies the Non-Homogeneous Poisson Process and uses one of its models, the exponentiated Weibull model with three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countries …
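A sketch of simulating such a process by thinning, assuming the intensity is proportional to an exponentiated-Weibull density with parameters (α, β, σ); the construction and parameter values are illustrative, not the paper's fitted model:

```python
# Non-Homogeneous Poisson Process sketch with an intensity built from an
# exponentiated-Weibull density: lambda(t) = theta * f(t; alpha, beta, sigma).
# This construction and all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def ew_pdf(t, alpha, beta, sigma):
    z = (t / sigma) ** beta
    return (alpha * beta / sigma) * (t / sigma) ** (beta - 1) \
           * np.exp(-z) * (1.0 - np.exp(-z)) ** (alpha - 1)

def simulate_nhpp(T, theta=50.0, alpha=2.0, beta=1.5, sigma=10.0):
    """Lewis-Shedler thinning: propose homogeneous events, keep with prob lambda/lmax."""
    grid = np.linspace(1e-6, T, 2000)
    lmax = theta * ew_pdf(grid, alpha, beta, sigma).max()  # numeric intensity bound
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lmax)
        if t > T:
            return np.array(events)
        if rng.uniform() < theta * ew_pdf(t, alpha, beta, sigma) / lmax:
            events.append(t)

times = simulate_nhpp(T=30.0)
print(f"{times.size} events; first few: {np.round(times[:5], 2)}")
```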
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often has variable accuracy due to different data collection methods; therefore, its accuracy may not meet the user requirements of every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad / Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed …
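The paper's tool was written in Visual Basic; purely for illustration, here is a Python sketch of the NSSDA-style statistic such a tool might report, with hypothetical discrepancies:

```python
# Sketch of an NSSDA-style 95% horizontal accuracy computed from test-vs-
# reference discrepancies. The discrepancy values below are hypothetical.
import math

# per-point (dE, dN) discrepancies in metres between GM and MB data (hypothetical)
deltas = [(1.8, -1.6), (-2.4, 2.5), (2.3, -1.7), (-1.1, 0.9)]

n = len(deltas)
rmse_e = math.sqrt(sum(de * de for de, _ in deltas) / n)
rmse_n = math.sqrt(sum(dn * dn for _, dn in deltas) / n)
rmse_r = math.sqrt(rmse_e ** 2 + rmse_n ** 2)

# NSSDA: when RMSE_E and RMSE_N are roughly equal, the 95% horizontal
# accuracy statistic is 1.7308 * RMSE_r.
print(f"RMSE_r = {rmse_r:.2f} m, 95% horizontal accuracy = {1.7308 * rmse_r:.2f} m")
```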
The cinematic story depends on many construction techniques that together constitute the story's technical features, and secondary events are considered one of these basic techniques, directly affected by the employment mechanisms inside the cinematic work. This subject led the two researchers to choose the title of the research: (Mechanisms of Employing the Secondary Event in Cinematographic Discourse). The research is divided into an introduction detailing the problem, the aim, and definitions of the terms used. The first section, on the act and the event in the cinematic story, addressed the relation between the act and the event and the nature of the simulation that tries to elevate the human act …