Agent-based modeling is now used extensively to analyze complex systems. Its growth stems from its ability to represent distinct levels of interaction within a detailed, complex environment. At the same time, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been released. In practice, however, most of them offer only a discrete representation of the environment and a single level of interaction; two or three levels are rarely supported. The key issue is that simulation platforms do not assist modelers working in these areas. The GAMA modeling and simulation platform was therefore designed to facilitate the development of spatialized, multi-paradigm, and multi-scale models. It enables modelers to build geographically (spatially) explicit, multi-level models, and it notably integrates data-mining tools and Geographic Information Systems (GIS) support that ease the modeling and analysis effort. This study examines how the platform addresses these concerns and what tools it provides to modelers out of the box. It also demonstrates the platform's capabilities regarding the tight integration of 3D visualization, GIS data management, and multi-level modeling, and presents several GAMA-based projects built for complex models.
In this research, fuzzy nonparametric methods based on several smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Company for Soft Drinks for the year 2016 (the period 1/1/2016-31/12/2016). A sample of 148 observations was used to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the goodness-of-fit (G.O.F.) criterion across the three techniques, the lowest value of this criterion was obtained for the K-Nearest Neighbor smoother with the Gaussian function.
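As a purely hypothetical illustration of the winning technique named above (the paper's fuzzy model, actual stock data, and G.O.F. computation are not reproduced here), a k-nearest-neighbor smoother with Gaussian kernel weights can be sketched as:

```python
import numpy as np

def knn_gaussian_smoother(x_train, y_train, x0, k=5, h=1.0):
    """Estimate y at point x0 from the k nearest training points,
    weighted by a Gaussian kernel with bandwidth h.
    Toy sketch only; not the authors' fuzzy nonparametric model."""
    d = np.abs(x_train - x0)
    idx = np.argsort(d)[:k]               # indices of the k nearest neighbors
    w = np.exp(-0.5 * (d[idx] / h) ** 2)  # Gaussian kernel weights
    return np.sum(w * y_train[idx]) / np.sum(w)

# toy data standing in for (traded value, stock price) pairs
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
est = knn_gaussian_smoother(x, y, x0=3.0, k=3, h=1.0)
```

The bandwidth `h` and neighbor count `k` play the role of the smoothing parameters that such comparisons tune per technique.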
Objectives: Bromelain is a potent proteolytic enzyme whose unique functionality makes it valuable for various therapeutic purposes. This study aimed to develop three novel bromelain-based formulations to be used as chemomechanical caries-removal agents. Methods: The novel agents were prepared using different concentrations of bromelain (10–40 wt. %), with and without 0.1–0.3 wt. % chloramine T or 0.5–1.5 wt. % chlorhexidine (CHX). Based on the enzymatic activity test, three formulations were selected: 30 % bromelain (F1), 30 % bromelain-0.1 % chloramine (F2), and 30 % bromelain-1.5 % CHX (F3). The assessments included molecular docking, Fourier-transform infrared spectroscopy (FTIR), viscosity, and pH measurements. The efficiency
Diagnosing heart disease has become a very important topic for researchers specializing in artificial intelligence, because intelligence is now involved in most diseases, especially after the Corona pandemic, which forced the world to turn to intelligent methods. The basic idea of this research is therefore to shed light on the diagnosis of heart disease by relying on deep learning with a pre-trained model (EfficientNet-b3), using the electrical signals of the electrocardiogram. The signal is resampled in order to feed it to the neural network with only trimming operations, because it is an electrical signal whose parameters cannot be changed. The dataset (China Physiological Signal Challenge, CSPSC2018) was ad
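The resampling step mentioned above can be sketched as follows. This is a minimal, hypothetical illustration using linear interpolation; the paper does not specify its exact resampling method, and the EfficientNet-b3 pipeline itself is not shown.

```python
import numpy as np

def resample_signal(sig, target_len):
    """Resample a 1-D ECG trace to target_len samples by linear
    interpolation, so it fits a fixed-size network input.
    Assumption: simple linear resampling, not the authors' method."""
    old = np.linspace(0.0, 1.0, num=len(sig))
    new = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(new, old, sig)

# stand-in for one ECG lead of 500 samples, resampled to 300
ecg = np.sin(np.linspace(0, 4 * np.pi, 500))
fixed = resample_signal(ecg, 300)
```

Resampling every record to one fixed length is what lets signals of different durations share a single network input shape.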
In regression testing, test case prioritization (TCP) is a technique to arrange the available test cases. TCP techniques can improve fault-detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is one family of TCP techniques that considers past execution data to prioritize test cases. The allocation of equal priority to several test cases is a common problem for most TCP techniques, but it has not been explored in history-based TCP techniques. To handle this problem in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
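The APFD metric referenced above has a standard closed form: for n test cases and m faults, APFD = 1 - (sum of the positions at which each fault is first detected)/(n·m) + 1/(2n). A small sketch (test names and fault sets are invented for illustration):

```python
def apfd(order, faults_detected):
    """Average Percentage of Faults Detected for one test-case ordering.

    order           -- list of test-case ids in prioritized order
    faults_detected -- dict mapping test id -> set of faults it reveals
    (Hypothetical names; this only illustrates the standard APFD formula.)
    """
    n = len(order)
    all_faults = set().union(*faults_detected.values())
    m = len(all_faults)
    first_pos = {}                         # fault -> position of first detection
    for pos, tc in enumerate(order, start=1):
        for f in faults_detected.get(tc, set()):
            first_pos.setdefault(f, pos)
    return 1 - sum(first_pos[f] for f in all_faults) / (n * m) + 1 / (2 * n)

# 3 tests, 2 faults: t2 reveals both, so running it first scores higher
detects = {"t1": {"f1"}, "t2": {"f1", "f2"}, "t3": set()}
good = apfd(["t2", "t1", "t3"], detects)   # both faults found at position 1
bad = apfd(["t3", "t1", "t2"], detects)    # faults found late in the run
```

A higher APFD means faults are exposed earlier in the prioritized order, which is why equal-priority ties (broken randomly) can make the measured APFD unstable.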
Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have also achieved great success in processing and compressing various images of different sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye
Products’ quality inspection is an important stage in every production route, in which the quality of the produced goods is estimated and compared with the desired specifications. Traditional inspection relies on manual methods that incur various costs and consume a great deal of time. In contrast, today’s inspection systems that use modern techniques such as computer vision are more accurate and efficient. However, the amount of work needed to build a computer vision system based on classic techniques is relatively large, because features must be selected and extracted manually from digital images, which also incurs labor costs for the system engineers. In this research, we pr
In recent years, there has been expanding development in the vehicular sector and in the number of vehicles moving on the roads in all parts of the country. Vehicle number-plate identification based on image processing is an active area of this work; the technique is used for security purposes such as tracking stolen cars and access control to restricted areas. The License Plate Recognition System (LPRS) uses a digital camera to capture vehicle plate numbers as input to the proposed recognition system. Basically, the developed system consists of three phases: vehicle license-plate localization, character segmentation, and character recognition. License Plate (LP) detection is performed using the Canny edge-detection algorithm
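Edge detection drives the plate-localization phase described above. As a simplified, hypothetical stand-in for the Canny detector (the full Canny pipeline adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding, omitted here), a Sobel gradient-magnitude edge map can be sketched without any computer-vision library:

```python
import numpy as np

def sobel_edges(img, thresh=0.5):
    """Boolean edge map from Sobel gradient magnitude.
    Simplified sketch only; not the paper's Canny-based detector."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                       # vertical-gradient kernel
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):       # skip the 1-pixel border
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    mag = np.hypot(gx, gy)          # gradient magnitude
    if mag.max() == 0:
        return np.zeros((h, w), dtype=bool)
    return mag > thresh * mag.max() # keep only strong edges

# toy grayscale image: dark background with a bright region on the right,
# standing in for the plate/background boundary
img = np.zeros((10, 10))
img[:, 5:] = 1.0
edges = sobel_edges(img)
```

The strong vertical transitions that such a map highlights are exactly the cues a plate-localization stage looks for, since license plates produce dense clusters of character edges.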
The evaluation of the quality of construction projects is a topic that has become necessary owing to the absence of quantitative standards for measuring control works and of quality-evaluation standards in construction projects. At present, evaluation depends on the experience of the workers, which leads to apparent differences in the results.
The idea of this research is to set standards for evaluating the quality of projects in a dedicated system based on a quantitative scale rather than qualitative description, and to prepare an expert system, “Crystal”, that applies this system so that engineers can evaluate the quality of their projects easily and more accurately.