A particle swarm optimization (PSO) algorithm combined with a neural-network-like self-tuning PID controller for a CSTR (continuous stirred-tank reactor) system is presented. The discrete-time PID control structure is based on a neural network, and the PID parameters are tuned with PSO, a simple and fast training algorithm. An advantage of the proposed method is that, because it uses PSO, no combined identification-and-decision structure is required. Simulation results demonstrate the effectiveness of the proposed adaptive PID neural control algorithm in terms of the minimum tracking error and the smooth control signal obtained for the nonlinear dynamic CSTR system.
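The idea of tuning PID gains with PSO can be sketched as follows. This is a minimal illustration only: the first-order plant, the gain range [0, 5], and the swarm settings are all hypothetical stand-ins, since the abstract does not give the actual CSTR model or PSO parameters.

```python
import random

def ise(kp, ki, kd, steps=200, dt=0.05):
    """Integral-squared tracking error of a discrete PID on a unit step."""
    y = integ = prev_e = cost = 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u)              # hypothetical first-order plant
        if not -1e6 < y < 1e6:          # penalize unstable gain sets
            return 1e12
        cost += e * e
    return cost

def pso_tune(n=20, iters=40, seed=1):
    """Search (kp, ki, kd) in [0, 5]^3 with a standard PSO velocity update."""
    rng = random.Random(seed)
    pos = [[rng.uniform(0.0, 5.0) for _ in range(3)] for _ in range(n)]
    vel = [[0.0, 0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [ise(*p) for p in pos]
    gi = min(range(n), key=pcost.__getitem__)
    gbest, gcost = pbest[gi][:], pcost[gi]
    for _ in range(iters):
        for i in range(n):
            for d in range(3):          # inertia + cognitive + social terms
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = max(0.0, pos[i][d] + vel[i][d])
            c = ise(*pos[i])
            if c < pcost[i]:
                pcost[i], pbest[i] = c, pos[i][:]
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost
```

On this toy plant, `pso_tune()` returns gains whose tracking cost is far below that of an untuned proportional controller (compare the returned cost with `ise(1.0, 0.0, 0.0)`).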
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support information security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is suggested. The study highlights the offline properties of handwritten signatures and aims to verify whether handwritten signatures are genuine or forged using computer-based machine learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group o…
In this research, the Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant working under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), recording effluent turbidity and head losses. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process.
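The kind of network described above can be sketched in a few lines. This is an illustration only, assuming synthetic data and an arbitrary network size: the five scaled inputs stand in for the pilot-plant variables (influent turbidity, bed depth, grain size, filtration rate, run time), and the target stands in for effluent turbidity; the paper's actual data and architecture are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 5))          # 5 scaled process inputs
y = (0.4 * X[:, 0] - 0.2 * X[:, 3] + 0.1)[:, None]   # hypothetical target

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

losses = []
for _ in range(800):
    h = np.tanh(X @ W1 + b1)                  # forward pass
    err = (h @ W2 + b2) - y
    losses.append(float(np.mean(err ** 2)))
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)       # backpropagation
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    lr = 0.3
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

After training, the mean-squared error falls well below the variance of the target, i.e. the network has learned more than the mean of the data.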
The current problem can be summarized as the development of a repeated-failure experience in comprehending study materials: students come to fear repeating failure in the future, and so resort to blind rote memorization of the study material, which is harmful because it leads to forgetting later. One aspect of the research problem is that many studies report contradictory results regarding learning styles, which imposes the necessity of finding results that lessen this contradiction. The importance of the research lies in the importance of the subject under study, in that the researcher (to her knowledge) did not find a thesis tackling the subject of distinguished students.
This paper presents nonlinear finite element modeling and analysis of steel fiber reinforced concrete (SFRC) deep beams with and without openings in the web, subjected to two-point loading. In this study, the beams were modeled using the ANSYS nonlinear finite element software. The percentage of steel fiber was varied from 0 to 1.0%. The influence of fiber content in the concrete deep beams was studied by measuring the deflection of the deep beams at mid-span, marking the cracking patterns, computing the failure load for each deep beam, and also studying the shearing and first principal stresses for deep beams with and without openings and with different steel fiber ratios. The above study indicates that the location of openings and…
Image compression is one of the data compression types applied to digital images in order to reduce their high cost of storage and/or transmission. Image compression algorithms may take advantage of the visual sensitivity and statistical properties of image data to deliver superior results in comparison with generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are first converted into a string and then encoded using a lossless entropy-coding algorithm known as arithmetic coding. More frequently occurring pixel values are coded in fewer bits than less frequently occurring ones.
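The block-then-encode pipeline above can be illustrated as follows. A full arithmetic coder is beyond a sketch, but since arithmetic coding approaches the Shannon bound, the ideal code length of a symbol with probability p is about -log2(p) bits; the tiny 8 x 8 "image" and 4 x 4 block size here are hypothetical stand-ins for the sizes named in the abstract.

```python
import math
from collections import Counter

def blocks(img, size):
    """Split a square 2-D list of pixels into size x size blocks, row-major."""
    n = len(img)
    return [[img[r][c] for r in range(i, i + size) for c in range(j, j + size)]
            for i in range(0, n, size) for j in range(0, n, size)]

img = [[0] * 8 for _ in range(8)]
img[0][0] = 255                      # one rare pixel value among many zeros

# Blocks are concatenated into one symbol stream ("string") before coding.
stream = [p for blk in blocks(img, 4) for p in blk]
freq = Counter(stream)
total = len(stream)
bits = {v: -math.log2(freq[v] / total) for v in freq}   # ideal code length
```

Here the frequent value 0 costs about 0.02 bits per occurrence while the rare value 255 costs 6 bits, so the estimated coded size is far below the raw 8 bits per pixel.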
Abstract
This research aims to overcome the problem of dimensionality by using nonlinear regression methods that reduce the root mean square error (RMSE), specifically the method of projection pursuit regression (PPR), one of the dimension-reduction methods that works to overcome the curse of dimensionality. PPR is a statistical technique concerned with finding the most important projections in multi-dimensional data; with each projection found, the data are reduced by linear combinations over the projection. The process is repeated, producing good projections until the best projections are obtained. The main idea of PPR is to model…
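The projection-search step described above can be sketched for a single ridge term. This is a simplified illustration on synthetic data: a crude random search over directions with a polynomial as the 1-D smoother, whereas real PPR alternates direction and smoother updates and then adds further ridge terms to the residuals.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))                 # 6-dimensional inputs
y = np.sin(X[:, 0] + 0.5 * X[:, 1])           # depends on one projection only

def ridge_rmse(w):
    """RMSE after smoothing y against the 1-D projection X @ w."""
    t = X @ (w / np.linalg.norm(w))
    fit = np.polyval(np.polyfit(t, y, 5), t)  # degree-5 poly as the smoother
    return float(np.sqrt(np.mean((y - fit) ** 2)))

best_w, best_rmse = None, np.inf
for _ in range(500):                          # crude random direction search
    w = rng.normal(size=6)
    r = ridge_rmse(w)
    if r < best_rmse:
        best_w, best_rmse = w / np.linalg.norm(w), r
```

The best projection found fits y substantially better than predicting its mean, which is exactly the reduction in RMSE that motivates the pursuit step.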
Objective: The aim of this study is to identify the impact of life events on the onset of depression and to describe the prevalence of life events among depressed patients.
Methodology: A retrospective case-control study was conducted in the Psychiatric Department of AL-Diwanyia Teaching Hospital on a non-probability (purposive) sample of (60) depressed patients and (60) healthy persons matched with them from the general population. The data were collected through semi-structured interviews using a questionnaire consisting of two parts: (1) section A, a cover letter, and section B, sociodemographic data consisting of 9 items; (2) a life-events questionnaire consisting of 51 items distributed across six dimensions, including family…