Polyaniline/multi-wall carbon nanotube (PANI/MWCNT) nanocomposite thin films were prepared by low-frequency plasma-jet polymerization on glass substrates with pre-deposited aluminum electrodes to form Al/PANI-MWCNT/Al surface-type capacitive humidity sensors. The gap between the electrodes was about 50 μm, and the MWCNT weight concentration was varied over 0, 1, 2, 3, and 4%. The diameter of the MWCNTs was in the range of 8-15 nm and their length 10-55 μm. The capacitance-humidity relationships of the sensors were investigated at humidity levels from 35 to 90% RH. The electrical measurements showed that the capacitance increased with increasing relative humidity and that the sensitivity of the sensor increased with the amount of additive (MWCNTs), while both the response time and the recovery time also increased with concentration. Changing the MWCNT concentration changes the energy gap as well as the initial capacitance. The capacitance increases linearly with relative humidity at an MWCNT concentration of 3%, suggesting the possibility of manufacturing a humidity sensor with good specifications at this concentration.
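The reported linear capacitance-humidity relationship corresponds to a model of the form C(RH) = C0 + s·RH, where s is the sensitivity and C0 the initial capacitance. The sketch below fits that model to a hypothetical calibration data set; the numbers are illustrative only and are not taken from the abstract.

```python
import numpy as np

# Hypothetical calibration data for a surface-type capacitive humidity
# sensor: relative humidity (%RH) vs. measured capacitance (pF).
# These values are illustrative, not measurements from the paper.
rh = np.array([35, 45, 55, 65, 75, 85, 90], dtype=float)
cap = np.array([110, 132, 155, 178, 200, 223, 234], dtype=float)

# Least-squares fit of the linear model C(RH) = C0 + s * RH,
# where s is the sensitivity (pF per %RH) and C0 the initial capacitance.
s, c0 = np.polyfit(rh, cap, 1)

print(f"sensitivity ~ {s:.2f} pF/%RH, initial capacitance ~ {c0:.1f} pF")
```

A positive fitted slope s reproduces the abstract's observation that capacitance grows with relative humidity; comparing s across MWCNT concentrations would quantify the reported sensitivity increase.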
Document clustering is the process of organizing a particular electronic corpus of documents into subgroups of similar text features. Formerly, a number of conventional algorithms were applied to perform document clustering. There are ongoing endeavors to enhance clustering performance by employing evolutionary algorithms, and these have become an emerging topic gaining more attention in recent years. The aim of this paper is to present an up-to-date and self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. Then it presents and analyzes the principal research works …
This study assessed the advantage of using earthworms in combination with punch waste and nutrients in remediating drill cuttings contaminated with hydrocarbons. Analyses were performed on days 0, 7, 14, 21, and 28 of the experiment. Two hydrocarbon concentrations (20,000 mg/kg and 40,000 mg/kg) were used for three groups of five, ten, and twenty earthworms. After 28 days, the total petroleum hydrocarbon (TPH) concentration of 20,000 mg/kg was reduced to 13,200 mg/kg, 9,800 mg/kg, and 6,300 mg/kg in the treatments with five, ten, and twenty earthworms, respectively. Likewise, the TPH concentration of 40,000 mg/kg was reduced to 22,000 mg/kg, 10,100 mg/kg, and 4,200 mg/kg in the treatments with the same numbers of earthworms, respectively. …
Fetal heart rate (FHR) signal processing based on Artificial Neural Networks (ANN), Fuzzy Logic (FL), and the frequency-domain Discrete Wavelet Transform (DWT) was analyzed in order to perform automatic analysis using personal computers. Cardiotocography (CTG) is a primary biophysical method of fetal monitoring. The assessment of printed CTG traces was based on visual analysis of patterns describing the variability of the fetal heart rate signal. Fetal heart rate data of pregnant women at 38 to 40 weeks of gestation were studied. The first stage of the system was to convert the cardiotocography (CTG) tracing into a digital series so that it could be analyzed, while in the second stage the FHR time series was …
This paper deals with the design and implementation of an ECG system. The proposed system offers a new approach to ECG signal manipulation, storage, and editing. It consists mainly of hardware circuits and the related software. The hardware includes the ECG signal capture circuits and the system interfaces. The software is written in Visual Basic to perform the task of identifying the ECG signal. The main advantage of the system is that it provides a reported ECG recording on a personal computer, so that it can be stored and processed at any time as required. The system was tested with different ECG signals, some abnormal and others normal, and the results show that it provides good quality of diagnosis.
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines in terms of the number and distribution of baselines, in order to improve the quality criteria of GNSS networks. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, using the sequential adjustment method to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, which …
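The two design criteria named above have standard definitions: A-optimality minimizes the trace of the coordinate covariance matrix (the sum of the variances), and E-optimality minimizes its largest eigenvalue (the worst-case variance direction). A minimal sketch, using a hypothetical 3x3 covariance matrix rather than any result from the study:

```python
import numpy as np

# Hypothetical covariance matrix Q of estimated network coordinates
# (illustrative values only, not from the study).
Q = np.array([[4.0, 0.5, 0.2],
              [0.5, 3.0, 0.1],
              [0.2, 0.1, 2.0]])

# A-optimality criterion: the trace of Q (sum of coordinate variances).
a_score = np.trace(Q)

# E-optimality criterion: the largest eigenvalue of Q
# (variance along the worst-determined direction).
e_score = np.max(np.linalg.eigvalsh(Q))

print(f"A-criterion (trace)      = {a_score:.2f}")
print(f"E-criterion (lambda_max) = {e_score:.2f}")
```

In an FOD search, candidate baseline configurations would each yield such a Q, and the configuration minimizing the chosen criterion would be preferred.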
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noise pixels and then replaces them with an optimum median value according to a criterion of maximizing a fitness function. Finally, the standard measures of peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the results …
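The detect-then-replace idea described above can be illustrated with the classical baseline it improves on: a switching median filter that flags extreme-valued pixels as noise candidates and replaces only those with a window median. This is a sketch of that plain baseline in Python, not of the crow-search-optimized variant the abstract proposes:

```python
import numpy as np

def detect_replace_median(img, win=3):
    """Plain switching median filter for salt-and-pepper noise.

    Pixels at the extreme values 0 or 255 are treated as noise
    candidates and replaced by the median of their local window;
    all other pixels are left untouched. (The OMF in the abstract
    instead selects the replacement via a fitness-maximizing search.)
    """
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    noisy = (img == 0) | (img == 255)          # detection step
    for r, c in zip(*np.nonzero(noisy)):
        window = padded[r:r + win, c:c + win]  # win x win neighborhood
        out[r, c] = np.median(window)          # replacement step
    return out

# Tiny demo: a flat gray patch with one salt and one pepper pixel.
img = np.full((5, 5), 120, dtype=np.uint8)
img[1, 1] = 255   # salt
img[3, 3] = 0     # pepper
clean = detect_replace_median(img)
print(clean[1, 1], clean[3, 3])   # both restored to 120
```

Replacing only detected pixels preserves uncorrupted detail, which is why switching variants outperform a blanket median filter at low noise densities.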
Speech is the essential way to interact between humans or between human and machine. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The estimation of the clean signal is performed by exploiting Laplacian speech and noise modeling based on the distribution of orthogonal transform (Discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Krawtchouk-Tchebichef transform …