Fetal heart rate (FHR) signal processing based on Artificial Neural Networks (ANN), Fuzzy Logic (FL), and the frequency-domain Discrete Wavelet Transform (DWT) was analyzed in order to perform automatic analysis on personal computers. Cardiotocography (CTG) is a primary biophysical method of fetal monitoring. The assessment of printed CTG traces is based on visual analysis of patterns describing the variability of the fetal heart rate signal. Fetal heart rate data of pregnant women between 38 and 40 weeks of gestation were studied. The first stage of the system converts the cardiotocography (CTG) tracing into a digital series so that it can be analyzed; in the second stage, the FHR time series is transformed using the Discrete Wavelet Transform (DWT) in order to obtain the system features. In the last stage, the approximation coefficients resulting from the DWT are fed to the Artificial Neural Network and to the Fuzzy Logic classifier, and the two results are compared to determine the better one for classifying the fetal heart rate.
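The DWT feature-extraction step can be sketched with the simplest wavelet, the Haar. This is a hedged illustration, not the paper's implementation: the wavelet family and decomposition level are not stated in the abstract, and the FHR samples below are invented values, not real CTG data.

```python
# One level of a Haar DWT: the approximation coefficients are the smoothed,
# half-length series that would serve as the classifier's feature vector.

def haar_dwt(signal):
    """One decomposition level: returns (approximation, detail) coefficients."""
    s = 2 ** -0.5  # orthonormal Haar scaling factor
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

fhr = [140, 142, 138, 141, 145, 143, 139, 140]  # beats per minute (illustrative)
approx, detail = haar_dwt(fhr)
```

Repeating the decomposition on `approx` yields coarser levels; in practice a wavelet library such as PyWavelets would replace this hand-rolled transform.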
CO2 is considered one of the unfavorable gases and a major cause of air pollution. It is possible to reduce this pollution by injecting the gas into oil reservoirs, providing good miscibility and increasing the oil recovery factor. The minimum miscibility pressure (MMP) was estimated by the Peng-Robinson equation of state (PR-EOS). The South Rumila-63 (SULIAY) reservoir is involved, for which miscible displacement by CO2 is achievable based on the standard criteria for successful EOR processes. A PVT report was available for the reservoir under study; it contains differential liberation (DL) and constant composition expansion (CCE) tests. PVTi, one of the (Eclipse V.2010) software packages, has been used to achieve the goal. Many trials have been done to ma
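As a hedged sketch of what a PR-EOS calculation involves, the block below solves the Peng-Robinson cubic for the compressibility factor Z of pure CO2. The temperature and pressure are illustrative assumptions; the actual study tunes the EOS to the reservoir fluid in PVTi, which this toy single-component calculation does not attempt.

```python
# Peng-Robinson EOS for pure CO2: solve the cubic in Z by Newton iteration.
R = 83.14                                # cm3·bar/(mol·K)
Tc, Pc, omega = 304.13, 73.77, 0.2239    # CO2 critical properties

def pr_z_factor(T, P):
    """Largest real root of Z^3 - (1-B)Z^2 + (A-3B^2-2B)Z - (AB-B^2-B^3) = 0."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1 + kappa * (1 - (T / Tc) ** 0.5)) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T) ** 2, b * P / (R * T)
    f = lambda Z: Z**3 - (1 - B)*Z**2 + (A - 3*B**2 - 2*B)*Z - (A*B - B**2 - B**3)
    df = lambda Z: 3*Z**2 - 2*(1 - B)*Z + (A - 3*B**2 - 2*B)
    Z = 1.0                              # Newton iteration from the ideal-gas root
    for _ in range(50):
        Z -= f(Z) / df(Z)
    return Z

Z = pr_z_factor(350.0, 100.0)            # supercritical CO2, illustrative conditions
```

An MMP estimate would wrap calculations like this in a mixing rule for the reservoir fluid composition and a miscibility criterion, which the PVTi package automates.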
This paper proposes and tests a computerized approach for constructing a 3D model of blood vessels from angiogram images. The approach is divided into two steps: image feature extraction and solid model formation. In the first step, image morphological operations and post-processing techniques are used to extract geometrical entities from the angiogram image. These entities are the middle curve and outer edges of the blood vessel, which are then passed to a computer-aided graphical system for the second phase of processing. The system has embedded programming capabilities and pre-programmed libraries for automating a sequence of events that are exploited to create a solid model of the blood vessel. The gradient of the middle c
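The morphological edge-extraction idea can be illustrated on a tiny binary mask standing in for a segmented vessel: the outer edge is the set of foreground pixels lost under one binary erosion. This is a generic sketch of morphological edge detection, not the paper's specific pipeline; the 5x5 mask is invented for demonstration.

```python
def erode(img):
    """Binary erosion with a 3x3 cross (4-connected) structuring element."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if img[y][x] and img[y-1][x] and img[y+1][x] and img[y][x-1] and img[y][x+1]:
                out[y][x] = 1
    return out

def outer_edge(img):
    """Edge = img AND NOT erode(img): the one-pixel-thick outer boundary."""
    er = erode(img)
    return [[img[y][x] & (1 - er[y][x]) for x in range(len(img[0]))]
            for y in range(len(img))]

vessel = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
edge = outer_edge(vessel)  # boundary ring of the 3x3 block
```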
Electrospun nanofiber membranes are employed in a variety of applications due to their unique features. The nanofibers' characteristics are affected by the polymer solution, and the solvent used to dissolve the polymer powder is critical in preparing the precursor solution. In this paper, Polyacrylonitrile (PAN)-based nanofibers were prepared at a concentration of 10 wt.% using various solvents (NMP, DMF, and DMSO). The surface morphology, porosity, and mechanical strength of the three prepared 10 wt.% PAN-based nanofiber membranes (PAN/NMP, PAN/DMF, and PAN/DMSO) were characterized using Scanning Electron Microscopy (SEM), the dry-wet weights method, and a Dynamic Mechanical Analyzer (DMA). Using DMF as a solvent resulted in a lon
Sensitive information in any multimedia must be encrypted before transmission. A dual chaotic algorithm is a good option for encrypting sensitive information, using different parameters and different initial conditions for two chaotic maps. A dual chaotic framework creates a complex chaotic trajectory to prevent the illegal use of information by eavesdroppers. The limited precision of a single chaotic map causes a degradation in the dynamical behavior of the communication system. To overcome this degradation issue, a novel form of dual chaos map algorithm is analyzed. To maintain the stability of the dynamical system, the Lyapunov Exponent (LE) is determined for the single and dual maps. In this paper, the LE of the single and dual maps
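The LE computation for a single map can be sketched with the logistic map, a standard choice in chaotic cryptography; the abstract does not name the maps used, so this is an assumed example. The exponent is estimated by averaging log|f'(x)| along the orbit; a positive value indicates chaos, a negative one a stable periodic regime.

```python
import math

def lyapunov_logistic(r, x0=0.4, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) by averaging
    log|f'(x)| = log|r*(1 - 2x)| along the orbit after a burn-in."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

le_chaotic = lyapunov_logistic(4.0)   # fully chaotic regime, LE -> ln 2
le_stable = lyapunov_logistic(3.2)    # period-2 regime, LE < 0
```

The dual-map analysis in the paper would apply the same averaging to the composed trajectory of two maps with different parameters and initial conditions.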
Estimation of the unknown parameters of the 2-D sinusoidal signal model is an important and difficult problem. Because it is difficult to estimate all the parameters of this type of model at the same time, we propose a sequential non-linear least squares method and a sequential robust M method, developed by applying the sequential approach of the estimator suggested by Prasad et al., to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components. The methods depend on the Downhill Simplex Algorithm to solve the non-linear equations for the non-linear parameters, which represent the frequencies, and then use the least squares formula to estimate
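A 1-D analogue of this sequential scheme can be sketched as follows: a minimal downhill simplex (Nelder-Mead) searches for the non-linear parameter (the frequency), and the amplitudes then follow from the closed-form least squares formula. The signal, the true frequency (1.3 rad/sample), and the starting simplex are all illustrative assumptions; the paper treats 2-D signals.

```python
from math import cos, sin

t = range(50)
y = [2.5 * cos(1.3 * k) + 1.0 * sin(1.3 * k) for k in t]  # noiseless test signal

def ls_amplitudes(w):
    """Closed-form least squares amplitudes (A, B) for y ~ A cos(wt) + B sin(wt)."""
    C = sum(cos(w * k) ** 2 for k in t)
    S = sum(sin(w * k) ** 2 for k in t)
    X = sum(cos(w * k) * sin(w * k) for k in t)
    yc = sum(yk * cos(w * k) for yk, k in zip(y, t))
    ys = sum(yk * sin(w * k) for yk, k in zip(y, t))
    det = C * S - X * X
    return (yc * S - ys * X) / det, (ys * C - yc * X) / det

def rss(w):
    """Residual sum of squares at frequency w with the best linear amplitudes."""
    A, B = ls_amplitudes(w)
    return sum((yk - A * cos(w * k) - B * sin(w * k)) ** 2 for yk, k in zip(y, t))

def downhill_simplex_1d(f, x0, x1, tol=1e-12, max_iter=300):
    """In 1-D the simplex is two points; contraction and shrink coincide."""
    s = sorted([x0, x1], key=f)
    for _ in range(max_iter):
        best, worst = s
        if abs(worst - best) < tol:
            break
        xr = best + (best - worst)              # reflect worst through best
        if f(xr) < f(best):
            xe = best + 2 * (best - worst)      # try expanding further
            s = sorted([best, xe if f(xe) < f(xr) else xr], key=f)
        elif f(xr) < f(worst):
            s = sorted([best, xr], key=f)
        else:
            s = sorted([best, best + 0.5 * (worst - best)], key=f)  # contract
    return s[0]

w_hat = downhill_simplex_1d(rss, 1.28, 1.32)   # start the simplex near the peak
A_hat, B_hat = ls_amplitudes(w_hat)
```

Because the RSS surface is multimodal in frequency, the starting simplex must bracket the peak of interest; the sequential approach repeats this estimate-and-subtract step for each component.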
Image databases are growing exponentially because of rapid developments in social networking and digital technologies. To search these databases, an efficient search technique is required; CBIR is considered one such technique. This paper presents a multistage CBIR to address computational cost issues while reasonably preserving accuracy. In the presented work, the first stage acts as a filter that passes images to the next stage based on SKTP, used here for the first time in the CBIR domain. In the second stage, LBP and Canny edge detectors are employed to extract texture and shape features from the query image and the images in the newly constructed database. The p
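The LBP texture descriptor used in the second stage can be sketched for a single pixel: each of the eight neighbours in a 3x3 patch contributes one bit, set when the neighbour is at least as bright as the centre. This is the standard 8-neighbour LBP definition, shown in isolation; the paper's full pipeline histograms these codes over the image.

```python
def lbp_code(patch):
    """8-bit LBP code of the centre pixel of a 3x3 patch: neighbours walked
    clockwise from the top-left each contribute a 1-bit if >= centre."""
    c = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (y, x) in enumerate(order):
        if patch[y][x] >= c:
            code |= 1 << bit
    return code

flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]   # uniform region -> all bits set
peak = [[1, 1, 1], [1, 9, 1], [1, 1, 1]]   # bright centre -> no bits set
```

A histogram of `lbp_code` values over all 3x3 windows gives the texture feature vector compared between the query and database images.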
This study proposes a biometric-based digital signature scheme for facial recognition. The scheme is designed and built to verify a person's identity during a registration process and retrieve their public and private keys stored in the database. The RSA algorithm is used as the asymmetric encryption method to encrypt hashes generated for digital documents, with the hash function SHA-256 used to generate the digital signatures. In this study, local binary pattern histograms (LBPH) were used for facial recognition. The facial recognition method was evaluated on ORL faces retrieved from the database of Cambridge University. From the analysis, the LBPH algorithm achieved 97.5% accuracy; real-time testing was done on thirty subj
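The sign-and-verify core (SHA-256 hash, RSA private-key operation) can be sketched with textbook RSA. This is a deliberately toy illustration: the primes are tiny and no padding scheme is applied, both of which a real deployment requires; the key sizes and padding used in the study are not stated in the abstract.

```python
import hashlib

# Toy RSA key (illustrative only -- real use needs large primes and padding)
p, q = 61, 53
n = p * q                    # modulus, 3233
e = 17                       # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

def sign(message: bytes) -> int:
    """SHA-256 the document, then apply the RSA private-key operation."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, sig: int) -> bool:
    """Recompute the hash and compare against the public-key operation."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

sig = sign(b"digital document")
```

In the proposed scheme the private key used by `sign` would be released only after the LBPH facial match succeeds.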
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large increase in the volume of textual data produced daily. Traditional approaches to calculating the degree of similarity between two texts, based on the words they share, do not perform well with short texts, because two similar texts may be written in different terms by employing synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method between texts is presented which combines knowledge-based and corpus-based semantic information to build a semantic network that repre
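The synonym problem the abstract describes can be made concrete with a minimal sketch: plain word-overlap cosine similarity versus the same measure after mapping words through a knowledge resource. The two-entry synonym table and the sentences are invented stand-ins for the lexical resource and corpus statistics the paper actually combines.

```python
from collections import Counter
from math import sqrt

# Tiny stand-in for a knowledge-based synonym resource (illustrative)
SYNONYMS = {"car": "automobile", "buy": "purchase"}

def normalize(text):
    """Map each token to a canonical synonym before comparison."""
    return [SYNONYMS.get(w, w) for w in text.lower().split()]

def cosine(a, b):
    """Cosine similarity between two bags of words."""
    va, vb = Counter(a), Counter(b)
    dot = sum(va[w] * vb[w] for w in va)
    na = sqrt(sum(v * v for v in va.values()))
    nb = sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

s1, s2 = "I buy a car", "I purchase an automobile"
raw = cosine(s1.lower().split(), s2.lower().split())  # shares only "i"
sem = cosine(normalize(s1), normalize(s2))            # synonyms now align
```

The semantically normalized score exceeds the raw overlap score, which is exactly the gap the paper's knowledge-plus-corpus semantic network aims to close for short texts.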