ArcHydro is a model for building hydrologic information systems that synthesize geospatial and temporal water-resources data in support of hydrologic modeling and analysis. Raster-based digital elevation models (DEMs) play an important role in distributed hydrologic modeling supported by geographic information systems (GIS), and DEM data have long been used to derive hydrological features that serve as inputs to various models. Elevation data are currently available from several major sources and at different spatial resolutions. Detailed delineation of drainage networks is the first step in many natural resource management studies. Compared with interpretation from aerial photographs or topographic maps, automated extraction of drainage networks from DEMs is efficient and has received considerable attention. This study extracts drainage networks from a DEM of the Lesser Zab River Basin. Composition parameters of the drainage network, including the number of streams and the stream lengths, are derived from the DEM, alongside the delineation of catchment areas in the basin. The results of this application can be used to create input files for many hydrologic models.
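Drainage extraction from a DEM rests on flow-direction and flow-accumulation grids: each cell drains to its steepest downslope neighbor, and cells whose accumulated upstream area exceeds a threshold form the stream network. A minimal sketch of the D8 step (the toy DEM and the pure-Python loops are illustrative assumptions, not the study's data or ArcHydro's implementation):

```python
import numpy as np

def d8_flow_accumulation(dem):
    """Compute D8 flow directions and flow accumulation for a small DEM.

    Each cell drains to its steepest downslope 8-neighbour; accumulation
    counts how many cells drain through each cell (itself included).
    """
    rows, cols = dem.shape
    # 8-neighbour offsets and their distances (diagonals are sqrt(2) away)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    dist = [np.sqrt(2), 1, np.sqrt(2), 1, 1, np.sqrt(2), 1, np.sqrt(2)]

    receiver = {}  # cell -> the downstream cell it drains to
    for r in range(rows):
        for c in range(cols):
            best, best_slope = None, 0.0
            for (dr, dc), d in zip(offsets, dist):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    slope = (dem[r, c] - dem[rr, cc]) / d
                    if slope > best_slope:
                        best, best_slope = (rr, cc), slope
            if best is not None:
                receiver[(r, c)] = best

    # Process cells from highest to lowest so upstream counts are final
    acc = np.ones((rows, cols), dtype=int)
    order = sorted(((dem[r, c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for _, r, c in order:
        if (r, c) in receiver:
            rr, cc = receiver[(r, c)]
            acc[rr, cc] += acc[r, c]
    return acc

# Toy 3x3 DEM sloping toward the lower-right outlet
dem = np.array([[9, 8, 7],
                [8, 6, 5],
                [7, 5, 3]], dtype=float)
acc = d8_flow_accumulation(dem)
print(acc)  # the outlet cell accumulates all 9 cells
```

Thresholding `acc` (for example, `acc >= 3`) yields a stream mask, from which stream counts and lengths can be tallied.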
The adsorption isotherms and kinetic uptakes of carbon dioxide (CO2) on fabricated electrospun nonwoven activated carbon nanofiber sheets were investigated at two temperatures, 308 K and 343 K, over a pressure range of 1 to 7 bar. The activated carbon nanofibers were fabricated from a polyacrylonitrile (PAN) polymer precursor via the electrospinning technique, followed by thermal treatment to obtain the carbonaceous nanofibers. The CO2 adsorption isotherm data were fitted to several models, including Langmuir, Freundlich, and Temkin. Based on the correlation coefficients, the Langmuir isotherm model gave the best fit to the experimental CO2 adsorption isotherms. Raising the equ
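Fitting an isotherm model and ranking it by correlation coefficient, as described above, can be sketched with nonlinear least squares. The uptake values below are synthetic stand-ins generated from a known Langmuir curve, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, q_max, b):
    """Langmuir isotherm: q = q_max * b * p / (1 + b * p)."""
    return q_max * b * p / (1.0 + b * p)

# Synthetic uptake data over the 1-7 bar range used in the study
# (illustrative values, not the paper's data)
p = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # pressure, bar
q = langmuir(p, 3.5, 0.6) + np.random.default_rng(0).normal(0, 0.02, p.size)

(q_max, b), _ = curve_fit(langmuir, p, q, p0=(1.0, 0.1))

# Coefficient of determination, the criterion used to rank isotherm models
residuals = q - langmuir(p, q_max, b)
r2 = 1 - np.sum(residuals**2) / np.sum((q - q.mean())**2)
print(f"q_max = {q_max:.2f}, b = {b:.2f} 1/bar, R^2 = {r2:.4f}")
```

The same `curve_fit` call, with the model function swapped for Freundlich or Temkin expressions, yields the competing fits to compare by R².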
With the widespread use of digital images, the rapid evolution of computer science, and in particular the sharing of images on social networks, securing these images and protecting them against attackers has become a priority, and many techniques have been proposed to achieve this goal. In this paper we propose a new chaotic method that enhances AES (Advanced Encryption Standard) by eliminating the MixColumns transformation to reduce time consumption, and that uses palmprint biometrics together with the Lorenz chaotic system to strengthen authentication and image security; the chaotic system adds sensitivity to the encryption and authentication process.
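The sensitivity that a Lorenz system contributes can be illustrated with a keystream generator: the initial state acts as the key, and a tiny change in it produces a completely different stream. This is a simplified XOR stream-cipher sketch, not the paper's modified-AES scheme; all parameters and key values are illustrative:

```python
import numpy as np

def lorenz_keystream(n_bytes, x=0.1, y=0.0, z=0.0, dt=0.01,
                     sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generate a pseudo-random byte stream from the Lorenz system.

    The initial condition (x, y, z) acts as the secret key; small changes
    to it yield a different stream (key sensitivity).
    """
    out = np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        # One Euler step of the Lorenz equations
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        # Map the continuous state to a byte
        out[i] = int(abs(x) * 1e6) % 256
    return out

# XOR the keystream with pixel bytes: encryption and decryption are the
# same operation, so the ciphertext round-trips back to the plaintext.
pixels = np.arange(16, dtype=np.uint8)
ks = lorenz_keystream(16, x=0.1)
cipher = pixels ^ ks
assert np.array_equal(cipher ^ lorenz_keystream(16, x=0.1), pixels)
# A slightly perturbed key fails to decrypt
assert not np.array_equal(cipher ^ lorenz_keystream(16, x=0.10001), pixels)
```

In the paper's design the chaotic stream augments a reduced AES rather than replacing it; the sketch only demonstrates the key-sensitivity property.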
Data security is an important component of data communication and transmission systems. Its main role is to keep sensitive information safe and intact from the sender to the receiver. The proposed system aims to secure text messages through two security principles: encryption and steganography. The system introduces a novel encryption method based on graph-theory properties; it forms a graph from a password, generates an encryption key as the weight matrix of that graph, and employs the Least Significant Bit (LSB) method to hide the encrypted message in the green channel of a color image. Practical measures of perceptibility, capacity, and robustness were calculated using similarity measures such as PSNR, MSE, and
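The LSB step in the green channel can be sketched as follows. The graph-based key generation is specific to the paper and omitted here; the cover image and payload are illustrative:

```python
import numpy as np

def embed_lsb_green(image, payload):
    """Hide payload bytes in the least significant bit of the green channel."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat_green = image[:, :, 1].reshape(-1).copy()
    if bits.size > flat_green.size:
        raise ValueError("payload too large for this cover image")
    # Clear each carrier pixel's LSB, then set it to the payload bit
    flat_green[:bits.size] = (flat_green[:bits.size] & 0xFE) | bits
    stego = image.copy()
    stego[:, :, 1] = flat_green.reshape(image.shape[:2])
    return stego

def extract_lsb_green(image, n_bytes):
    """Recover n_bytes of payload from the green-channel LSBs."""
    bits = image[:, :, 1].reshape(-1)[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Round-trip on a random RGB cover image
rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
secret = b"encrypted message"
stego = embed_lsb_green(cover, secret)
assert extract_lsb_green(stego, len(secret)) == secret

# Only green LSBs change, so per-pixel distortion is at most 1 and
# PSNR (one of the perceptibility measures above) stays very high
mse = np.mean((stego.astype(float) - cover.astype(float)) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)
print(f"PSNR = {psnr:.1f} dB")
```

In the proposed system the payload would be the graph-encrypted message rather than plaintext; the embedding mechanics are the same.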
This research proposes a new method for estimating the copula density function using wavelet analysis as a nonparametric approach, in order to obtain more accurate results free from the boundary-effects problem from which nonparametric estimation methods suffer. The wavelet method handles boundary effects automatically, since it does not depend on whether the time series is stationary or non-stationary. To estimate the copula density function, simulation was used to generate the
Several problems must be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on wavelets and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding-transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands; 2) DCT to de-correlate the data; 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values; and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t
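The lossy core of the pipeline, DCT de-correlation followed by scalar quantization (steps 2-3), and the PSNR measure can be sketched on a single 8x8 block. The block data and quantization step are illustrative; the wavelet split and LZW stage are omitted:

```python
import numpy as np
from scipy.fft import dctn, idctn

def psnr(original, reconstructed):
    """Peak signal-to-noise ratio for 8-bit images, in dB."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

# One 8x8 block: forward DCT, scalar quantization, inverse DCT
rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(float)
step = 16.0                              # scalar quantization step
coeffs = dctn(block, norm='ortho')       # de-correlate the data
quantized = np.round(coeffs / step)      # this is where information is lost
reconstructed = idctn(quantized * step, norm='ortho')
quality = psnr(block, reconstructed)
print(f"PSNR = {quality:.1f} dB")

# Compression ratio: bits before encoding over bits after encoding
original_bits, compressed_bits = 8 * 64, 300   # illustrative bit counts
cr = original_bits / compressed_bits
print(f"CR = {cr:.2f}")
```

A larger `step` raises CR at the cost of PSNR; the quantized integer coefficients are what the LZW stage would encode.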
Medical image segmentation has been one of the most actively studied fields over the past few decades: with the development of modern imaging modalities such as magnetic resonance imaging (MRI) and computed tomography (CT), physicians and technicians now have to process medical images that are growing in both number and size. Efficient and accurate computational segmentation algorithms are therefore necessary to extract the desired information from these large data sets. Moreover, sophisticated segmentation algorithms can help physicians better delineate the anatomical structures present in the input images, enhance the accuracy of medical diagnosis, and facilitate the best treatment planning. Many of the proposed algorithms could perform w
The aim of this paper is to determine the feasibility of using fluorometric methods as an indicator of milk quality and of contamination with E. coli bacteria, and to select a suitable wavelength for laser-induced autofluorescence. Three groups of milk samples were used in this study: fresh pasteurized milk samples, milk samples to which different concentrations of E. coli bacteria were added artificially, and milk samples kept in a refrigerator for 3-5 days. Thirteen excitation wavelengths were used to obtain the emission spectra of all milk samples using a spectrofluorometer. The results showed that the emission spectra at a 275 nm excitation wavelength gave good differentiation between the three groups.
This research reviews the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on re-scaling the smoothing parameter; the smoothing parameter plays a large and important role in kernel estimation and determines the appropriate amount of smoothing.
The importance of this method is shown through the application of these concepts to real data: international exchange rates of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated the superiority of the nonparametric estimator with a Gaussian kernel over the other nonparametric and parametric regression estimators.
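The baseline behind the canonical-kernel rescaling discussed above is the Nadaraya-Watson estimator with a Gaussian kernel, which can be sketched as follows. The data and bandwidth are illustrative, and the canonical rescaling itself is not implemented:

```python
import numpy as np

def nw_regression(x_train, y_train, x_eval, h):
    """Nadaraya-Watson estimator with a Gaussian kernel and bandwidth h.

    m(x) = sum_i K((x - x_i) / h) y_i  /  sum_i K((x - x_i) / h)
    """
    u = (x_eval[:, None] - x_train[None, :]) / h
    weights = np.exp(-0.5 * u ** 2)   # Gaussian kernel (constants cancel)
    return (weights @ y_train) / weights.sum(axis=1)

# Smooth a noisy nonlinear signal; x, y, and h are illustrative
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.3, x.size)
x_grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
m_hat = nw_regression(x, y, x_grid, h=0.3)

# The estimate should track sin(x) far more closely than the noisy data
max_err = np.max(np.abs(m_hat - np.sin(x_grid)))
print(f"max error on interior grid = {max_err:.3f}")
```

Canonical rescaling adjusts `h` by a kernel-specific constant so bandwidths are comparable across kernel choices; only the bandwidth value changes, not the estimator above.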
In this paper, the reliability and maintenance scheduling of some medical devices were estimated from a single variable, time (failure times), on the assumption that the time variable for all devices follows the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated by the ordinary least squares (OLS) method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate this optimal time. The first method builds the maintenance schedule from information on the cost of maintenance and the cost of stopping work and acc
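Estimating Weibull parameters by OLS, as described above, is commonly done by regressing the linearized CDF (median-rank regression). A minimal sketch, with illustrative failure times rather than the study's data:

```python
import numpy as np

def weibull_ols(failure_times):
    """Estimate Weibull shape (beta) and scale (eta) by OLS on the
    linearized CDF:  ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta).

    Median ranks use Bernard's approximation F_i = (i - 0.3) / (n + 0.4).
    """
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = t.size
    ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    x = np.log(t)
    y = np.log(-np.log(1.0 - ranks))
    slope, intercept = np.polyfit(x, y, 1)   # OLS fit of y on x
    beta = slope
    eta = np.exp(-intercept / beta)
    return beta, eta

# Illustrative failure times in hours (not data from the study)
times = [410, 530, 640, 720, 810, 930, 1040, 1180]
beta, eta = weibull_ols(times)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f} h")
# beta > 1 indicates wear-out failures, the regime in which scheduling
# preventive maintenance ahead of failure pays off
```

The fitted (beta, eta) then feed the cost model: the optimal preventive-maintenance interval minimizes expected cost per unit time, trading the cost of planned maintenance against the cost of unplanned stoppage.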
The application of ultrafiltration (UF) and nanofiltration (NF) processes to the treatment of raw produced water is investigated in the present study. Experiments with both processes were performed in a laboratory unit operated in a cross-flow pattern. Various types of hollow-fiber membranes were used, including a poly(vinyl chloride) (PVC) UF membrane, two different polyethersulfone (PES) NF membranes, and a polyphenylsulfone (PPSU) NF membrane. The turbidity removal of the treated water was found to be higher than 95% using the UF and NF membranes. The chemical oxygen demand (COD, 160 mg/l) and oil content (26.8 mg/l) found after treatment were within the allowable limits set
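Removal efficiency across a membrane stage is the usual figure of merit behind percentages such as the turbidity removal quoted above. A trivial sketch; the feed concentrations are assumed for illustration only:

```python
def removal_efficiency(feed, permeate):
    """Percent removal of a contaminant across a membrane stage:
    100 * (C_feed - C_permeate) / C_feed."""
    return 100.0 * (feed - permeate) / feed

# Permeate values are the treated-water figures quoted in the abstract
# (COD 160 mg/l, oil 26.8 mg/l); the feed values are assumptions.
print(f"COD removal: {removal_efficiency(1200.0, 160.0):.1f} %")
print(f"Oil removal: {removal_efficiency(300.0, 26.8):.1f} %")
```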