This paper describes the use of a microcomputer as a laboratory instrument system. The system focuses on the measurement of three weather variables: temperature, wind speed, and wind direction. The instrument is a type of data acquisition system; this paper deals with the design and implementation of a data acquisition system based on a personal computer (Pentium) using the Industry Standard Architecture (ISA) bus. The design involves mainly a hardware implementation, together with the software programs used for testing, measurement, and control. The system can display the required information that is transferred and processed from the external field. Visual Basic, together with the Microsoft Foundation Classes (MFC), is the fundamental tool for Windows programming here; it was used to build a Man-Machine Interface (MMI) for processing and monitoring the weather data acquired from the environment.
Spatial data analysis is performed in order to remove skewness, a measure of the asymmetry of the probability distribution. It also improves the normality, a key statistical concept derived from the bell-shaped normal distribution, of properties such as porosity, permeability, and saturation, which are visualized using histograms. Three steps of spatial analysis are involved here: exploratory data analysis, variogram analysis, and finally distributing the properties using geostatistical algorithms. The Mishrif Formation (unit MB1) in the Nasiriya Oil Field was chosen to analyze and model the data for the first eight wells. The field is an anticline structure with northwest-south
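The skewness-removal step described in this abstract can be sketched as follows. The sample below is a synthetic lognormal permeability set used purely for illustration (the actual Mishrif well data are not reproduced here); a log transform is a common choice for right-skewed petrophysical properties:

```python
import numpy as np

def skewness(x):
    """Fisher-Pearson moment coefficient of skewness."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3

# Hypothetical right-skewed permeability sample (mD); illustration only.
rng = np.random.default_rng(0)
perm = rng.lognormal(mean=2.0, sigma=1.0, size=5000)

print(f"skewness before transform: {skewness(perm):.2f}")
print(f"skewness after log transform: {skewness(np.log(perm)):.2f}")
```

A strongly positive skewness before the transform and a near-zero value after it is the behavior the histogram inspection in the abstract is checking for.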
In this paper, the magnetohydrodynamic (MHD) flow of a Williamson fluid with varying temperature and concentration in an inclined channel with variable viscosity has been examined. A perturbation technique in terms of the Weissenberg number has been used to obtain explicit forms for the velocity field. The effects of the physical parameters, namely the Darcy parameter, Reynolds number, Peclet number, and magnetic parameter, are discussed for different values, as shown in the plots.
This paper presents the Taguchi approach for optimizing the hardness of a (Cu-Al-Ni) shape memory alloy. The influence of the powder metallurgy parameters on hardness has been investigated, with the Taguchi technique and ANOVA used for the analysis. Nine experimental runs based on Taguchi's L9 orthogonal array (OA) were performed for two parameters (pressure and sintering temperature) at three levels each: (300, 500, and 700) MPa and (700, 800, and 900) °C, respectively. Main effects, the signal-to-noise (S/N) ratio, and analysis of variance (ANOVA) were studied to investigate the micro-hardness characteristics of the shape memory alloy. After applying the method, the results of the study show the hei
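Since hardness is a larger-the-better response in the Taguchi framework, the S/N ratio used in this kind of analysis is typically computed as S/N = -10·log10(mean(1/y²)). A minimal sketch, using hypothetical micro-hardness replicates for three of the nine L9 runs (the actual measured values are not reproduced in this abstract):

```python
import numpy as np

def sn_larger_better(y):
    """Taguchi larger-the-better signal-to-noise ratio in dB."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicate hardness readings (HV) per run; illustration only.
runs = {
    "300 MPa / 700 C": [182.0, 185.5, 180.1],
    "500 MPa / 800 C": [214.3, 210.8, 216.0],
    "700 MPa / 900 C": [254.6, 251.2, 249.9],
}
for run, reps in runs.items():
    print(f"{run}: S/N = {sn_larger_better(reps):.2f} dB")
```

The run with the highest S/N ratio indicates the parameter combination that maximizes hardness while being least sensitive to noise.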
Most studies indicate that the values of atmospheric variables have changed from their general rates due to pollution, global warming, etc. Hence, this research examines the changes in direct solar radiation values over a whole century, i.e. from 1900 to 2000, depending on data registered for four cities (Mosul, Baghdad, Rutba, and Basra). Moreover, attempts were made to correlate the direct solar radiation with the temperature values recorded over that period. The results showed a decreasing pattern of radiation quantities over time throughout the study period, where a direct-radiation value of 5550 w/m2 was recorded over the city of Baghdad in the year 1900, but this ratio decreased cle
In recent years, the performance of Spatial Data Infrastructures for governments and companies is a task that has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration between different free sources is not being achieved effectively. The adoption of this task can be considered the main advantage of this research. This article addresses the research question of how the
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat
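The saturation dependence of kr described above is commonly parameterized with Corey-type power-law curves. A minimal sketch follows; all endpoint saturations, endpoint permeabilities, and Corey exponents are hypothetical illustration values, not measurements from this study:

```python
import numpy as np

def corey_kr(sw, swc=0.2, sor=0.25, krw_max=0.3, kro_max=0.8, nw=3.0, no=2.0):
    """Corey-type water/oil relative permeability curves.

    swc: connate water saturation, sor: residual oil saturation,
    nw/no: Corey exponents. All defaults are hypothetical.
    """
    swn = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)  # normalized saturation
    krw = krw_max * swn ** nw            # water curve rises with Sw
    kro = kro_max * (1.0 - swn) ** no    # oil curve falls with Sw
    return krw, kro

sw = np.linspace(0.2, 0.75, 6)
krw, kro = corey_kr(sw)
for s, w, o in zip(sw, krw, kro):
    print(f"Sw={s:.2f}  krw={w:.3f}  kro={o:.3f}")
```

At connate water saturation krw is zero and kro sits at its endpoint, while at residual oil saturation the roles reverse, matching the saturation-history behavior the abstract discusses.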
This research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and contaminated data cases.
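The Downhill Simplex (Nelder-Mead) approach used here amounts to minimizing the negative log-likelihood without gradients. Since the four-parameter density is not given in the abstract, the sketch below illustrates the same idea on a two-parameter Weibull fitted to synthetic data; the distribution, sample, and starting point are all stand-in assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def weibull_nll(params, data):
    """Negative log-likelihood of a two-parameter Weibull(shape k, scale lam)."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    z = data / lam
    return -np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z ** k)

rng = np.random.default_rng(1)
data = rng.weibull(1.5, size=2000) * 2.0  # synthetic sample: shape 1.5, scale 2.0

# Nelder-Mead is scipy's implementation of the Downhill Simplex algorithm.
res = minimize(weibull_nll, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
k_hat, lam_hat = res.x
print(f"estimated shape={k_hat:.2f}, scale={lam_hat:.2f}")
```

Because the simplex search needs only function evaluations, it tolerates likelihood surfaces distorted by contaminated observations better than gradient-based maximization, which is consistent with the comparison reported above.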