A load-shedding controller suitable for small to medium loads is designed and implemented based on preprogrammed priorities and the power consumption of individual loads. The main controller decides whether a particular load can be switched ON according to the amount of available power generation, the load consumption, and the load priorities. When the maximum allowed power consumption is reached and the user wants to deliver power to an additional load, the controller decides whether this load should be denied power if its priority is low. Otherwise, it can be granted power if its priority is high, in which case lower-priority loads are automatically switched OFF so as not to overload the power generation. The main idea of the proposed LS controller is to minimize the amount of isolated load without overloading the power system. In this paper, three versions of the load-shedding controller were implemented using the Altera DE2-115 FPGA, with 32, 64, and 128 loads, respectively.
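The decision logic described above can be sketched as a minimal software model of the controller. The dictionary layout, function name, and shedding order are illustrative assumptions, not the paper's FPGA implementation:

```python
def request_load(loads, available_power, load_id):
    """Decide whether load `load_id` may be switched ON.

    loads: dict id -> {"power": watts, "priority": int (higher = more
                       important), "on": bool}   (assumed layout)
    Returns the list of load ids shed to make room, or None if denied.
    """
    req = loads[load_id]
    used = sum(l["power"] for l in loads.values() if l["on"])
    if used + req["power"] <= available_power:
        req["on"] = True          # enough headroom: grant immediately
        return []
    # Not enough headroom: try shedding strictly lower-priority loads,
    # least important (then largest) first, to minimise the isolated load.
    candidates = sorted(
        (i for i, l in loads.items()
         if l["on"] and l["priority"] < req["priority"]),
        key=lambda i: (loads[i]["priority"], -loads[i]["power"]))
    shed = []
    for c in candidates:
        shed.append(c)
        used -= loads[c]["power"]
        if used + req["power"] <= available_power:
            for s in shed:
                loads[s]["on"] = False
            req["on"] = True
            return shed
    return None  # even shedding all lower-priority loads is not enough
```

A high-priority request thus displaces lower-priority loads only when the generation limit would otherwise be exceeded, mirroring the policy described above.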
This research aims to review the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on rescaling the smoothing parameter; this parameter plays a large and important role in kernel estimation and gives the sound amount of smoothing.
We have shown the importance of this method by applying these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated the preference of the nonparametric estimator with the Gaussian kernel over the other nonparametric and parametric regression estima
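A minimal sketch of the estimator under discussion: Nadaraya-Watson regression with a Gaussian kernel, where the bandwidth can be expressed through the kernel's canonical bandwidth delta0 so that different kernels apply a comparable amount of smoothing. The function name is illustrative, not the paper's notation:

```python
import numpy as np

def nw_gauss(x, y, grid, h):
    """Nadaraya-Watson regression estimate on `grid` with a Gaussian
    kernel and bandwidth h (local weighted average of the responses)."""
    u = (grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u ** 2)            # unnormalised Gaussian weights
    return (w @ y) / w.sum(axis=1)

# Canonical rescaling: write h = delta0 * h0, where delta0 is the
# kernel's canonical bandwidth; for the Gaussian kernel
# delta0 = (R(K) / mu2(K)^2)^(1/5) = (1 / (2*sqrt(pi)))^(1/5) ~ 0.776.
DELTA0_GAUSS = (1.0 / (2.0 * np.sqrt(np.pi))) ** 0.2
```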
In this paper, the reliability and maintenance schedule of some medical devices were estimated using one variable, the time variable (failure times), on the assumption that the time variable for all devices follows the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated by the ordinary least squares (OLS) method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate the optimal time of preventive maintenance. The first method depends on the maintenance schedule by relying on information on the cost of maintenance and the cost of stopping work and acc
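The OLS estimation of the Weibull parameters, and one standard way of turning cost information into an optimal preventive-maintenance age, can be sketched as follows. The median-rank plotting positions and the age-replacement cost model are assumed details; the paper's exact formulation may differ:

```python
import numpy as np

def weibull_ols(failure_times):
    """Estimate the Weibull shape (beta) and scale (eta) by OLS on the
    linearised CDF:  ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta),
    with F approximated by Bernard's median rank (i - 0.3)/(n + 0.4)."""
    t = np.sort(np.asarray(failure_times, float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    beta, intercept = np.polyfit(np.log(t), np.log(-np.log(1.0 - F)), 1)
    eta = np.exp(-intercept / beta)
    return beta, eta

def optimal_pm_age(beta, eta, c_p, c_f, t_grid):
    """Classic age-replacement policy (an assumed cost model): replace
    preventively at age t for cost c_p, or on failure for the larger
    cost c_f; pick the t on t_grid minimising long-run cost per unit time."""
    R = lambda t: np.exp(-(t / eta) ** beta)   # Weibull reliability
    best_t, best_rate = None, float("inf")
    for t in t_grid:
        u = np.linspace(0.0, t, 400)
        Ru = R(u)
        # expected cycle length = integral of R(u) du (trapezoid rule)
        mean_cycle = float(np.sum((Ru[:-1] + Ru[1:]) * np.diff(u)) / 2.0)
        rate = (c_p * R(t) + c_f * (1.0 - R(t))) / mean_cycle
        if rate < best_rate:
            best_t, best_rate = t, rate
    return best_t
```

A finite optimal age exists only for an increasing failure rate (beta > 1), which is the usual justification for scheduling preventive maintenance at all.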
The application of ultrafiltration (UF) and nanofiltration (NF) processes to the treatment of raw produced water has been investigated in the present study. Experiments on both processes were performed in a laboratory unit operated in a cross-flow pattern. Various types of hollow-fiber membranes were utilized in this study: a polyvinyl chloride (PVC) UF membrane, two different polyethersulfone (PES) NF membranes, and a polyphenylsulfone (PPSU) NF membrane. It was found that the turbidity removal of the treated water is higher than 95 % using the UF and NF membranes. The chemical oxygen demand COD (160 mg/l) and oil content (26.8 mg/l) found after treatment conform to the allowable limits set
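Removal percentages of the kind quoted above are conventionally computed from feed and permeate concentrations; a one-line helper (the concentrations in the example are hypothetical, not the study's measurements):

```python
def removal_efficiency(feed, permeate):
    """Percentage removal of a contaminant across a membrane:
    100 * (1 - C_permeate / C_feed). Units cancel (e.g. mg/l or NTU)."""
    return 100.0 * (1.0 - permeate / feed)

# Hypothetical example: feed turbidity 200 NTU, permeate 10 NTU
# gives a removal of 95 %.
```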
Drilling fluid loss during drilling operations is an undesirable, expensive, and potentially hazardous problem.
The Nasiriyah oil field is one of the Iraqi oil fields that suffer from the lost circulation problem. It is known that the Dammam, um-Radoma, Tayarat, Shiranish, and Hartha formations are the layers in which circulation losses are detected. Different types of lost circulation materials (LCMs), ranging from granular to flake and fibrous, were previously used to treat this problem.
This study presents the application of rice as a lost circulation material used to mitigate and stop losses when partial or total losses occur.
The experim
In this paper, we deal with the problem of games with fuzzy payoffs when there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and utilize three different ranking-function algorithms. We then compare these three ranking algorithms using trapezoidal fuzzy numbers so that the decision maker can obtain the best gains.
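Ranking functions of the kind mentioned above map a trapezoidal fuzzy number (a, b, c, d) to a crisp value so payoffs can be compared. The three rankings below (simple average, graded mean integration, and centroid) are common choices and are assumptions, not necessarily the paper's three algorithms:

```python
def rank_mean(tfn):
    """Simple average ranking of a trapezoidal fuzzy number (a, b, c, d)."""
    a, b, c, d = tfn
    return (a + b + c + d) / 4.0

def rank_graded_mean(tfn):
    """Graded mean integration representation: (a + 2b + 2c + d) / 6."""
    a, b, c, d = tfn
    return (a + 2 * b + 2 * c + d) / 6.0

def rank_centroid(tfn):
    """x-coordinate of the centroid of the trapezoidal membership shape."""
    a, b, c, d = tfn
    denom = 3.0 * (c + d - a - b)
    if denom == 0.0:                 # degenerate (crisp) number a=b=c=d
        return (a + d) / 2.0
    return (c ** 2 + d ** 2 + c * d - a ** 2 - b ** 2 - a * b) / denom
```

All three agree on the midpoint of a symmetric fuzzy number but can order skewed payoffs differently, which is precisely why comparing rankings matters for the decision maker.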
In this paper, we implement and examine a Simulink model with electroencephalography (EEG) to control several actuators based on brain waves. This will be in great demand since it is useful for individuals who are unable to access control units that require direct human contact. Initially, ten volunteers with a wide age range (20-66) participated in this study, and statistical measurements were first calculated for all eight channels. Then the number of channels was reduced by half according to the activation of brain regions within the utilized protocol, and the processing time also decreased. Consequently, four of the participants (three males and one female) were chosen to examine the Simulink model duri
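Per-channel statistical measurements of the kind mentioned above can be sketched as follows; the specific statistics (mean, standard deviation, RMS amplitude) are illustrative choices, since the study's exact measures are not listed here:

```python
import numpy as np

def channel_stats(eeg):
    """Per-channel summary statistics for an (n_channels, n_samples)
    EEG array: mean, standard deviation, and RMS amplitude."""
    eeg = np.asarray(eeg, float)
    return {
        "mean": eeg.mean(axis=1),
        "std": eeg.std(axis=1),
        "rms": np.sqrt((eeg ** 2).mean(axis=1)),
    }
```

Such summaries per channel are one simple basis for deciding which brain regions (and hence channels) are most active under a given protocol.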
The paper presents an intensive study of neural synchronization in order to address the challenges that prevent adopting it as an alternative key-exchange algorithm. The results obtained from implementing neural synchronization in the proposed system address two challenges: verifying that synchronization has been established between the two neural networks, and publicly initializing the input vector for each party. Solutions are presented and a mathematical model is developed; since the proposed system focuses on stream ciphers, a system of LFSRs (linear feedback shift registers) with a balanced memory has been used to generate the key. The initializations of these LFSRs are the neural weights after achiev
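A minimal sketch of LFSR-based keystream generation as used in a stream cipher of the kind described above. The register length, tap positions, and seed are illustrative (the paper seeds the registers with neural weights, and the balanced-memory detail is omitted here):

```python
class LFSR:
    """Fibonacci LFSR: `state` is a bit list (index 0 = newest bit),
    `taps` are state indices XORed together to form the feedback bit."""
    def __init__(self, seed_bits, taps):
        if not any(seed_bits):
            raise ValueError("an all-zero seed locks the LFSR at zero")
        self.state = list(seed_bits)
        self.taps = taps

    def step(self):
        fb = 0
        for t in self.taps:
            fb ^= self.state[t]
        out = self.state[-1]                 # emit the oldest bit
        self.state = [fb] + self.state[:-1]  # shift the feedback bit in
        return out

    def keystream(self, n):
        return [self.step() for _ in range(n)]

def xor_cipher(bits, key_bits):
    """Stream cipher core: XOR the message bits with the keystream.
    Applying it twice with the same keystream recovers the message."""
    return [b ^ k for b, k in zip(bits, key_bits)]
```

With a primitive feedback polynomial (here x^4 + x^3 + 1, via taps [0, 3]) a 4-bit register cycles through all 2^4 - 1 nonzero states before repeating, which is why tap selection matters for key quality.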
In this paper, three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multiwavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) technique. Daubechies and Haar filters are used in the discrete wavelet transform, and Critically Sampled preprocessing is used in the discrete multiwavelet transform. The aim is to increase the compression ratio (CR) as the level of the transformation increases in the 3-D case, so the compression ratio is measured at each level. To obtain good compression, the image data properties were measured, such as the image entropy (He) and percent r
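The Haar wavelet step and the entropy measure He mentioned above can be sketched in 2-D as follows. This is an unnormalised one-level transform for illustration only; the paper applies two levels in 3-D:

```python
import numpy as np

def haar_split(x, axis):
    """One-level unnormalised Haar split along `axis`:
    pairwise averages (lowpass) and differences (highpass)."""
    even = np.take(x, np.arange(0, x.shape[axis], 2), axis=axis)
    odd = np.take(x, np.arange(1, x.shape[axis], 2), axis=axis)
    return (even + odd) / 2.0, (even - odd) / 2.0

def haar2d(img):
    """One level of a separable 2-D Haar DWT: LL, LH, HL, HH subbands."""
    L, H = haar_split(img, axis=1)   # split along columns first
    LL, LH = haar_split(L, axis=0)   # then along rows of each half
    HL, HH = haar_split(H, axis=0)
    return LL, LH, HL, HH

def image_entropy(img):
    """Shannon entropy He (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(np.asarray(img, np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```

Smooth regions concentrate energy in LL while the detail subbands stay near zero, which is what makes further levels of the transform (and hence higher CR) worthwhile.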
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields owing to their sensitivity to the requirements of certain improved recovery methods. However, the industry holds a huge stock of air permeability measurements against a small number of liquid permeability values, because of the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss, in loose and poorly consolidated formations, or in cas
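The paper's own correlation is not reproduced here. As an illustration of converting air to liquid permeability, the classic Klinkenberg gas-slippage model, k_air = k_liq * (1 + b / p_mean), can be fitted from measurements at several mean pressures; the intercept at infinite pressure is the equivalent liquid permeability:

```python
import numpy as np

def fit_klinkenberg(k_air, p_mean):
    """Least-squares fit of k_air = k_liq * (1 + b / p_mean):
    regress k_air on 1/p_mean; the intercept (1/p_mean -> 0) is the
    equivalent liquid permeability, and slope/intercept gives b."""
    slope, intercept = np.polyfit(1.0 / np.asarray(p_mean, float),
                                  np.asarray(k_air, float), 1)
    k_liq = intercept
    b = slope / k_liq
    return k_liq, b
```

This illustrates why air permeabilities systematically overstate liquid permeability at laboratory pressures, which is the motivation for a conversion correlation in the first place.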