Proxy-based sliding mode control (PSMC) is an improved version of PID control that combines the features of PID control and sliding mode control (SMC) with continuous dynamic behaviour. However, the stability of the control architecture may not be well addressed. Consequently, this work focuses on modifying the original version of PSMC by adding an adaptive approximation compensator (AAC) term for vibration control of an Euler-Bernoulli beam. The role of the AAC term is to compensate for unmodelled dynamics and to simplify the stability proof. The stability of the proposed control algorithm is systematically proved using Lyapunov theory. The multi-modal equation of motion is derived using the Galerkin method. The state variables of the multi-modal equation are expressed in terms of modal amplitudes, which are regulated by the proposed control system. The proposed control structure is implemented on a simply supported beam with two piezo-patches, whose locations on the beam are chosen optimally. The simulation experiments are performed in MATLAB/Simulink. A detailed comparison study covering three scenarios is carried out. In scenario 1, the smart beam is disturbed while no feedback loop is established (open-loop system). In scenario 2, a PD controller is applied to the vibrating beam, whereas scenario 3 implements the PSMC+AAC. For all scenarios, two types of disturbance are applied separately: 1) an impulse force of 1 N peak and 1 s pulse width, and 2) a sinusoidal disturbance of 0.5 N amplitude and 20 Hz frequency. For the impulse disturbance, the results show the superiority of the PSMC+AAC over the conventional PD control, whereas both the PSMC+AAC and the PD control perform well in the case of the sinusoidal disturbance, where the superiority of the PSMC+AAC is not clear.
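As a rough illustration of the open-loop versus feedback comparison (scenarios 1 and 2), the Python sketch below simulates a single vibration mode of a beam under a 1 N, 1 s pulse. The modal frequency, damping ratio, and PD gains are hypothetical stand-ins, not values from the paper, and the PSMC+AAC law itself is not reproduced here.

```python
import numpy as np

# Single vibration mode of the beam: q'' + 2*zeta*w*q' + w^2*q = f + u
w, zeta = 2 * np.pi * 20.0, 0.01   # modal frequency (rad/s) and damping (hypothetical)
kp, kd = 2000.0, 50.0              # PD gains (illustrative, not the paper's values)
dt, T = 1e-4, 3.0

def simulate(controlled):
    q, qd = 0.0, 0.0
    history = []
    for i in range(int(T / dt)):
        t = i * dt
        f = 1.0 if t < 1.0 else 0.0                 # 1 N pulse, 1 s width
        u = -(kp * q + kd * qd) if controlled else 0.0
        qdd = f + u - 2 * zeta * w * qd - w**2 * q
        qd += qdd * dt                              # semi-implicit Euler step
        q += qd * dt
        history.append(q)
    return np.array(history)

open_loop, closed_loop = simulate(False), simulate(True)
print(np.abs(open_loop).max(), np.abs(closed_loop).max())
```

With the extra derivative damping, the controlled peak amplitude stays below the open-loop peak and the residual vibration decays much faster after the pulse ends.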
Empirical equations for estimating the thickening time and compressive strength of bentonitic class "G" cement slurries were derived as functions of the water-to-cement ratio and the apparent viscosity (for any ratio). Having such equations allows the thickening time and compressive strength values to be extracted easily in the oil field, saving time by avoiding the reference laboratory tests, such as the pressurized-consistometer test for thickening time and the hydraulic-cement-mortar test with a 24-hour water bath for compressive strength, which may take more than one day.
This paper presents a comparison of denoising techniques based on a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); the procedure is iterated a second time to further improve the denoising performance. Other enhancement filters were also used: an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighbourhood of each pixel; a median filter, in which each output pixel contains the median value of the M-by-N neighbourhood around the corresponding pixel in the noisy input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method
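A minimal sketch of three of the benchmark filters mentioned above (adaptive Wiener, median, and Gaussian low-pass), applied to a synthetic grayscale image with constant-power additive noise. The test image, noise level, and window sizes are illustrative choices, and the PCA-LPG stage itself is not implemented here.

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter, gaussian_filter

rng = np.random.default_rng(0)
# Synthetic grayscale test image: smooth gradient plus a bright square
clean = np.tile(np.linspace(0, 1, 64), (64, 1))
clean[20:40, 20:40] = 1.0
noisy = clean + rng.normal(0, 0.1, clean.shape)   # constant-power additive noise

def mse(img):
    return float(np.mean((img - clean) ** 2))

results = {
    "noisy":    mse(noisy),
    "wiener":   mse(wiener(noisy, (5, 5))),       # adaptive Wiener, 5x5 neighbourhood
    "median":   mse(median_filter(noisy, size=3)),
    "gaussian": mse(gaussian_filter(noisy, sigma=1.0)),
}
for name, err in results.items():
    print(f"{name:8s} MSE = {err:.5f}")
```

All three filters should reduce the mean squared error relative to the noisy input, with the usual trade-offs: the Gaussian filter blurs edges, while the median filter preserves them better.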
Prediction of accurate values of the residual entropy (SR) is a necessary step in the calculation of the entropy. In this paper, different equations of state were tested against the available 2791 experimental data points of 20 pure superheated-vapor compounds (14 pure nonpolar compounds + 6 pure polar compounds). The Average Absolute Deviation (AAD) in SR over the 2791 experimental data points of all 20 pure compounds (nonpolar and polar) when using the equations of Lee-Kesler, Peng-Robinson, the Virial equation truncated to the second and to the third term, and Soave-Redlich-Kwong was 4.0591, 4.5849, 4.9686, 5.0350, and 4.3084 J/mol.K, respectively. These results show that the Lee-Kesler equation was the best (most accurate) one.
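The Average Absolute Deviation used to rank the equations of state can be sketched as follows; the arrays below are made-up stand-ins for the experimental residual-entropy points and the model predictions, not data from the paper.

```python
import numpy as np

def aad(sr_exp, sr_calc):
    """Average Absolute Deviation, in the same units as the data (J/mol.K)."""
    sr_exp = np.asarray(sr_exp, dtype=float)
    sr_calc = np.asarray(sr_calc, dtype=float)
    return float(np.mean(np.abs(sr_calc - sr_exp)))

sr_exp  = [-10.2, -8.7, -12.4, -9.9]   # illustrative experimental values
sr_calc = [-9.8, -9.1, -12.0, -10.4]   # illustrative model predictions
print(aad(sr_exp, sr_calc))
```

Computing this quantity for each equation of state over the full data set and picking the smallest value is exactly the ranking procedure described above.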
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X); one of its assumptions is homogeneity of the variance. The dependent variable is a binary response taking two values (one when a specific event occurred and zero when it did not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model by adopting the Jackna
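A minimal numerical sketch of maximum-likelihood estimation for a binary-response logistic model, with an optional ridge penalty to stabilise a nearly collinear design. The simulated data, coefficients, and penalty value are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated binary-response data (hypothetical stand-in for injured/uninjured)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
true_beta = np.array([-0.5, 1.0, 1.0])
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)

def fit_logistic(X, y, ridge=0.0, iters=50):
    """Maximum-likelihood logistic regression via Newton-Raphson,
    with a ridge penalty added to the Hessian for collinear designs."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))
        grad = X.T @ (y - mu) - ridge * beta
        H = (X.T * (mu * (1 - mu))) @ X + ridge * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(H, grad)
    return beta

beta_hat = fit_logistic(X, y, ridge=1.0)
print(beta_hat)
```

With near-collinear columns the individual slopes are poorly identified, but their sum (the combined effect of x1 and x2) is estimated stably once the ridge term keeps the Hessian well conditioned.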
Evaluation of the Dot-ELISA test for diagnosis of visceral leishmaniasis in infected children
<p>Mobility management protocols are essential in the new research area of the Internet of Things (IoT), as the static attributes of nodes are no longer dominant in the current environment. The Proxy MIPv6 (PMIPv6) protocol is a network-based mobility management protocol in which the mobility process relies on the network entities, namely the Mobile Access Gateways (MAGs) and the Local Mobility Anchor (LMA). PMIPv6 is considered the most suitable mobility protocol for WSNs, as it relieves the sensor nodes from participating in the mobility signaling. However, in PMIPv6, separate signaling is required for each mobile node (MN) registration, which may increase the network signaling overhead and the total handoff latency.
The goal of this research is to develop a sustainable rating system for roadway projects in Iraq covering all life-cycle stages of the projects: planning, design, construction, and operation and maintenance. This paper investigates the criteria and weightings of the suggested roadway rating system based on sustainable planning activities. The methodology starts by suggesting a group of sustainability criteria for the planning stage and assigning a weight of 1-5 points to each of them. Data were then collected using a closed questionnaire directed to a group of roadway experts in order to verify the criteria weightings based on the relative importance of the roadway-related impacts.
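A minimal sketch of how such a rating system aggregates criterion scores: each criterion carries a weight of 1-5 points, and a project's rating is the weighted total it earns. The criterion names and scores below are hypothetical examples, not the paper's verified criteria.

```python
# Hypothetical planning-stage criteria: name -> (weight 1-5, achieved score 0-1)
criteria = {
    "land use planning":     (5, 0.8),
    "public involvement":    (3, 0.6),
    "alternatives analysis": (4, 0.5),
}

total_weight = sum(w for w, _ in criteria.values())
earned = sum(w * s for w, s in criteria.values())
percentage = 100 * earned / total_weight
print(f"{earned:.1f} of {total_weight} points ({percentage:.1f}%)")
```

The percentage of available points is what a rating system typically maps onto certification levels.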
In this research we study the synodic month of the moon and its relationship with the mean anomaly of the moon's orbit and the A.D. date over a long period of time (100 years). We designed a computer program that calculates the period of the synodic months and the coordinates of the moon at the moment of the new moon with high accuracy. During the 100 years there are 1236 periods of synodic months. We found that when the new moon occurs near perigee (mean anomaly = 0°), the length of the synodic month is at a minimum. Similarly, when the new moon occurs near apogee (mean anomaly = 180°), the length of the synodic month reaches a maximum. The shortest synodic month occurred on 2053/1/16 and lasted 29.27436 days. The longest synodic month began on 2008/11/27 a
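A quick arithmetic check on the reported count of synodic months in a century, using the well-known mean synodic month length of about 29.530589 days (the per-month lengths vary around this mean with the anomaly, as described above):

```python
# Number of complete mean synodic months in a 100-year span
MEAN_SYNODIC_DAYS = 29.530589
days_in_century = 100 * 365.25
count = days_in_century / MEAN_SYNODIC_DAYS
print(int(count))   # complete periods in 100 years
```

The integer part comes out to 1236, consistent with the count reported in the abstract.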
An accounting unit is looked upon as a unit established for the purpose of achieving its goals and programmes for an unlimited time, unless otherwise takes place, such as liquidation, whether voluntary or mandatory. Thus the going-concern assumption is considered the logical foundation on which the familiar accounting principles are based. The future of a company reflects its financial status and position and the extent of its ability to face future events. Hence the success and continuity of its activities depend on the extent of the company's ability to generate profits and to retain appropriate liquidity to service its debts. Therefore the financial statements of the company are considered to be on
Internet image retrieval is an interesting task that needs efforts from image processing and relationship-structure analysis. In this paper, a compression method is proposed for sending more than one photo via the Internet based on image retrieval. First, face detection is implemented based on local binary patterns. The background is detected based on matching global self-similarities and compared with the backgrounds of the remaining images. The proposed algorithm bridges the gap between present image-indexing technology, developed in the pixel domain, and the fact that an increasing number of images stored on the computer are already compressed by JPEG at the source. The similar images are found, and a few images are sent inst
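A minimal sketch of the basic 3x3 local binary pattern operator that LBP-based face detectors build texture histograms from; this simplified version (no rotation invariance or uniform-pattern mapping) is an illustration, not the paper's detector.

```python
import numpy as np

def lbp_8neighbors(img):
    """Basic 3x3 local binary pattern: each interior pixel is encoded by
    which of its eight neighbours are >= the centre value."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Offsets of the 8 neighbours, clockwise from top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[1:h - 1, 1:w - 1]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= (neigh >= centre).astype(np.uint8) << bit
    return out

patch = np.array([[9, 9, 9],
                  [1, 5, 9],
                  [1, 1, 1]])
print(lbp_8neighbors(patch))   # the top row and right neighbour set bits 0-3
```

Histograms of these codes over image sub-regions form the feature vector that a classifier then uses to decide whether a region contains a face.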