The effect of a long transmission line (TL) between the hydraulic control valve and the actuator is sometimes essential. This study is concerned with modeling the TL that carries the oil from the electro-hydraulic servovalve to the actuator. The pressure inside the TL is controlled by the electro-hydraulic servovalve through a voltage supplied to the servovalve amplifier. The flow rate through the TL is simulated using the lumped π-element electrical analogy method for laminar flow. The control voltage supplied to the servovalve can be applied either directly from a voltage function generator or indirectly from a C++ program connected to the DAP-view program built into the DAP-card data acquisition…
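The lumped π-element analogy mentioned above maps line friction, fluid inertia, and oil compressibility onto an electrical R-L-C network. A minimal sketch, assuming a round line and the standard laminar-flow lumped-parameter formulas; all numeric property values below are illustrative assumptions, not values from the study:

```python
import math

def pi_element(mu, rho, beta, d, L):
    """Return (R, I, C) for one pi segment of a round hydraulic line.

    mu   -- dynamic viscosity [Pa.s]
    rho  -- oil density [kg/m^3]
    beta -- effective bulk modulus [Pa]
    d    -- inner diameter [m]
    L    -- segment length [m]
    """
    A = math.pi * d ** 2 / 4.0                  # cross-sectional area
    R = 128.0 * mu * L / (math.pi * d ** 4)     # Hagen-Poiseuille resistance [Pa.s/m^3]
    I = rho * L / A                             # fluid inertance [kg/m^4]
    C = A * L / beta                            # hydraulic capacitance [m^3/Pa]
    return R, I, C

# Example: 10 m of 10 mm line carrying mineral oil (assumed properties).
R, I, C = pi_element(mu=0.03, rho=870.0, beta=1.5e9, d=0.01, L=10.0)
```

In the π topology the capacitance is split, C/2 at each end of the series R-I branch, which is what makes the segment symmetric and suitable for chaining several segments along a long line.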
Power-generation plants have become common worldwide; one such plant is the steam power plant. In such plants, the various moving parts of heavy machines generate a great deal of noise, and operators are subjected to high noise levels. Exposure to high noise levels leads to psychological as well as physiological problems and various other ill effects, and it results in deteriorated work efficiency, although the exact nature of the impact on work performance is still unknown. To predict the deterioration of work efficiency, neuro-fuzzy tools are being used in research. It has been established that a neuro-fuzzy computing system helps in the identification and analysis of fuzzy models. The last decade has seen substantial growth in the development of various neuro-fuzzy systems.
This paper presents a novel idea: it investigates, for the first time, the rescue effect of the prey together with a fluctuation effect, and proposes a modified predator-prey model that forms a non-autonomous system. An approximation method is utilized to convert the non-autonomous model into an autonomous one, simplifying the mathematical analysis and the study of the dynamical behaviors. Some theoretical properties of the proposed autonomous model, such as boundedness, stability, and the Kolmogorov conditions, are studied. The paper's analytical results demonstrate that the dynamic behaviors are globally stable and that the rescue effect improves the likelihood of coexistence compared to the case without it. Furthermore, numerical simulations…
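For readers unfamiliar with how coexistence shows up numerically in such models, the sketch below integrates the classical predator-prey system with logistic prey growth by forward Euler and checks convergence to the interior equilibrium. This is the textbook baseline model, not the paper's modified rescue/fluctuation model, and every parameter value is an illustrative assumption:

```python
# Classical prey-predator system with logistic prey growth,
# integrated by forward Euler (a sketch, not the paper's model).
def simulate(r=1.0, K=2.0, a=1.0, b=0.5, d=0.5,
             x0=0.4, y0=0.9, dt=0.01, T=100.0):
    x, y = x0, y0                    # prey, predator densities
    for _ in range(int(T / dt)):
        dx = r * x * (1.0 - x / K) - a * x * y   # logistic growth - predation
        dy = b * a * x * y - d * y               # conversion - predator death
        x += dt * dx
        y += dt * dy
    return x, y

x_end, y_end = simulate()
# Interior equilibrium: x* = d/(a*b) = 1.0, y* = (r/a)*(1 - x*/K) = 0.5
```

With these parameters the interior equilibrium is locally stable, so the trajectory spirals into (1.0, 0.5); a rescue term, informally, would further lift the prey's growth at low density and widen the region where such coexistence persists.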
In this paper, a compartmental differential epidemic model of COVID-19 transmission that accounts for the effects of media coverage is constructed and analyzed. The model divides the population into eight distinct compartments: susceptible individuals, exposed individuals, the quarantine class, infected individuals, the isolated class, infectious material in the environment, media coverage, and recovered individuals. The qualitative analysis of the model indicates that the disease-free equilibrium point is asymptotically stable when the basic reproduction number R0 is less than one; conversely, the endemic equilibrium is globally asymptotically stable when R0 is greater than one. In addition, a sensitivity analysis is conducted to determine which…
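The R0 threshold behavior described above can be illustrated with a much smaller model. The sketch below uses a plain SIR system (three compartments, not the paper's eight, and no media-coverage term) with assumed parameters, showing that the infection dies out when R0 = β/γ < 1 and produces an outbreak when R0 > 1:

```python
# Toy SIR model integrated by forward Euler; illustrates the
# basic-reproduction-number threshold only (all values assumed).
def peak_infected(beta, gamma, N=1000.0, I0=1.0, dt=0.05, T=400.0):
    S, I = N - I0, I0
    peak = I
    for _ in range(int(T / dt)):
        new_inf = beta * S * I / N       # incidence term
        S += dt * (-new_inf)
        I += dt * (new_inf - gamma * I)  # recovery at rate gamma
        peak = max(peak, I)
    return peak

# R0 = beta / gamma
peak_low = peak_infected(beta=0.1, gamma=0.2)   # R0 = 0.5: infection decays
peak_high = peak_infected(beta=0.6, gamma=0.2)  # R0 = 3.0: epidemic peak
```

When R0 < 1 each infective replaces itself with fewer than one new case, so the infected count never rises above its initial value; when R0 > 1 a substantial fraction of the population is infected at the peak.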
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. Among the techniques utilized in this paper are exponential smoothing, ARIMA, artificial neural network (ANN) models, and prediction combination models. The study's clearest finding is that artificial intelligence models improve the results of compound prediction models. The second key finding is that a strong combination forecasting model should be used, one that responds to the multiple fluctuations occurring in the Bitcoin time series and improves the error. Based on the results, the prediction accuracy…
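As a minimal sketch of the combination idea, the code below fits simple exponential smoothing (SES) and a naive random-walk forecast to a synthetic price-like series, then combines their one-step forecasts with inverse-MSE weights so the historically more accurate model gets the larger weight. The series, the smoothing constant, and the weighting rule are illustrative assumptions, not the study's Bitcoin data or its exact combination scheme:

```python
import numpy as np

def ses_fitted(y, alpha=0.3):
    """One-step-ahead SES forecasts: f[t] predicts y[t]."""
    f = np.empty_like(y)
    f[0] = y[0]
    for t in range(1, len(y)):
        f[t] = alpha * y[t - 1] + (1 - alpha) * f[t - 1]
    return f

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.5, 1.0, 200)) + 100.0  # toy price-like series

alpha = 0.3
f_ses = ses_fitted(y, alpha)
f_naive = np.concatenate(([y[0]], y[:-1]))        # naive: repeat last value

# Inverse-MSE weights over the fitted period (skip the warm-up point 0).
mse = np.array([np.mean((y[1:] - f[1:]) ** 2) for f in (f_ses, f_naive)])
w = (1.0 / mse) / np.sum(1.0 / mse)

next_ses = alpha * y[-1] + (1 - alpha) * f_ses[-1]  # SES one-step forecast
next_naive = y[-1]
combined = w[0] * next_ses + w[1] * next_naive      # convex combination
```

Because the weights are positive and sum to one, the combined forecast always lies between the two component forecasts; the point of combining is that its error tends to be more robust across regimes than either model alone.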
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities amongst them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. A comparison of these methods has been made according to their results in estimating the component parameters. Also, observation membership has been inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the others.
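The component-membership idea above can be shown in a few lines: given mixture parameters, each observation's posterior probability of belonging to each component (its "responsibility") is computed, and a hard classification takes the largest one. The sketch below uses a two-component Gaussian mixture with known parameters, whereas the paper's methods estimate the parameters from data; all numbers are illustrative assumptions:

```python
import numpy as np

weights = np.array([0.4, 0.6])   # assumed mixing proportions
means = np.array([0.0, 5.0])     # assumed component means
sds = np.array([1.0, 1.0])       # assumed component std devs

def responsibilities(x):
    """Posterior P(component k | x) for each observation in x."""
    x = np.atleast_1d(x)[:, None]
    # Weighted normal densities, one column per component.
    dens = (weights / (sds * np.sqrt(2.0 * np.pi))
            * np.exp(-0.5 * ((x - means) / sds) ** 2))
    return dens / dens.sum(axis=1, keepdims=True)

x = np.array([-0.5, 1.0, 4.8, 6.0])
post = responsibilities(x)
labels = post.argmax(axis=1)     # hard assignment by highest posterior
```

The same responsibility computation is the E-step of the EM algorithm commonly used to fit such mixtures, which is why membership inference and parameter estimation go hand in hand.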
Industrial effluents loaded with heavy metals pose hazards to humans and other forms of life. Conventional approaches, such as electroplating, ion exchange, and membrane processes, are used for the removal of copper, cadmium, and lead, but they are often cost-prohibitive and show low efficiency at low metal-ion concentrations. Biosorption can be considered an alternative that has proven more efficient and economical for removing these metal ions. Biosorbents used include fungi, yeasts, oil palm shells, coir pith carbon, peanut husks, and olive pulp. Recently, low-cost natural products have also been researched as biosorbents. This paper presents an attempt at the potential use of Iraqi date pits and Al-Khriet (i.e., substances l…
In this study, we made a comparison between the LASSO and SCAD methods, two penalization methods for dealing with partial quantile regression models. The Nadaraya-Watson kernel estimator was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but the SCAD method was the best according to the mean squared error (MSE) criterion, after the missing data were estimated using the mean imputation method.
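The non-parametric part and its bandwidth choice can be sketched directly: the Nadaraya-Watson estimator is a kernel-weighted local average, and a Silverman-style rule of thumb sets h from the sample scale. The data below are an assumed noiseless linear relation for illustration, not the study's data:

```python
import numpy as np

def rule_of_thumb_h(x):
    """Silverman-style rule of thumb: h = 1.06 * sigma * n^(-1/5)."""
    return 1.06 * np.std(x) * len(x) ** (-1.0 / 5.0)

def nadaraya_watson(x0, x, y, h):
    """Gaussian-kernel weighted average of y at the point x0."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

x = np.linspace(0.0, 1.0, 201)
y = 2.0 * x                              # true regression function m(x) = 2x
h = rule_of_thumb_h(x)
m_hat = nadaraya_watson(0.5, x, y, h)    # estimate at an interior point
```

At the interior point 0.5 the kernel weights are symmetric over this symmetric design, so the estimate recovers m(0.5) = 1.0 exactly here; with noisy data and boundary points the choice of h trades bias against variance, which is what the rule of thumb approximates.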
This paper deals with defining the Burr-XII distribution and how to obtain its PDF and CDF, since this distribution is a failure distribution compounded from two failure models, the Gamma model and the Weibull model. Some equipment may have many important parts whose representative probability distributions may be of different types, so the Burr distribution, through its different compound formulas, was found to be the best model to study, and its parameters were estimated to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk, and travel time. It has two shape parameters…
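For reference, the standard two-parameter Burr-XII has CDF F(x) = 1 − (1 + x^c)^(−k) and PDF f(x) = c·k·x^(c−1)·(1 + x^c)^(−(k+1)) for x > 0, with shape parameters c and k. A minimal sketch evaluating both and checking their consistency numerically; the parameter values are illustrative assumptions:

```python
import numpy as np

def burr12_cdf(x, c, k):
    """CDF of the two-parameter Burr-XII: F(x) = 1 - (1 + x^c)^(-k)."""
    x = np.asarray(x, dtype=float)
    return 1.0 - (1.0 + x ** c) ** (-k)

def burr12_pdf(x, c, k):
    """PDF: f(x) = c*k * x^(c-1) * (1 + x^c)^(-(k+1))."""
    x = np.asarray(x, dtype=float)
    return c * k * x ** (c - 1.0) * (1.0 + x ** c) ** (-(k + 1.0))

c, k = 2.0, 3.0                      # assumed shape parameters
x = np.linspace(0.1, 5.0, 50)
F = burr12_cdf(x, c, k)              # increases from ~0 toward 1
f = burr12_pdf(x, c, k)              # derivative of F
```

A quick sanity check is that a central difference of the CDF matches the PDF at interior points, and that F is monotone increasing with F(0) = 0.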