The experiment compared different methods of measuring feed pellet durability as affected by pellet die speed and particle size (mill sieve hole diameter). Feed pellet durability was assessed in five different ways: direct measurement (%), pellet lengths (%), pellet water absorption (%), durability by drop-box device (%), and durability by air-pressure device (%). Three pellet die speeds (280, 300, and 320 rpm) and three mill sieve hole diameters (2, 4, and 6 mm) were used. The results showed that increasing the pellet die speed from 280 to 300 and then to 320 rpm led to a significant decrease in feed pellet durability as measured by direct measurement, the drop-box device, and the air-pressure device, and to a significant increase in pellet water absorption, while it did not significantly affect pellet lengths. Increasing the sieve hole diameter from 2 to 4 and then to 6 mm led to a significant decrease in feed pellet durability as measured by pellet lengths, the drop-box device, and the air-pressure device, and to an increase in pellet water absorption, while it did not significantly affect direct measurement. A pellet die speed of 280 rpm combined with a sieve hole diameter of 2 mm recorded the highest pellet durability for all methods: direct measurement 94.66%, pellet lengths 85.94%, drop-box device 93.42%, and air-pressure device 91.21%, with the lowest pellet water absorption, 38.98%.
In this paper, we derive an estimator of the reliability function of the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator relative to the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
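The abstract's Bayesian estimator is derived in the paper itself and is not reproduced here, but the Monte Carlo comparison it describes can be sketched for the two classical competitors. The sketch below simulates Laplace samples and computes the mean squared error of the plug-in reliability estimates under maximum likelihood and the method of moments; all parameter values, sample sizes, and the evaluation point `t` are illustrative assumptions, not the paper's settings.

```python
import math
import random
import statistics

def laplace_reliability(t, mu, b):
    """True reliability R(t) = P(X > t) for a Laplace(mu, b) variable."""
    if t >= mu:
        return 0.5 * math.exp(-(t - mu) / b)
    return 1.0 - 0.5 * math.exp((t - mu) / b)

def mle_estimates(sample):
    """Laplace MLEs: location = sample median, scale = mean |deviation| from it."""
    mu_hat = statistics.median(sample)
    b_hat = sum(abs(x - mu_hat) for x in sample) / len(sample)
    return mu_hat, b_hat

def moment_estimates(sample):
    """Moment estimators: sample mean, and sqrt(variance / 2) since Var = 2 b^2."""
    mu_hat = statistics.fmean(sample)
    b_hat = math.sqrt(statistics.pvariance(sample) / 2.0)
    return mu_hat, b_hat

def simulate_mse(mu=0.0, b=1.0, t=1.0, n=30, reps=2000, seed=1):
    """Monte Carlo MSE of plug-in reliability estimates at the point t."""
    rng = random.Random(seed)
    true_r = laplace_reliability(t, mu, b)
    sse = {"mle": 0.0, "moments": 0.0}
    for _ in range(reps):
        # Laplace variates by inverting the CDF of u ~ Uniform(-1/2, 1/2):
        # X = mu + b * sgn(u) * |ln(1 - 2|u|)|
        sample = [mu + b * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
                  for u in (rng.uniform(-0.5, 0.5) for _ in range(n))]
        for name, est in (("mle", mle_estimates), ("moments", moment_estimates)):
            mu_hat, b_hat = est(sample)
            sse[name] += (laplace_reliability(t, mu_hat, b_hat) - true_r) ** 2
    return {name: total / reps for name, total in sse.items()}
```

A Bayes estimator would slot into the same loop as a third entry, which is how efficiency comparisons of this kind are usually tabulated across parameter values and sample sizes.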
The aim of this paper is to introduce the concepts of asymptotically p-contractive and asymptotically severe accretive mappings. We also give two-step and three-step iterative methods for a finite family of asymptotically p-contractive and asymptotically severe accretive mappings and use them to solve certain types of equations.
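The abstract does not specify its schemes, so as a point of reference, a generic two-step (Ishikawa-type) iteration for approximating a fixed point of a single mapping can be sketched as below. The mapping `T = cos` and the constant step sequences are illustrative assumptions chosen only so the iteration visibly converges; they are not the paper's mapping classes or parameter choices.

```python
import math

def two_step_iteration(T, x0, alpha, beta, steps=200):
    """Generic two-step scheme:
       y_n     = (1 - beta_n) x_n + beta_n T(x_n)
       x_{n+1} = (1 - alpha_n) x_n + alpha_n T(y_n)
    """
    x = x0
    for n in range(steps):
        a, b = alpha(n), beta(n)
        y = (1 - b) * x + b * T(x)
        x = (1 - a) * x + a * T(y)
    return x

# Illustrative run: cos is a contraction near its fixed point x* = cos(x*)
# (the Dottie number, approximately 0.739085).
fixed = two_step_iteration(math.cos, 1.0, lambda n: 0.5, lambda n: 0.5)
```

A three-step scheme adds one more averaged evaluation of `T` between `y_n` and `x_{n+1}` in the same pattern.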
The primary objective of this paper is to propose and implement effective computational methods (DECMs) to obtain analytic and approximate solutions of the nonlocal one-dimensional parabolic equation, which is used to model specific real-world applications. The methods developed use orthogonal basis functions, namely the Bernstein, Legendre, Chebyshev, Hermite, and Bernoulli polynomials, to express the solution as a double power series. The given partial differential equation is thereby reduced to a system of linear algebraic equations that can be solved using Mathematica® 12. The DECM techniques have been applied to solve some s…
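The core reduction the abstract describes (expand the unknown in a polynomial basis, then turn the differential equation into a linear algebraic system) can be illustrated on a much simpler model problem than the paper's nonlocal parabolic equation. The sketch below solves the boundary-value problem u''(x) = 12x² on [0, 1] with u(0) = u(1) = 0, whose exact solution is u = x⁴ − x, by collocation in a monomial basis; the problem, basis, and collocation points are illustrative assumptions, not the paper's.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting (works on copies)."""
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def collocation_coeffs(f, degree=6):
    """Solve u'' = f on [0,1] with u(0) = u(1) = 0, writing u(x) = sum a_k x^k.
    Boundary conditions plus collocation at interior points give a square
    linear algebraic system for the coefficients a_k."""
    n = degree + 1
    rows, rhs = [], []
    rows.append([1.0] + [0.0] * degree); rhs.append(0.0)   # u(0) = 0
    rows.append([1.0] * n); rhs.append(0.0)                # u(1) = 0
    pts = [(j + 1) / (n - 1) for j in range(n - 2)]        # interior collocation points
    for x in pts:
        rows.append([k * (k - 1) * x ** (k - 2) if k >= 2 else 0.0
                     for k in range(n)])
        rhs.append(f(x))
    return solve_linear(rows, rhs)

coeffs = collocation_coeffs(lambda x: 12 * x ** 2)
```

Because the exact solution is itself a polynomial of degree 4, the coefficients come out essentially exact; for non-polynomial solutions one raises the degree and (as in the paper) prefers orthogonal bases such as Chebyshev or Legendre for better conditioning than raw monomials.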
Radiation therapy plays an important role in improving outcomes for breast cancer cases. To obtain an appropriate estimate of the number of radiation doses given to a patient after tumor removal, some nonparametric regression methods were compared. The kernel method was applied through the Nadaraya-Watson estimator to estimate the regression function for smoothing the data, with the smoothing parameter h chosen according to the Normal Scale Method (NSM), the Least Squares Cross-Validation method (LSCV), and the Golden Rate Method (GRM). These methods were compared by simulation for samples of three sizes; the NSM proved best according to the average Mean Squares Error criterion, and the LSCV method proved best according to the average Mean Absolute Error criterion.
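The Nadaraya-Watson estimator and the cross-validated choice of h described above can be sketched as follows. A Gaussian kernel is assumed for concreteness (the abstract does not name one), and the data and candidate bandwidth grid are illustrative.

```python
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def nadaraya_watson(x0, xs, ys, h):
    """Kernel-weighted local mean:
       m_h(x0) = sum_i K((x0 - x_i)/h) y_i / sum_i K((x0 - x_i)/h)."""
    w = [gaussian_kernel((x0 - xi) / h) for xi in xs]
    s = sum(w)
    return sum(wi * yi for wi, yi in zip(w, ys)) / s if s > 0 else float("nan")

def lscv_score(xs, ys, h):
    """Leave-one-out squared prediction error for bandwidth h (LSCV)."""
    err = 0.0
    for i in range(len(xs)):
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        err += (ys[i] - nadaraya_watson(xs[i], xs_i, ys_i, h)) ** 2
    return err / len(xs)

def best_bandwidth(xs, ys, grid):
    """Pick the h in the candidate grid minimizing the LSCV score."""
    return min(grid, key=lambda h: lscv_score(xs, ys, h))

# Illustrative data: a smooth signal sampled on a grid.
xs = [i / 20 for i in range(21)]
ys = [math.sin(2 * math.pi * x) for x in xs]
h_best = best_bandwidth(xs, ys, [0.02, 0.05, 0.1, 0.3])
```

The normal-scale rule instead sets h from a plug-in formula based on the sample's scale, trading the cross-validation loop for a closed-form choice.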
Summary
In this research, we examined factorial experiments and studied the significance of the main effects, the interactions of the factors, and their simple effects using the F test (ANOVA) to analyze the data of a factorial experiment. The analysis of variance rests on several assumptions; when one of these conditions is violated, a transformation is applied to the data so that the conditions of the analysis of variance are met. It has been noted, however, that these transformations do not always produce accurate results, so we resort to nonparametric tests or methods that serve as a solution or alternative to the parametric tests; these methods…
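The two routes the summary contrasts can be sketched side by side: the parametric one-way ANOVA F statistic, and the rank-based Kruskal-Wallis H statistic, a standard nonparametric alternative when ANOVA's normality or homogeneity assumptions fail. This is a one-way sketch for illustration; the factorial (multi-factor) decomposition the paper analyzes follows the same sums-of-squares logic per effect. The sample groups are illustrative.

```python
import statistics

def anova_f(groups):
    """One-way ANOVA F = (between-group mean square) / (within-group mean square)."""
    all_obs = [x for g in groups for x in g]
    grand = statistics.fmean(all_obs)
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - statistics.fmean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H computed from rank sums (midranks for ties):
       H = 12 / (N(N+1)) * sum_i R_i^2 / n_i - 3(N+1)."""
    all_obs = sorted(x for g in groups for x in g)
    n = len(all_obs)
    def midrank(v):
        below = sum(1 for x in all_obs if x < v)
        ties = sum(1 for x in all_obs if x == v)
        return below + (ties + 1) / 2
    rank_sums = [sum(midrank(x) for x in g) for g in groups]
    return (12 / (n * (n + 1))
            * sum(r * r / len(g) for r, g in zip(rank_sums, groups))
            - 3 * (n + 1))
```

On well-separated groups both statistics are large; the F statistic is then referred to an F distribution and H to a chi-squared distribution with k − 1 degrees of freedom for the significance decision.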
Complexity is an inherent characteristic of contemporary organizations. It is characterized by the intertwining and expansion of their relations and by severe disorder and rapid change in their environment, which leaves them in a state of uncertainty when determining the direction of their future or assessing the rules governing their paths. All organizations tend to evolve with increasing sophistication and to take measures that help simplify the system as it moves toward complexity, allowing the administration to easily control its movement and directions. The problem of complexity in the university stems from the entanglement and overlap in the goals and processes betwe…
The research aims to clarify the COBIT5 framework for IT governance and to develop a criterion based on the Balanced Scorecard that contributes to measuring the performance of IT governance. To achieve these goals, the researchers adopted a deductive approach in designing a balanced scorecard to measure IT governance at the Bank of Baghdad, which was chosen because it relies heavily on IT.
The research reached a number of conclusions, the most important of which is that the performance of the IT department at the Bank of Baghdad falls within the good level, which requires constant monitoring. The Balanced Scorecard items the Bank satisfied most, in order, were the customer, internal operations, and growth items, and finally the financial item; IT…