Interval methods for the verified integration of initial value problems (IVPs) for ODEs have been in use for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems, the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz and his co-workers (see [1]) have developed Taylor model methods, which extend interval arithmetic with symbolic computations. This extension is an effective tool for reducing both the dependency problem and the wrapping effect. By construction, Taylor model methods appear particularly suitable for integrating nonlinear ODEs. In this paper, we analyze Taylor model based integration of ODEs and compare Taylor model methods with traditional enclosure methods for IVPs for ODEs. For clarity, we summarize the major steps of the naive Taylor model method as Algorithm 1; more advanced Taylor model integration methods are discussed afterwards.
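The dependency problem mentioned above can be shown with a minimal sketch of naive interval arithmetic (an illustrative toy class, not the Taylor model implementation of [1]): because interval subtraction treats both operands as independent, the expression x - x does not cancel.

```python
# Minimal interval arithmetic sketch illustrating the dependency problem
# (hypothetical helper class, not the Taylor model code of [1]).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Both operands are treated as independent, even when they are
        # the same variable -- this is the dependency problem.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def width(self):
        return self.hi - self.lo

x = Interval(0.0, 1.0)
d = x - x            # true range is {0}, but interval arithmetic gives [-1, 1]
print(d.lo, d.hi)    # -1.0 1.0
print(d.width())     # 2.0
```

A Taylor model, by carrying the symbolic polynomial part, cancels x - x exactly and avoids this overestimation.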
The printing designer's creative thinking is a deliberate mental process based on specific skills. These skills stimulate the student's motivation to learn, call for new information through investigation and research to discover problems and attitudes, and reformulate experience into new patterns. They depend on active imagination and flexible scientific thinking: providing the largest possible number of varied, unfamiliar printing design models, testing their suitability, and then readjusting the results, given a suitable educational, learning, and academic atmosphere.
The designer's creative thinking depends on several main skills. The fluency skill is the ability to put t
The current study aims to compare estimates of the Rasch model's parameters between missing and completed data under various methods of processing missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial ability, which consists of (20) items, for a group of (250) sixth scientific stage students in the Baghdad Education directorates of Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on a single-parameter model to analyze the data and used the BILOG-MG 3 software to check the hypotheses and data and match them with the model. In addition
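The single-parameter (Rasch) model used in such studies has a simple item response function: the probability of a correct answer depends only on the difference between person ability and item difficulty. A minimal sketch (illustrative only; the study's actual estimation was done in BILOG-MG):

```python
# Rasch (one-parameter IRT) model: probability that a person with
# ability theta answers an item of difficulty b correctly.
# Illustrative sketch, not the study's BILOG-MG estimation.

import math

def rasch_p(theta, b):
    return 1.0 / (1.0 + math.exp(-(theta - b)))

p_avg = rasch_p(0.0, 0.0)   # ability matches difficulty -> 0.5
p_able = rasch_p(1.0, 0.0)  # abler examinee, same item -> about 0.73
```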
An optimization calculation is made to find the optimum properties of a combined quadrupole lens consisting of an electrostatic and a magnetic lens. Both chromatic and spherical aberration coefficients are reduced to minimum values, and the achromatic condition is found for many cases. These calculations are achieved with the aid of the transfer matrices method, using a rectangular model of the field distribution, where the path of the charged-particle beam traversing the field has been determined by solving the trajectory equation of motion; the optical properties of the lens have then been computed from the beam trajectory along the lens axis. The computations have been concentrated on determining the chromatic and spher
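The transfer-matrix idea can be sketched with the textbook drift and thin-lens matrices (an illustrative simplification; the paper uses the rectangular field model of the combined lens, not a thin lens): a ray state (position, slope) is propagated by multiplying 2x2 element matrices.

```python
# Transfer-matrix method sketch for beam optics. Hypothetical lengths and
# focal length; the thin-lens matrix stands in for the paper's rectangular
# field model.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def drift(L):
    return [[1.0, L], [0.0, 1.0]]           # field-free drift of length L

def thin_lens(f):
    return [[1.0, 0.0], [-1.0 / f, 1.0]]    # focusing plane, focal length f

# Propagate through drift -> lens -> drift (rightmost matrix acts first).
M = matmul(drift(0.5), matmul(thin_lens(0.25), drift(0.25)))

x, xp = 1.0e-3, 0.0                          # 1 mm off-axis, parallel ray
x_out = M[0][0] * x + M[0][1] * xp
xp_out = M[1][0] * x + M[1][1] * xp
```

The lens bends the parallel ray toward the axis (slope -x/f = -4e-3), and the final drift carries it across the axis to x = -1 mm.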
It is commonly known that Euler-Bernoulli thin beam theory is not applicable wherever a nonlinear distribution of strain/stress occurs, such as in deep beams, or where the stress distribution is discontinuous. To design members experiencing such distorted stress regions, the Strut-and-Tie Model (STM) can be utilized. In this paper, an experimental investigation of the STM technique for three identical small-scale deep beams was conducted. The beams were simply supported and loaded statically with a concentrated load at mid-span. These deep beams had two symmetrical openings near the load application point. Both the deep beam, where the stress distribution cannot be assumed linear, and the ex
This research represents a practical attempt to calibrate and verify a hydraulic model for the Blue Nile River. The calibration procedures are performed using observed data for a previous period and comparing them with the calibration results, while the verification requirements are achieved by applying observed data for a later period and comparing them with the verification results. The study objective covered the relationship of the river terrain to the distance between the assumed points of dam failure along the river length. The computed model values and the observed data should conform to the theoretical analysis and the overall verification performance of the model by comparing it with anothe
Productivity estimation for a ready-mixed concrete batch plant is an essential tool for the successful completion of the construction process. Productivity is defined as the output of the system per unit of time. Usually, the actual productivity values of construction equipment on site are not consistent with the nominal ones. Therefore, it is necessary to make a comprehensive evaluation of the nominal productivity of the equipment with respect to the affecting factors and then re-evaluate it according to the actual values.
In this paper, the forecasting system employed is an artificial intelligence (AI) technique, represented by an artificial neural network (ANN), used to establish a prediction model for estimating wet ready-mixe
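The idea of fitting a predictive model from site observations can be sketched with a single linear neuron trained by stochastic gradient descent (a deliberately minimal stand-in: the paper's ANN has hidden layers, and the data below are hypothetical, not batch-plant measurements).

```python
# Minimal predictive-model sketch: one linear neuron fit by SGD.
# Hypothetical samples: (site factor, observed productivity in m^3/h),
# lying on the line y = 40 - 2x for the sake of illustration.

data = [(0.0, 40.0), (1.0, 38.0), (2.0, 36.0), (3.0, 34.0)]

w, b, lr = 0.0, 0.0, 0.05
for _ in range(5000):
    for x, y in data:
        err = (w * x + b) - y    # prediction error on this sample
        w -= lr * err * x        # gradient step for the weight
        b -= lr * err            # gradient step for the bias

predicted = w * 1.5 + b          # productivity estimate for factor 1.5
```

On this exactly linear toy data the neuron converges to w close to -2 and b close to 40; a real ANN adds hidden layers and nonlinear activations to capture the interactions among the affecting factors.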
The basic analytical formula for particle-hole state densities is derived based on the non-Equidistant Spacing Model (non-ESM) for the dependence of the single-particle level density (s.p.l.d.) on the particle excitation energy u. Two methods are illustrated in this work: the first depends on a Taylor series expansion of the s.p.l.d. about u, while the second uses a direct analytical derivation of the state-density formula. This treatment is applied to a system composed of one kind of fermion and to an uncorrected physical system. The important correction due to Pauli blocking is added to the present formula. Analytical comparisons with the standard formulae for the ESM are made, and it is shown that the solution reduces to the earlier formulae, providing m
Estimation and the selection of significant variables are crucial processes in semi-parametric modeling. At the beginning of the modeling process there are often many explanatory variables, and to avoid losing any that may be important, variable selection becomes necessary; it serves not only to simplify model complexity and interpretation but also prediction. In this research, semi-parametric methods (LASSO-MAVE, MAVE, and the proposed Adaptive LASSO-MAVE) were used for variable selection and estimation of the semi-parametric single index model (SSIM) at the same time.
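The variable-selection property of LASSO-type penalties comes from the soft-thresholding operator: coefficients whose magnitude falls below the penalty level are set exactly to zero. A minimal sketch (illustrative only; the paper couples this penalty with MAVE for the single index model):

```python
# Soft-thresholding operator at the core of LASSO-type selection:
# soft(z, lam) = sign(z) * max(|z| - lam, 0).
# Coefficients below the penalty lam are zeroed, i.e. the corresponding
# variables are dropped from the model. Hypothetical coefficient values.

def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

coeffs = [2.7, -0.3, 0.05, -1.6]
selected = [soft_threshold(c, 0.5) for c in coeffs]
print(selected)   # the two small coefficients become exactly 0.0
```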
The logistic regression model is one of the oldest and most common regression models. It is a statistical method used to describe and estimate the relationship between a dependent random variable and explanatory random variables. Several methods are used to estimate this model, including the bootstrap method, an estimation method based on the principle of sampling with replacement: a resample of (n) elements is drawn, randomly and with replacement, from the (N) original data points. It is a computational method used to determine the accuracy of statistical estimates, and for this reason it was used here to obtain more accurate estimates. The ma
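The resampling-with-replacement principle can be sketched for a simple statistic (the sample mean; hypothetical data). The same recipe applies to logistic regression coefficients: refit the model on each resample and examine the spread of the refitted coefficients.

```python
# Bootstrap sketch: draw n elements with replacement from the sample and
# recompute the statistic on each resample; the spread of the resampled
# statistics approximates its standard error. Data values are hypothetical.

import random

random.seed(42)
data = [4.1, 5.0, 3.8, 6.2, 5.5, 4.7, 5.9, 4.4]

def bootstrap_estimates(sample, stat, reps=2000):
    n = len(sample)
    return [stat([random.choice(sample) for _ in range(n)])
            for _ in range(reps)]

mean = lambda xs: sum(xs) / len(xs)
estimates = bootstrap_estimates(data, mean)

center = mean(estimates)
se = (sum((e - center) ** 2 for e in estimates) / len(estimates)) ** 0.5
```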