Interval methods for the verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz and his co-workers (see [1]) have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an effective tool for reducing both the dependency problem and the wrapping effect. By construction, Taylor model methods appear particularly suitable for integrating nonlinear ODEs. In this paper, we analyze Taylor model based integration of ODEs and compare Taylor model methods with traditional enclosure methods for IVPs for ODEs. More advanced Taylor model integration methods are discussed in [1]. For clarity, we summarize the major steps of the naive Taylor model method as Algorithm 1.
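To make the naive scheme concrete, here is a minimal Python sketch of one step of the traditional interval Taylor series method that Taylor model methods refine. It is not Berz's Taylor model arithmetic (which also propagates a multivariate polynomial part in the initial conditions); the example ODE x' = -x^2, the step size, and the order are assumptions chosen purely for illustration.

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __neg__(self):
        return Interval(-self.hi, -self.lo)
    def __mul__(self, o):
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))
    def scale(self, c):                 # scalar multiply, c >= 0
        return Interval(self.lo * c, self.hi * c)
    def hull(self, o):
        return Interval(min(self.lo, o.lo), max(self.hi, o.hi))
    def subset_of(self, o):
        return o.lo <= self.lo and self.hi <= o.hi

def f(x):                               # right-hand side: x' = -x^2
    return -(x * x)

def taylor_coeff(x, k):                 # closed form for this RHS: x_k = (-1)^k x^(k+1)
    c = x
    for _ in range(k):
        c = -(c * x)
    return c

def a_priori_enclosure(x0, h, tries=50):
    # Find X with x0 + [0,h]*f(X) contained in X (a crude Picard-type test).
    X = x0
    for _ in range(tries):
        Y = x0 + Interval(0.0, h) * f(X)
        if Y.subset_of(X):
            return X
        X = X.hull(Y)
        X = Interval(X.lo - 1e-9, X.hi + 1e-9)   # epsilon inflation
    raise RuntimeError("no a priori enclosure found; reduce the step size h")

def naive_step(x0, h, order=5):
    X = a_priori_enclosure(x0, h)       # encloses the solution on [t, t+h]
    enc, hk = Interval(0.0, 0.0), 1.0
    for k in range(order):              # truncated series evaluated on the box x0
        enc = enc + taylor_coeff(x0, k).scale(hk)
        hk *= h
    return enc + taylor_coeff(X, order).scale(hk)   # Lagrange remainder term

x = Interval(0.9, 1.1)                  # interval initial condition
for _ in range(5):
    x = naive_step(x, h=0.05)
print(f"enclosure of x(0.25): [{x.lo:.6f}, {x.hi:.6f}]")

Evaluating the Taylor coefficients over the whole box x0 is where dependency-driven overestimation enters this naive method; Taylor model methods reduce it by keeping that dependence on the initial condition symbolic.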
Abstract
Political thought in the last two centuries has studied social classes to an unprecedented degree. Class analysis, concerned with defining classes, locating them in the social hierarchy, and characterizing the relations of conflict and harmony among their various strata and groups, has become the central subject matter and the most important topic in studies of political and social thought. Among these classes, the middle class has occupied a...
The optical absorption data of hydrogenated amorphous silicon was analyzed using the Dunstan model of optical absorption in amorphous semiconductors. This model introduces disorder into the band-to-band absorption through a linear exponential distribution of local energy gaps, and it accounts for both the Urbach and Tauc regions of the optical absorption edge. Compared with other models built on similar foundations, such as the O'Leary and Guerra models, it is mathematically simpler and has a clear physical interpretation. The optical absorption data of Jackson et al. and Maurer et al. were successfully interpreted using Dunstan's model. Useful physical parameters are extracted, especially the band-to-band energy gap, which is the energy gap in the a...
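As a hedged illustration of the model's structure (not a fit to the Jackson or Maurer data), the sketch below averages a Tauc-like local band-to-band edge over an exponential distribution of local gaps; the functional form of the local edge and the parameter values E0, dE, C are assumptions chosen for demonstration.

import numpy as np

E0, dE, C = 1.75, 0.05, 1.0        # assumed mean gap (eV), disorder width (eV), scale

def alpha_local(E, Eg):
    # Local band-to-band absorption in a Tauc-like form: alpha * E ~ (E - Eg)^2.
    return np.where(E > Eg, C * (E - Eg) ** 2 / E, 0.0)

def alpha_dunstan(E, n=4000):
    # Average the local edge over an exponential distribution of local gaps
    # P(Eg) = exp(-(E0 - Eg)/dE) / dE for Eg <= E0 (tail truncated at 12*dE).
    Eg = np.linspace(E0 - 12 * dE, E0, n)
    P = np.exp(-(E0 - Eg) / dE) / dE
    dEg = Eg[1] - Eg[0]
    return (alpha_local(E[:, None], Eg[None, :]) * P).sum(axis=1) * dEg

E = np.linspace(1.3, 2.2, 200)     # photon energies (eV)
alpha = alpha_dunstan(E)
# Below E0 the computed edge decays exponentially (the Urbach region);
# well above E0 it approaches the unbroadened Tauc law.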
The aim of this study was to determine the effect of using the McCarthy Model (4MAT) on developing creative writing skills and reflective thinking among undergraduate students. A quasi-experimental approach was adopted. To achieve the study objective, the educational content of the Teaching Ethics course (401) from the primary-grades teacher preparation program plan was taught through a teaching program based on the McCarthy Model (4MAT).
The study instruments were an academic achievement test of creative writing skills and a reflective thinking test, whose validity and reliability were confirmed. The study was applied to a sample consisting of...
The logistic regression model is an important statistical model that describes the relationship between a binary response variable and explanatory variables. The large number of explanatory variables usually used to explain the response gives rise to the problem of multicollinearity among them, which makes the estimation of the model parameters inaccurate.
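One standard remedy, shown here as a generic illustration rather than as the estimator used in this paper, is to stabilize the estimates with an L2 (ridge) penalty. A minimal sketch with a deliberately near-collinear design:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)        # nearly collinear with x1
X = np.column_stack([x1, x2])
p = 1 / (1 + np.exp(-(1.0 * x1 + 1.0 * x2)))    # assumed true model
y = rng.binomial(1, p)

# With a very large C (almost no penalty) the x1/x2 coefficients are
# ill-determined; a moderate penalty shrinks them toward a stable split.
for C in (1e6, 1.0):
    fit = LogisticRegression(C=C, penalty="l2").fit(X, y)
    print(C, fit.coef_.round(2))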
The fast evolution of cyberattacks in the Internet of Things (IoT) domain presents new security challenges concerning zero-day (ZD) attacks, owing to the growth in both the number and the diversity of new cyberattacks. Furthermore, Intrusion Detection Systems (IDSs) that rely on historical or signature-based datasets often perform poorly in ZD detection. A new technique for detecting ZD attacks in IoT based on Conventional Spiking Neural Networks (CSNN), termed ZD-CSNN, is proposed. The model comprises three key levels: (1) data pre-processing, in which a thorough cleaning process is applied to the CIC IoT Dataset 2023, which contains both malicious and t...
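A minimal sketch of the kind of cleaning the first level describes, assuming a CSV export of the dataset; the file name and the label column are hypothetical, and the paper's exact pipeline is not reproduced here.

import pandas as pd
from sklearn.preprocessing import MinMaxScaler

df = pd.read_csv("ciciot2023.csv")             # hypothetical file name
df = df.drop_duplicates().dropna()             # remove duplicate and incomplete rows
y = (df["label"] != "Benign").astype(int)      # hypothetical label column: 1 = attack
X = df.drop(columns=["label"]).select_dtypes("number")
X = pd.DataFrame(MinMaxScaler().fit_transform(X), columns=X.columns)
# X, y are now ready for the spiking-network stage (not sketched here).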
In this paper, estimates are derived for the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods (the White, percentile, least squares, weighted least squares, and modified moment methods) together with a proposed new technique. A simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) over real values of the parameters, with sample sizes (n = 10, 25, 50, 100), N = 1000 replications per sample, and reliability times t > 0. The estimators were compared using the mean square error (MSE). The results showed the...
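A hedged sketch of one simulation cell, assuming the usual transmuted-family form F(x) = (1 + lam) * G(x) - lam * G(x)^2 with power-function base G(x) = (x/beta)^a (the paper's exact parameterization may differ), scoring the least-squares estimator by MSE:

import numpy as np
from scipy.optimize import minimize

def tpf_cdf(x, a, lam, beta=1.0):
    g = (x / beta) ** a
    return (1 + lam) * g - lam * g ** 2

def tpf_sample(n, a, lam, beta=1.0, rng=np.random.default_rng(0)):
    # Inverse transform: solve lam*q^2 - (1+lam)*q + u = 0 for q = G(x), lam != 0.
    u = rng.uniform(size=n)
    q = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return beta * q ** (1 / a)

def ls_estimate(x):
    # Least squares fit of the CDF to plotting positions i/(n+1).
    x = np.sort(x)
    p = np.arange(1, len(x) + 1) / (len(x) + 1)
    loss = lambda th: np.sum((tpf_cdf(x, th[0], th[1]) - p) ** 2)
    return minimize(loss, x0=[1.0, 0.1], bounds=[(0.05, 20), (-1, 1)]).x

a_true, lam_true, N, n = 2.0, 0.5, 1000, 50     # illustrative parameter values
est = np.array([ls_estimate(tpf_sample(n, a_true, lam_true)) for _ in range(N)])
mse = ((est - [a_true, lam_true]) ** 2).mean(axis=0)
print("MSE(alpha), MSE(lambda):", mse)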
This paper presents a simulation of the Linguistic Fuzzy Trust Model (LFTM) over oscillating Wireless Sensor Networks (WSNs), in which the goodness of the servers belonging to the network can change over time. The outcomes achieved by the LFTM over oscillating WSNs are compared with those obtained by applying the model over static WSNs, where the servers always maintain the same goodness, in terms of the selection percentage of trustworthy servers (the accuracy of the model) and the average path length. The paper also compares the LFTM with the Bio-inspired Trust and Reputation Model for Wireless Sensor Network...
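As a toy illustration of the two reported metrics only (this is my construction, not the LFTM selection algorithm), the following simulates noisy trust-based server selection in an oscillating network and computes the selection accuracy and the average path length:

import random

random.seed(1)
trials, hits, hops = 10_000, 0, 0
for _ in range(trials):
    # Oscillating WSN: each server's goodness can change between trials.
    servers = [(random.random() < 0.6, random.randint(1, 6))   # (trustworthy?, path length)
               for _ in range(20)]
    chosen = max(servers, key=lambda s: s[0] + random.gauss(0, 0.3))  # noisy trust score
    hits += chosen[0]
    hops += chosen[1]
print("selection accuracy:", hits / trials, "avg path length:", hops / trials)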
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The mechanism generating such probabilistic variations in economic models may be incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; or the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
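For the case the research addresses, a random right-hand side b_i, the classical chance-constrained treatment replaces P(a_i^T x <= b_i) >= alpha by a deterministic equivalent; the sketch below assumes a normally distributed b_i and illustrative numbers, not data from this research.

from scipy.stats import norm
from scipy.optimize import linprog

alpha = 0.95
mu, sigma = 100.0, 10.0                    # assumed b ~ N(100, 10^2)
# P(a^T x <= b) >= alpha  <=>  a^T x <= mu + sigma * Phi^{-1}(1 - alpha)
rhs = mu + sigma * norm.ppf(1 - alpha)     # tightened RHS, about 83.55

# Illustrative LP: maximize 3*x1 + 2*x2 subject to 2*x1 + x2 <= rhs, 0 <= x <= 40.
res = linprog(c=[-3, -2], A_ub=[[2, 1]], b_ub=[rhs], bounds=[(0, 40), (0, 40)])
print("x* =", res.x, "objective =", -res.fun)

The tightening from mu to rhs is the price of the reliability level alpha: the higher the required probability, the smaller the feasible region of the deterministic equivalent.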