The ultimate goal of any sales contract is to maximize the combined returns of the parties, bearing in mind that in long-term contracts these returns are not realized until the final stages of the contract. This requires the parties to leave some elements of the contract open, including the price, because a fixed and inflexible price will not serve their interests at the time of contracting, especially given their ignorance of circumstances beyond their control that may affect market conditions; moreover, the possibility of modifying a fixed price through the courts is very limited, particularly when the parties are of equal economic strength. Hence, in order to respond to market uncertainties, the parties are allowed to combine fixed and open terms in the contract: the fixed terms provide certainty, while the flexible terms allow the contract to cope with unexpected market fluctuations. In such a combination of terms, the seller and the buyer leave a set of elements of the contract open, for example price, time, or quantity, to preserve the functional, commercial nature of the transaction.
This paper compares the tree regression model with the negative binomial regression model. These models represent two types of statistical methods: the first is nonparametric, tree regression, which aims to partition the data set into subgroups; the second is parametric, negative binomial regression, which is commonly used for medical data, especially with large sample sizes. The methods were compared according to the mean squared error (MSE) using a simulation experiment with different sample sizes.
In this paper, we study a nonparametric model when the response variable has missing observations (nonresponse) under the MCAR missing-data mechanism. We then suggest kernel-based nonparametric single imputation in place of the missing values and compare it with nearest neighbor imputation through simulation over several models and cases, varying the sample size, the variance, and the rate of missing data.
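The two imputation methods can be sketched as follows. This sketch assumes a Nadaraya-Watson form for the kernel-based single imputation and a simple one-covariate nearest neighbor donor; the bandwidth, missingness rate, and data model are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0, 10, n)
y = np.sin(x) + rng.normal(0, 0.2, n)

# MCAR: each response is dropped completely at random with probability 0.2
miss = rng.random(n) < 0.2
obs = ~miss

def kernel_impute(x0, h=0.8):
    """Nadaraya-Watson (Gaussian kernel) estimate built from observed pairs."""
    w = np.exp(-0.5 * ((x0 - x[obs]) / h) ** 2)
    return np.sum(w * y[obs]) / np.sum(w)

def nn_impute(x0):
    """Donate the response of the nearest observed covariate value."""
    return y[obs][np.argmin(np.abs(x0 - x[obs]))]

y_kernel, y_nn = y.copy(), y.copy()
for i in np.where(miss)[0]:
    y_kernel[i] = kernel_impute(x[i])
    y_nn[i] = nn_impute(x[i])

# In a simulation the true values are known, so the imputations can be scored
mse_kernel = np.mean((y_kernel[miss] - y[miss]) ** 2)
mse_nn = np.mean((y_nn[miss] - y[miss]) ** 2)
```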
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we may encounter in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the other methods.
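Mixture regression is typically fit by an EM algorithm that alternates soft membership assignment with weighted least squares per component. A minimal NumPy sketch for two linear components, with entirely made-up data and starting values, might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
x = rng.uniform(0, 10, n)
z = rng.integers(0, 2, n)                         # latent component labels
y = np.where(z == 0, 1 + 2 * x, 8 - x) + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), x])
beta = np.array([[0.0, 1.0], [5.0, -0.5]])        # initial coefficients per component
sigma = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

for _ in range(200):
    # E-step: posterior responsibility of each component for each observation
    dens = np.stack([
        pi[k] * np.exp(-0.5 * ((y - X @ beta[k]) / sigma[k]) ** 2) / sigma[k]
        for k in range(2)
    ], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted least squares within each component
    for k in range(2):
        w = r[:, k]
        Xw = X * w[:, None]
        beta[k] = np.linalg.solve(X.T @ Xw, Xw.T @ y)
        resid = y - X @ beta[k]
        sigma[k] = np.sqrt(np.sum(w * resid ** 2) / np.sum(w))
    pi = r.mean(axis=0)

labels = np.argmax(r, axis=1)   # inferred observation membership
```

The recovered slopes (about 2 and -1 here) and the `labels` vector correspond to the component-parameter estimates and the membership inference assessed in the paper.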
The research aims to estimate the reliability function of the two-parameter Weibull distribution using parametric methods (NWLSM, RRXM, RRYM, MOM, MLM), as well as nonparametric methods (EM, PLEM, EKMEM, WEKM, MKMM, WMR, MMO, MMT). Simulation was used for the comparison, with different sample sizes (20, 40, 60, 80, 100), to identify the best estimation methods according to the statistical indicator of integrated mean squared error (IMSE). The research concluded that
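One of the listed parametric methods, maximum likelihood (MLM), can be sketched with SciPy as follows. The true parameter values, sample size, and evaluation grid are illustrative assumptions; the integrated squared error computed at the end is the per-replication ingredient of an IMSE comparison.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
shape_true, scale_true = 1.5, 2.0
data = weibull_min.rvs(shape_true, scale=scale_true, size=100, random_state=rng)

# Maximum likelihood fit with location fixed at zero (two-parameter Weibull)
shape_hat, _, scale_hat = weibull_min.fit(data, floc=0)

def reliability(t, c, lam):
    """R(t) = exp(-(t/lambda)^c): probability of surviving past time t."""
    return np.exp(-(t / lam) ** c)

t_grid = np.linspace(0.1, 5, 50)
err = (reliability(t_grid, shape_hat, scale_hat)
       - reliability(t_grid, shape_true, scale_true)) ** 2
ise = err.mean() * (t_grid[-1] - t_grid[0])   # crude integrated squared error
```

Averaging `ise` over many replications at each sample size yields the IMSE used to rank the methods.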
This is a study of epistemology that addresses the most essential questions associated with human knowledge: its sources and its pathways. It presents and criticizes the empiricist doctrine represented by John Locke, David Hume, and others, as well as the contemporary scientistic doctrine, characterized by its glorification of the powers of science and represented by contemporary scientists such as Stephen Hawking and a few others. The points the two doctrines (scientism and empiricism) share, and those on which they differ, are clarified for the reader; finally, we briefly spotlight the basics of the Islamic vision of knowledge.
The bank faces many transformations in the course of its work, notably its transformation from an industrial bank seeking to achieve industrial development, by granting development loans and facilities with state support, into a universal bank seeking profitability through the diversification of its activities, services, and credit operations. The research aims to study the transformations that took place in the Industrial Bank and the effect of this transformation on its credit activity. It rests on one main hypothesis:
In 2020, one of the researchers in this paper, in his first research, derived the Modified Weighted Pareto Distribution of Type I using the Azzalini method for weighted distributions; it contains three parameters, two for scale and the third for shape. This research compared that distribution with two other distributions from the same family, the Standard Pareto Distribution of Type I and the Generalized Pareto Distribution, using the maximum likelihood estimator derived by the researchers for the Modified Weighted Pareto Distribution of Type I. The Monte Carlo method, one of the simulation approaches for generating random-sample data, was then used with different sizes (n = 10, 30, 50), and in di
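For the Standard Pareto Distribution of Type I, the maximum likelihood estimators have a well-known closed form, which a Monte Carlo sketch can use directly. The parameter values and sample size below are illustrative, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true, xm_true = 3.0, 1.5               # shape and scale of Pareto Type I
n = 1000
# rng.pareto draws X - 1 with xm = 1, so shift and rescale to get Pareto Type I
data = xm_true * (1 + rng.pareto(alpha_true, n))

# Closed-form maximum likelihood estimators for Pareto Type I
xm_hat = data.min()                          # MLE of the scale is the sample minimum
alpha_hat = n / np.sum(np.log(data / xm_hat))
```

Repeating this generation-and-estimation step over many samples at each size (e.g. n = 10, 30, 50) gives the Monte Carlo distribution of the estimators.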
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better of them using simulation, with different sample sizes (n = 25, 50, 100) and r = 1000 replications. The Matlab program was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion and also according to the Akaike information criterion (AIC) for the same distribution.
In this paper, the research represents an attempt to expand the use of parametric and nonparametric estimators to estimate the median effective dose (ED50) in quantal bioassay and to compare these methods. We chose three estimators for comparison: the first is the Spearman-Karber estimator, the second is the moving average estimator, and the third is the extreme effective dose estimator. We used minimum chi-square as a parametric method. We compared these estimators by calculating the mean squared error of ED50 for each of them and comparing it with the optimal mean squared error.
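The first of these, the Spearman-Karber estimator, is simple enough to sketch directly: with response proportions rising monotonically from 0 to 1 across the dose levels, ED50 is a weighted average of the interval midpoints. The dose levels and proportions below are hypothetical.

```python
import numpy as np

def spearman_karber_ed50(doses, p):
    """Spearman-Karber ED50 for monotone response proportions p
    rising from 0 at the lowest dose to 1 at the highest dose."""
    doses = np.asarray(doses, float)
    p = np.asarray(p, float)
    midpoints = (doses[:-1] + doses[1:]) / 2
    # Each interval midpoint is weighted by the jump in response proportion
    return np.sum(np.diff(p) * midpoints)

doses = [1, 2, 3, 4, 5]
p = [0.0, 0.2, 0.5, 0.8, 1.0]             # hypothetical dose-response data
ed50 = spearman_karber_ed50(doses, p)     # symmetric curve, so ED50 = 3.0
```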
The repeated measurement design is called a completely randomized block design for repeated measurements when the subject is given all of the different treatments; in this case the subject is considered a block. Several nonparametric methods are considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner and Robinson test (1988), when the assumption of normally distributed data is not satisfied, as well as the F test when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods, as well as to present the suggested method.
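The Friedman test, the first of the nonparametric methods mentioned, is available in SciPy and can be sketched on simulated repeated measurements. The block count, treatment effects, and noise level below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(9)
blocks = 15
base = rng.normal(0, 1, blocks)            # subject (block) effects

# Three repeated treatments per subject, with a clear treatment effect
t1 = base + rng.normal(0, 0.5, blocks)
t2 = base + 2 + rng.normal(0, 0.5, blocks)
t3 = base + 4 + rng.normal(0, 0.5, blocks)

# Friedman ranks the treatments within each block, so block effects cancel
stat, pvalue = friedmanchisquare(t1, t2, t3)
```

In a simulation study, repeating this with and without a treatment effect estimates each test's power and size for the comparison described above.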