Average interstellar extinction curves for the Galaxy and the Large Magellanic Cloud (LMC) over the wavelength range 1100 Å – 3200 Å were obtained from observations with the IUE satellite. The two extinction curves, for our Galaxy and for the LMC, are normalized to A_V = 0 and E(B-V) = 1 to meet standard criteria. It is found that the differences between the two extinction curves appear clearly in the middle- and far-ultraviolet regions, due to the presence of different populations of small grains, which contribute very little at longer wavelengths. Using new IUE reduction techniques leads to more accurate results.
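The normalization mentioned above (curve pinned to 0 at V and 1 at B, i.e. expressed per unit colour excess) can be sketched numerically. This is a minimal illustration with hypothetical extinction values, not the paper's data; the wavelengths and magnitudes are invented for demonstration.

```python
import numpy as np

# Hypothetical total extinction A(lambda) in magnitudes at a few wavelengths.
# The last two entries are the B (4400 Å) and V (5500 Å) bands.
wavelengths = np.array([1100.0, 1500.0, 2200.0, 3200.0, 4400.0, 5500.0])  # Å
A_lambda = np.array([8.1, 6.5, 8.7, 5.0, 4.1, 3.1])

A_B, A_V = A_lambda[-2], A_lambda[-1]
E_BV = A_B - A_V                      # colour excess E(B-V)

# Normalized curve k(lambda) = E(lambda - V) / E(B - V):
# by construction k(V) = 0 and k(B) = 1, the standard criteria.
k = (A_lambda - A_V) / E_BV
```

With this convention, any two sightlines (Galactic or LMC) can be overplotted on a common scale, which is what makes the mid- and far-UV differences directly comparable.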
Multicollinearity is one of the most common problems in regression analysis; it concerns, to a large extent, the internal correlation between explanatory variables. The problem appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as an inflated variance and unstable parameter estimates, when the Ordinary Least Squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the Ridge Regression method and the Liu-type estimator. The negative binomial regression model is a nonline
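The core idea behind the ridge estimator named in this abstract can be sketched in its simplest (linear-model) form: adding k·I to X'X before inversion shrinks and stabilizes the coefficients that OLS leaves wildly variable under collinearity. This is an illustrative sketch with simulated data, not the paper's negative binomial implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two nearly identical (collinear) predictors -- hypothetical data.
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # almost a copy of x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(size=n)

def ridge(X, y, k):
    """Ridge estimator: (X'X + k I)^{-1} X'y; k = 0 recovers OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

beta_ols = ridge(X, y, 0.0)    # unstable: collinearity inflates variance
beta_ridge = ridge(X, y, 1.0)  # k > 0 shrinks the solution toward zero
```

The Liu-type estimator works on the same principle, with a second parameter that trades shrinkage against bias; for negative binomial models the same correction is applied inside the iteratively reweighted likelihood equations rather than to X'X directly.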
This study compares the results of some passing and dribbling basketball tests, in two different years, between teams of selected young players in Baghdad. Statistical methods were used, namely the arithmetic mean, the standard deviation, and the t-test for independent samples. After careful statistical treatment, it was found that there were significant or non-significant differences in the final results of the chest pass, high dribble, and cross-over dribble. The clubs were Al-Khark, Air Defence, Police, and Al-Adamiyah, each one separate from the other, for the year 2000-2001. Many findings were reached, such as the lack of objective evaluation (periodical tests) between one sport season and the next. In the light
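The statistical procedure described (means, standard deviations, and an independent-samples t-test across two seasons) can be sketched as follows. The scores below are invented for illustration only; they are not the study's data.

```python
import numpy as np

# Hypothetical chest-pass scores for one club in two seasons.
season_2000 = np.array([12.0, 15, 14, 10, 13, 16, 11, 14])
season_2001 = np.array([14.0, 17, 15, 13, 16, 18, 12, 15])

mean1, mean2 = season_2000.mean(), season_2001.mean()
var1, var2 = season_2000.var(ddof=1), season_2001.var(ddof=1)
n1, n2 = len(season_2000), len(season_2001)

# Pooled-variance t-test for two independent samples.
sp2 = ((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2)
t_stat = (mean1 - mean2) / np.sqrt(sp2 * (1 / n1 + 1 / n2))

# Two-tailed critical value t(0.05, df = n1 + n2 - 2 = 14) is about 2.145.
significant = abs(t_stat) > 2.145
```

A |t| below the critical value is reported as a non-significant difference between the two seasons; above it, significant.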
An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loadings. The analytical predictions were compared with field data obtained from a prototype test pile used at the Tharthar–Tigris canal bridge and were found to be in acceptable agreement, with a 12% deviation.
Following ASTM standard D1143/D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, and a series of cyclic loads was applied to simulate horizontal loading. The load test results and analytical data of 1.95
Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimati
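The low-rank construction described can be sketched as follows: eigendecompose the adjacency matrix of the areal units and keep only the q leading eigenvectors as a basis, so the n-dimensional spatial effect is parameterized by q coefficients with q much smaller than n. This is a generic illustration on a small lattice, not the article's exact estimator.

```python
import numpy as np

# Hypothetical 5x5 lattice of areal units: rook-adjacency matrix A.
side = 5
n = side * side
A = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        u = i * side + j
        if i + 1 < side:
            A[u, u + side] = A[u + side, u] = 1.0   # vertical neighbour
        if j + 1 < side:
            A[u, u + 1] = A[u + 1, u] = 1.0         # horizontal neighbour

# Spectral (eigen) decomposition of the symmetric adjacency matrix.
eigvals, eigvecs = np.linalg.eigh(A)

# Low-rank basis: the q eigenvectors with the largest eigenvalues capture
# the smoothest spatial patterns; model the spatial effect as phi = M @ delta.
q = 5
order = np.argsort(eigvals)[::-1]
M = eigvecs[:, order[:q]]   # n x q basis, q << n
```

Sampling the q-vector delta jointly replaces the one-by-one updates of the n full conditionals, which is where the mixing and computational gains come from.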
To accommodate utilities in buildings, openings of different sizes are provided in the webs of reinforced concrete deep beams, which reduce the beams' strength and stiffness. This paper aims to investigate, experimentally and numerically, the effectiveness of using carbon fiber reinforced polymer (CFRP) strips as a strengthening technique to externally strengthen reinforced concrete continuous deep beams (RCCDBs) with large openings. The experimental work included testing three RCCDBs under five-point bending. A reference specimen was prepared without openings to explore the reductions in strength and stiffness after providing large openings. Openings were created symmetrically at the center of the spans of the other specimens
Abstract:
Cointegration is one of the important concepts in macroeconomic applications. The idea of cointegration is due to Granger (1981), and it was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-eighties of the last century is one of the most important developments in the empirical approach to modeling; its advantage is that it is simple to compute, and its use requires only familiarity with ordinary least squares.
Cointegration captures equilibrium relations among time series in the long run, even if all the sequences contain t
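The "only ordinary least squares" point can be illustrated with the first step of the Engle–Granger procedure: regress one nonstationary series on another, and if they are cointegrated the residuals are stationary even though each series wanders. The simulated series below are illustrative, not drawn from the article.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500

# Common stochastic trend (random walk) shared by both series.
trend = np.cumsum(rng.normal(size=T))
x = trend + rng.normal(scale=0.5, size=T)
y = 2.0 * trend + rng.normal(scale=0.5, size=T)   # cointegrated with x

# Engle-Granger step 1: estimate the long-run relation by OLS.
X = np.column_stack([np.ones(T), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
residuals = y - X @ beta

# Informal check: the cointegrating residuals are stationary, so their
# spread is far smaller than that of the trending series themselves.
```

In practice step 2 applies a unit-root (e.g. augmented Dickey–Fuller) test to these residuals; the sketch stops at the OLS estimate to keep the example self-contained.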
In this study, simple, low-cost, precise, and rapid spectrophotometric methods developed for the evaluation of sulfacetamide sodium are described. The primary approach involves conversion of sulfacetamide sodium to a diazonium salt, followed by a reaction with p-cresol as a reagent in alkaline media. The colored product is orange, with maximum absorbance at λmax = 450 nm. Over the concentration range 5.0-100 µg.mL-1, Beer's Law is obeyed, with a correlation coefficient R2 = 0.9996, a limit of detection of 0.2142 µg.mL-1, a limit of quantification of 0.707 µg.mL-1, and a molar absorptivity of 1488.249 L.mol-1.cm-1. The other approach, cloud point extraction w
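The figures of merit quoted above (R², LOD, LOQ) all come from a least-squares calibration line in the Beer's-law region. A minimal sketch, with invented calibration points standing in for the study's measurements, and the common ICH-style 3.3σ/slope and 10σ/slope limit formulas (the paper may use a different convention):

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs absorbance at 450 nm.
conc = np.array([5.0, 10.0, 25.0, 50.0, 75.0, 100.0])
absorbance = np.array([0.052, 0.101, 0.248, 0.503, 0.749, 1.002])

# Least-squares line A = slope * C + intercept (Beer's law).
slope, intercept = np.polyfit(conc, absorbance, 1)

# Coefficient of determination R^2 of the calibration.
pred = slope * conc + intercept
ss_res = np.sum((absorbance - pred) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# Detection and quantification limits from the residual scatter.
sd = np.sqrt(ss_res / (len(conc) - 2))   # standard error of the fit
lod = 3.3 * sd / slope
loq = 10 * sd / slope
```

An R² near 1 over the stated range is what "Beer's Law is obeyed" means operationally; the LOD/LOQ scale inversely with the slope (sensitivity) of the line.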
The evolution of the Internet of Things (IoT) has led to connecting billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, there is a need to store huge amounts of data, requiring large storage and high computational capabilities. Cloud computing can be used to store big data. The data of IoT devices are transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
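The load-balancing idea can be sketched with the simplest policy, round-robin dispatch of incoming IoT messages across cloud nodes. The node names and messages below are hypothetical, and the paper's actual balancing algorithm may differ; this only illustrates the even-distribution goal.

```python
from itertools import cycle

def round_robin_dispatch(messages, nodes):
    """Assign each incoming message to the next node in rotation,
    so no individual node is overloaded."""
    assignment = {node: [] for node in nodes}
    ring = cycle(nodes)
    for msg in messages:
        assignment[next(ring)].append(msg)
    return assignment

# Hypothetical cloud nodes and a batch of sensor messages (e.g. via MQTT).
nodes = ["node-a", "node-b", "node-c"]
messages = [f"sensor-{i}" for i in range(9)]
plan = round_robin_dispatch(messages, nodes)
```

With 9 messages and 3 nodes, each node receives exactly 3; dynamic schemes extend this by weighting the rotation with each node's current load.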
The need for cloud services has risen globally to provide a platform for healthcare providers to efficiently manage their citizens' health records and thus provide treatment remotely. In Iraq, the healthcare records of public hospitals are increasing progressively under poor digital management. While recent works point to cloud computing as a platform for all sectors globally, a lack of empirical evidence demands a comprehensive investigation to identify the significant factors that influence the utilization of cloud health computing. Here we provide a cost-effective, modular, and computationally efficient model of utilizing cloud computing based on organization theory and the theory of reasoned action. A tot