This study is, to our knowledge, unique in its field in that it combines three branches of technology: photometry, spectroscopy, and image processing. The image is treated pixel by pixel according to color, where each color corresponds to a specific wavelength on the RGB scale; any image therefore yields many wavelengths, one from each of its pixels. The results identify the elements on the surface of a comet's nucleus, and not only list them but also map them across the nucleus. The work considered 12 elements in two comets (Tempel 1 and 67P/Churyumov-Gerasimenko). These elements have strong emission lines in the visible range, which were recognized by our MATLAB program during the treatment of the image. The abundance of each element was determined relative to iron: in comet Tempel 1, the largest element-to-iron ratio is for potassium, K/Fe ~ 28.2%, while the lowest value is Ca/Fe ~ 1.3%. For comet 67P/Churyumov-Gerasimenko, the largest ratio relative to iron is again for potassium, K/Fe ~ 89.5%, while the lowest value is Ni/Fe ~ 0.26%. In general, comparing both comets, the greatest element-to-iron ratio is K/Fe; iron is the base element in the structure of both comets, followed by potassium.
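A highly simplified sketch of the per-pixel treatment described above (written in Python rather than the authors' MATLAB, with a crude hue-to-wavelength mapping and a placeholder line list that are illustrative assumptions only) might look like this:

```python
import colorsys
from collections import Counter

# Placeholder line list (nm); a real analysis would use catalogued emission lines.
EMISSION_LINES_NM = {"Fe": 527.0, "Ca": 422.7, "Na": 589.0, "Mg": 518.4}

def pixel_to_wavelength(r, g, b):
    """Crudely map an RGB pixel (0-255) to an approximate wavelength in nm via its hue."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = min(h * 360.0, 270.0)            # restrict to the red-to-violet hue arc
    return 700.0 - (hue_deg / 270.0) * 300.0   # 0 deg -> ~700 nm, 270 deg -> ~400 nm

def classify_pixels(pixels, tol_nm=5.0):
    """Count pixels whose inferred wavelength lies near one of the listed lines."""
    counts = Counter()
    for r, g, b in pixels:
        wl = pixel_to_wavelength(r, g, b)
        element, line = min(EMISSION_LINES_NM.items(), key=lambda kv: abs(kv[1] - wl))
        if abs(line - wl) <= tol_nm:
            counts[element] += 1
    return counts

# Toy usage: element abundances expressed relative to iron, as in the abstract.
counts = classify_pixels([(200, 180, 40), (90, 200, 120), (60, 80, 210)])
if counts.get("Fe"):
    ratios = {el: 100.0 * n / counts["Fe"] for el, n in counts.items()}
```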
Semiparametric methods combine parametric and nonparametric methods. They are important in many studies whose nature calls for a more refined procedure of accurate statistical analysis aimed at obtaining efficient estimators. The partial linear regression model is considered the most popular type of semiparametric model; it consists of a parametric component and a nonparametric component. Estimating the parametric component so that it has certain properties depends on the assumptions made about that component; in the absence of these assumptions, the parametric component will suffer from several problems, for example multicollinearity (the explanatory variables are interrelated with each other). To treat this problem we use …
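For reference, the partially linear regression model mentioned above is conventionally written as follows (a standard textbook formulation, not a detail taken from this abstract):

\[ y_i = x_i^{\top}\beta + g(t_i) + \varepsilon_i, \qquad i = 1, \dots, n, \]

where \(\beta\) is the parametric component, \(g(\cdot)\) is an unknown smooth function forming the nonparametric component, and \(\varepsilon_i\) is a zero-mean error term. Multicollinearity concerns the columns of the design matrix formed by the \(x_i\).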
Learning programming is among the top challenges in computer science education. As part of addressing this, program visualization (PV) is used as a tool to overcome the high failure and drop-out rates in introductory programming courses. Nevertheless, there are rising concerns about the effectiveness of the existing PV tools, following the mixed results derived from various studies. Student engagement is also considered a vital factor in building a successful PV, and it is an important part of the learning process in general. Several techniques have been introduced to enhance PV engagement; however, student engagement with PV is still challenging. This paper employed three theories (constructivism, social constructivism and cognitive load theory) …
A Stereomicroscopic Evaluation of Four Endodontic Sealers Penetration into Artificial Lateral Canals Using Gutta-Percha Single Cone Obturation Technique, Omar Jihad Banawi*, Raghad …
The introduction of concrete damage plasticity material models has significantly improved the accuracy with which the structural response of concrete elements can be predicted. Research into this method's accuracy in analyzing complex concrete forms has been limited. A damage model combined with a plasticity model, based on continuum damage mechanics, is recommended for effectively predicting and simulating concrete behaviour. Damage parameters, such as compressive and tensile damage, can be defined to accurately simulate concrete behaviour in a damaged-plasticity model. This research aims to propose an analytical model for assessing concrete compressive damage based on stiffness deterioration. The proposed …
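As background, a standard continuum damage mechanics relation (not the analytical model proposed in this abstract) links the scalar damage variable to stiffness deterioration:

\[ \sigma = (1 - d)\,E_0\,(\varepsilon - \varepsilon^{pl}), \qquad d = 1 - \frac{E}{E_0}, \qquad 0 \le d \le 1, \]

where \(E_0\) is the undamaged elastic modulus, \(E\) the degraded (unloading) modulus, and \(\varepsilon^{pl}\) the plastic strain; \(d = 0\) for intact material and \(d \to 1\) at full damage. Separate scalars \(d_c\) and \(d_t\) are commonly used for compressive and tensile damage.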
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an effective …
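For context, the dependency problem mentioned above can be shown with a few lines of naive interval arithmetic (a toy sketch in Python, not the Taylor model implementation of Berz and co-workers):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Operands are treated as independent, even when they are the same variable.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

x = Interval(1.0, 2.0)
print(x - x)      # Interval(lo=-1.0, hi=1.0) instead of the exact [0, 0]
print(x * x - x)  # Interval(lo=-1.0, hi=3.0), wider than the true range [0, 2] of t*t - t
```

Repeating such overestimations at every integration step is what makes naive interval enclosures of ODE flows blow up; Taylor models keep a symbolic polynomial part precisely to avoid this.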
Abstract:
This research aims to compare the Bayesian method and the full maximum likelihood method for estimating the hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000) for the experiments. The mean squared error (MSE) was adopted to compare the estimation methods and to choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size n = 30 is the best at representing the maternal mortality data …
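The hierarchical model and the maternal mortality data are not reproduced here, but a stripped-down sketch of the simulation-and-MSE comparison described above (using an ordinary, non-hierarchical Poisson regression fitted by maximum likelihood; all numbers are illustrative assumptions) could look like this in Python:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
beta_true = np.array([0.5, 0.3])   # illustrative "true" coefficients

def neg_loglik(beta, X, y):
    # Poisson log-likelihood up to an additive constant: sum(y*eta - exp(eta))
    eta = X @ beta
    return -(y @ eta - np.exp(eta).sum())

def one_replication(n):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = rng.poisson(np.exp(X @ beta_true))
    fit = minimize(neg_loglik, x0=np.zeros(2), args=(X, y), method="BFGS")
    return fit.x

def mse(n, reps):
    estimates = np.array([one_replication(n) for _ in range(reps)])
    return ((estimates - beta_true) ** 2).mean(axis=0)

for n in (30, 60, 120):
    print(n, mse(n, reps=200))   # estimation MSE per coefficient shrinks as n grows
```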
A study has been performed to compare the beddings in which ductile iron pipes are buried. In water transmission systems, bends are usually used in the pipes. At these bends, depending on the prescribed layout, unbalanced thrust forces are generated that must be resisted to prevent separation of the bend from the pipe. The bedding condition is a critical and important factor in providing the force that opposes the thrust forces in a restrained joint system. Because of the interaction between the native soil and the bedding layers in which the pipe is buried, the difference in characteristics between them, and their interaction with the pipe material, it is difficult to calculate the real forces opposing the thrust forces and the way they …
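For reference, the unbalanced thrust at a pressurized bend that the bedding or restrained joints must resist is conventionally taken as (a standard hydraulics relation, not a result of this study):

\[ T = 2\,p\,A\,\sin\!\left(\frac{\Delta}{2}\right), \]

where \(p\) is the internal pressure, \(A\) the pipe cross-sectional area, and \(\Delta\) the bend deflection angle; the soil-structure interaction discussed above determines how much of this resultant is actually carried by the surrounding bedding.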
This paper is concerned with introducing and studying the first new approximation operators using a mixed degree system and the second new approximation operators using a mixed degree system, which are the core concepts of this paper. In addition, the approximations of graphs obtained using the first lower and first upper operators are less accurate than the approximations obtained using the second lower and second upper operators, since the first accuracy is less than the second accuracy. For this reason, we study in detail the properties of the second lower and second upper operators in this paper. Furthermore, we summarize the results for the properties of the approximation operators second lower and second upper when the graph G is arbitrary, serial 1, serial 2, reflexive, symmetric, transitive …
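As general background (the classical neighborhood-based definitions and the Pawlak accuracy measure, not the mixed degree operators introduced in this paper), lower and upper approximations of a vertex subset A of a graph G = (V, E) are usually written as

\[ \underline{N}(A) = \{\, v \in V : N(v) \subseteq A \,\}, \qquad \overline{N}(A) = \{\, v \in V : N(v) \cap A \neq \emptyset \,\}, \qquad \alpha(A) = \frac{|\underline{N}(A)|}{|\overline{N}(A)|}, \]

where N(v) is the neighborhood of v; an accuracy \(\alpha(A)\) closer to 1 means a sharper approximation, which is the sense in which the second operators above improve on the first.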