Background: The present study was carried out to compare the shear bond strength of sapphire brackets bonded to zirconium surfaces after different methods of surface conditioning and to assess the adhesive remnant index. Materials and methods: The sample was composed of 40 zirconium specimens divided into four groups: the first group served as the control, the second group was conditioned by sandblasting with 50 μm aluminum oxide particles, and the third and fourth groups were treated with an Nd:YAG laser (1064 nm) at 0.888 W for 5 seconds (first laser group) and 0.444 W for 10 seconds (second laser group). All samples were coated with Z-Prime Plus primer. A central incisor sapphire bracket was bonded to each sample with light-cured adhesive resin. Shear bond strength was measured using a Tinius Olsen universal testing machine. After debonding, each bracket and zirconium surface was examined and the adhesive remnant index was recorded. Differences in shear bond strength among groups were analyzed using the ANOVA test, and the adhesive remnant index was assessed using the Chi-square test. Results: The second laser group had the highest mean shear bond strength, followed by the first laser group and then the sandblasting group, while the control group had the lowest value. A non-significant difference in shear bond strength was found between the two laser groups, and a highly significant difference was found between all other compared groups. A non-significant difference in the site of bond failure was found between the laser groups and the sandblasting group, and between the two laser groups. Conclusion: The laser conditioning method showed a higher shear bond strength than the sandblasting conditioning method.
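A minimal sketch of the statistical comparison described above, assuming hypothetical measurements: a one-way ANOVA across the four surface-conditioning groups and a Chi-square test on the adhesive remnant index (ARI) scores. All values, group sizes, and score counts below are placeholders, not the study's data.

```python
# Hedged sketch: one-way ANOVA on shear bond strength across four groups and a
# Chi-square test on ARI score counts. Every number here is a hypothetical
# placeholder, not a measurement from the study.
import numpy as np
from scipy import stats

# Hypothetical shear bond strength values (MPa), 10 specimens per group.
rng = np.random.default_rng(0)
control   = rng.normal(5.0,  1.0, 10)
sandblast = rng.normal(10.0, 1.5, 10)
laser_1   = rng.normal(13.0, 1.5, 10)   # 0.888 W / 5 s
laser_2   = rng.normal(15.0, 1.5, 10)   # 0.444 W / 10 s

# One-way ANOVA: does mean shear bond strength differ among the groups?
f_stat, p_anova = stats.f_oneway(control, sandblast, laser_1, laser_2)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")

# Hypothetical ARI score counts (rows = groups, columns = ARI scores 0-3).
ari_counts = np.array([
    [6, 3, 1, 0],   # control
    [2, 4, 3, 1],   # sandblasting
    [1, 3, 4, 2],   # laser group 1
    [1, 2, 4, 3],   # laser group 2
])
chi2, p_chi2, dof, _ = stats.chi2_contingency(ari_counts)
print(f"Chi-square: chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi2:.4f}")
```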
Abstract
The objective of this research is to analyze administrative ethics and their role in developing the social responsibility of a government organization. To achieve the research objectives, a questionnaire was developed for data collection and distributed to the research sample, which consisted of (50) individuals. The Statistical Package for the Social Sciences (SPSS) was relied on for the statistical analysis of this research, using simple regression analysis, standard deviation, and the Pearson correlation coefficient.
The research findings show the role of social responsibility in achieving the university's strategy…
The research aims to demonstrate the impact of tax techniques on the quality of services provided to income taxpayers by studying the correlation and influence relationships between the independent variable (tax techniques) and the dependent variable (the quality of services provided to income taxpayers). In line with the research objectives, the main hypothesis of the research was formulated (there is a significant relationship between tax techniques and the quality of services provided to income taxpayers), and a number of sub-hypotheses, stated in the research methodology, emerged from it. A number of conclusions were reached, the most important of which were (through the use of the correlation coeff…
In this paper, three approximate methods, namely the Bernoulli, Bernstein, and shifted Legendre polynomial operational matrices, are presented to solve two important nonlinear ordinary differential equations that appear in engineering and applied science. The Riccati and the Darcy-Brinkman-Forchheimer moment equations are solved and approximate solutions are obtained. The methods work by converting the nonlinear differential equations into a nonlinear system of algebraic equations that is solved using Mathematica®12. The efficiency of these methods was investigated by calculating the root mean square error (RMS) and the maximum error remainder (MERn), and it was found that the accuracy increases with increasi…
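As an illustration of the polynomial-expansion idea summarized above, the sketch below solves an assumed Riccati test problem y'(x) = 1 − y(x)², y(0) = 0 (exact solution y = tanh x) by expanding y in shifted Legendre polynomials and reducing the ODE to a nonlinear algebraic system. The specific equation, interval, and degree, and the use of scipy instead of Mathematica®12, are illustrative assumptions, not details taken from the paper.

```python
# Sketch: approximate a Riccati test problem with a shifted Legendre expansion,
# enforce the ODE at collocation points plus the initial condition, and solve
# the resulting nonlinear algebraic system numerically.
import numpy as np
from numpy.polynomial import legendre
from scipy.optimize import fsolve

N = 8                                   # number of basis terms (assumed)
x_col = np.linspace(0.0, 1.0, N)        # collocation points on [0, 1]

def y_and_dy(c, x):
    """Evaluate y(x) = sum_i c_i P_i(2x - 1) and its derivative."""
    t = 2.0 * x - 1.0                                   # map [0, 1] -> [-1, 1]
    y = legendre.legval(t, c)
    dy = 2.0 * legendre.legval(t, legendre.legder(c))   # chain rule: dt/dx = 2
    return y, dy

def residuals(c):
    """Nonlinear algebraic system: ODE residual at collocation points + y(0) = 0."""
    y, dy = y_and_dy(c, x_col)
    ode_res = dy - (1.0 - y**2)
    ic_res = y_and_dy(c, 0.0)[0]
    return np.append(ode_res[1:], ic_res)   # N equations for N unknowns

c = fsolve(residuals, np.zeros(N))

# Maximum error against the exact solution, in the spirit of the paper's error measures.
x_test = np.linspace(0.0, 1.0, 200)
y_approx, _ = y_and_dy(c, x_test)
print("max |y_approx - tanh(x)| =", np.max(np.abs(y_approx - np.tanh(x_test))))
```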
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, Bayesian networks, etc., and they have been used repeatedly in many fields. These algorithms face the problem of imbalanced data, where some classes contain far more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on the Over-Sampling (O.S.) technique for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are (Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border…
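For context, the sketch below shows the standard SMOTE over-sampling step that this family of techniques builds on: each synthetic minority sample is a random interpolation between a minority point and one of its k nearest minority neighbours. It is not the paper's Improved SMOTE variant, whose specific modifications are not described in the abstract, and the data are made up.

```python
# Sketch of standard SMOTE-style over-sampling for the minority class.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_oversample(X_minority, n_synthetic, k=5, seed=0):
    """Generate n_synthetic new minority samples by linear interpolation."""
    rng = np.random.default_rng(seed)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_minority)
    _, idx = nn.kneighbors(X_minority)           # idx[:, 0] is the point itself
    synthetic = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X_minority))        # pick a random minority sample
        j = idx[i, rng.integers(1, k + 1)]       # and one of its k nearest neighbours
        gap = rng.random()                       # interpolation factor in [0, 1)
        synthetic.append(X_minority[i] + gap * (X_minority[j] - X_minority[i]))
    return np.array(synthetic)

# Tiny usage example with made-up data: the minority class has only 10 samples.
rng = np.random.default_rng(1)
X_min = rng.normal(loc=2.0, size=(10, 2))
X_new = smote_oversample(X_min, n_synthetic=90, k=3)
print(X_new.shape)   # (90, 2) -> minority class grown to 100 samples in total
```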
In this paper, the methods of weighted residuals, namely the Collocation Method (CM), the Least Squares Method (LSM), and the Galerkin Method (GM), are used to solve the thin film flow (TFF) equation. The weighted residual methods were implemented to obtain an approximate solution to the TFF equation. The accuracy of the obtained results is checked by calculating the maximum error remainder functions (MER). Moreover, the outcomes were compared with the 4th-order Runge-Kutta method (RK4) and good agreement was achieved. All the evaluations were successfully carried out using Mathematica®10.
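Since the abstract does not spell out the TFF equation itself, the sketch below demonstrates the collocation flavour of the weighted-residual idea on an assumed first-order test problem y'(x) + y(x) = 0, y(0) = 1 (exact solution e^(−x)); LSM and GM differ only in the weight functions used to drive the residual toward zero. The trial basis, degree, and test equation are illustrative assumptions.

```python
# Sketch of the collocation weighted-residual method on an assumed test ODE.
import numpy as np

M = 6                                      # number of unknown coefficients (assumed)
x_col = np.linspace(0.2, 1.0, M)           # collocation points in (0, 1]

# Trial solution y(x) = 1 + sum_k c_k x^(k+1) satisfies y(0) = 1 automatically.
def basis(x):
    powers = np.arange(1, M + 1)
    phi  = x[:, None] ** powers                    # phi_k(x)  = x^(k+1)
    dphi = powers * x[:, None] ** (powers - 1)     # phi_k'(x)
    return phi, dphi

# Residual R(x) = y' + y is linear in the coefficients, so forcing R = 0 at the
# collocation points gives a linear algebraic system (dphi + phi) c = -1.
phi, dphi = basis(x_col)
A = dphi + phi
b = -np.ones(M)                            # constant term from y = 1 + ...
c = np.linalg.solve(A, b)

x_test = np.linspace(0.0, 1.0, 200)
phi_t, _ = basis(x_test)
y_approx = 1.0 + phi_t @ c
print("max error vs exp(-x):", np.max(np.abs(y_approx - np.exp(-x_test))))
```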
Abstract:
The phenomenon of financial failure is one of the phenomena that requires special attention and in-depth study because of its significant impact on various parties, whether internal or external, and on those who rely on financial performance reports. With the increase in cases of bankruptcy and default facing companies and banks, interest has grown in understanding the reasons that led to this financial failure. This growing interest should motivate the development of models and analytical methods that help in the early detection of this phenomenon, which has been increasing in recent years. The research examines the use of…
Excessive skewness, which sometimes occurs in data, is an obstacle to assuming a normal distribution. Therefore, recent studies have shown growing interest in the skew-normal distribution (SND), which accommodates skewed data by extending the normal distribution with an additional skewness parameter (α) that gives it more flexibility. When estimating the parameters of the SND, we face nonlinear likelihood equations, and solving them directly by Maximum Likelihood (ML) estimation yields inaccurate and unreliable solutions. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on the M…
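As a rough illustration of the likelihood-based estimation problem described above, the sketch below fits the SND by numerically maximizing the log-likelihood with a population-based global optimizer (scipy's differential evolution), used here only as a stand-in for the genetic-algorithm idea; it is not the paper's GA or iterative-reweighting implementation, and the data are simulated.

```python
# Sketch: numerical ML estimation of skew-normal parameters (alpha, loc, scale)
# on simulated data, using a population-based global optimizer as a stand-in
# for the genetic-algorithm approach mentioned in the abstract.
import numpy as np
from scipy import stats
from scipy.optimize import differential_evolution

true_alpha, true_loc, true_scale = 4.0, 1.0, 2.0
data = stats.skewnorm.rvs(true_alpha, loc=true_loc, scale=true_scale,
                          size=500, random_state=0)

def neg_log_likelihood(params):
    alpha, loc, scale = params
    if scale <= 0:
        return np.inf
    return -np.sum(stats.skewnorm.logpdf(data, alpha, loc=loc, scale=scale))

bounds = [(-10, 10), (-5, 5), (0.1, 10)]     # search box for (alpha, loc, scale)
result = differential_evolution(neg_log_likelihood, bounds, seed=1)
print("estimated (alpha, loc, scale):", np.round(result.x, 3))
```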