Background: To investigate the effect of different types of storage media on the enamel surface microstructure of avulsed teeth using atomic force microscopy. Materials and methods: Twelve tooth blocks from premolars freshly extracted for orthodontic treatment were selected. The samples were divided into three groups according to the type of storage medium: A, egg white; B, probiotic yogurt; and C, bovine milk. All samples were examined for changes in surface roughness and surface granularity distribution using atomic force microscopy at two periods: baseline and after 8 hours of immersion in the three storage media. Results: The milk group showed a significant increase in mean roughness values at the test period, while the egg white and probiotic yogurt groups showed a decrease in surface roughness. No significant change was found in the grain size of the enamel surface of the avulsed teeth in any of the three storage media at the eight-hour interval. Storing the samples in egg white or probiotic yogurt may be beneficial because these media contain various ions and proteins that fill enamel valleys, while longer exposure to milk encourages bacteria to continue fermenting lactose, resulting in continual acid generation and increased demineralization. Conclusion: The milk group demonstrated the highest roughness values, while the egg white group demonstrated the lowest. No significant changes were found in the grain size of the enamel surface of the tested teeth in any of the three storage media at the eight-hour interval.
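AFM software typically reports the arithmetic average roughness (Ra) as the mean absolute deviation of the measured heights from the mean height plane. A minimal sketch of that calculation on a hypothetical line scan (illustrative values only, not data from this study):

```python
def average_roughness(heights):
    """Arithmetic average roughness Ra: mean absolute deviation
    of surface heights (e.g. in nm) from the mean height plane."""
    mean_h = sum(heights) / len(heights)
    return sum(abs(h - mean_h) for h in heights) / len(heights)

# Hypothetical line-scan heights (nm) before and after immersion
baseline = [2.0, 2.4, 1.8, 2.2, 2.1]
after_milk = [2.0, 3.1, 1.2, 3.0, 1.5]
print(average_roughness(baseline) < average_roughness(after_milk))  # → True
```

A higher Ra after immersion, as in the milk group, corresponds to a more irregular enamel surface.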
Abstract:
The main objective of this research is to build an optimal investment portfolio of stocks listed on the Iraq Stock Exchange by employing a multi-objective genetic algorithm over the period 1/1/2006 to 1/6/2018, based on the closing prices of 43 companies whose data were complete and met the inspection conditions. The literature review supported the diagnosis of the knowledge gap and identified deficiencies at the level of experimentation; the current direction of the research was therefore to address aspects unexamined and untreated by other researchers, in particular missing data that do not reflect the reality of trading at the level of companies.
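The abstract does not specify the genetic algorithm's design, so as a rough illustration of the idea, the sketch below evolves portfolio weights with a scalarized return-versus-risk fitness (a weighted-sum stand-in for a true multi-objective method such as NSGA-II, which would maintain a Pareto front instead). All data and parameter names are hypothetical:

```python
import random

def fitness(weights, mean_returns, cov, risk_aversion=1.0):
    """Scalarized two-objective fitness: expected return minus
    risk_aversion times portfolio variance (weighted-sum approach)."""
    n = len(weights)
    ret = sum(w * r for w, r in zip(weights, mean_returns))
    var = sum(weights[i] * weights[j] * cov[i][j]
              for i in range(n) for j in range(n))
    return ret - risk_aversion * var

def normalize(w):
    """Rescale weights so they sum to 1 (fully invested portfolio)."""
    s = sum(w)
    return [x / s for x in w]

def evolve(mean_returns, cov, pop_size=30, generations=100, seed=0):
    """Toy GA: keep the fitter half, breed children by averaging
    two parents (crossover) and perturbing one weight (mutation)."""
    random.seed(seed)
    n = len(mean_returns)
    pop = [normalize([random.random() for _ in range(n)]) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: fitness(w, mean_returns, cov), reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]            # crossover
            i = random.randrange(n)
            child[i] = max(1e-9, child[i] + random.gauss(0, 0.1))  # mutation
            children.append(normalize(child))
        pop = survivors + children
    return max(pop, key=lambda w: fitness(w, mean_returns, cov))
```

On real data, the chromosome would cover all 43 companies and the fitness would be computed from the historical closing-price series.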
Problem: Cancer is regarded as one of the world's deadliest diseases. Machine learning and its newer branch, deep learning, can facilitate the way cancer is dealt with, especially in the fields of cancer prevention and detection. Traditional ways of analyzing cancer data have their limits, and cancer data is growing quickly; this opens the way for deep learning, with its powerful ability to analyze and process cancer data. Aims: In the current study, a deep-learning medical support system for the prediction of lung cancer is presented. Methods: The study uses three different deep learning models (EfficientNetB3, ResNet50 and ResNet101) with the transfer learning concept. The three models are trained using a
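The transfer-learning concept the study relies on — reuse a pretrained backbone as a frozen feature extractor and train only a small classifier head — can be shown in miniature without any deep-learning framework. The "backbone" below is a hypothetical stand-in for EfficientNetB3/ResNet feature maps, not the actual models used in the study:

```python
def frozen_features(pixel_row):
    """Frozen 'backbone' stand-in: fixed, untrained transformations
    that map raw inputs to a small feature vector."""
    s = sum(pixel_row)
    return [s, max(pixel_row) - min(pixel_row)]

def train_head(data, labels, epochs=20, lr=0.1):
    """Train only the classifier head (a perceptron) on frozen
    features; the backbone's 'weights' are never updated."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for row, y in zip(data, labels):
            f = frozen_features(row)
            pred = 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * fi for wi, fi in zip(w, f)]
            b += lr * err
    return w, b

def predict(row, w, b):
    """Classify a new input using frozen features plus trained head."""
    f = frozen_features(row)
    return 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0
```

In the real setting, the frozen layers are ImageNet-pretrained convolutional stacks and the head is a dense layer trained on the lung-image dataset.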
Abstract:
In the research "The Effects of Post-war Literature" ("Die Wirkungen der Nachkriegsliteratur"), one finds an overview of the effects of literature in general and, in particular, of the effects of post-war literature in Germany. The short story "Das Brot" by Wolfgang Borchert is analyzed in this research as an example, because it deals with the drawbacks of the war and their effects on family relationships. The main characters of the short story are an old man and his wife. The author defines the role of women in Germany. The woman in this story is stronger than her husband, for he was a liar and weak.
Post-war Literature has an e
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which is characterized by observations obtained from n independent subjects, each measured repeatedly at a set of m specific time points. Although the measurements are independent across different subjects, they are mostly correlated within each subject; the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method is used to estimate the coefficient functions with the aforementioned technique. Since the two-
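The core of the local linear kernel idea can be sketched directly: at a target point, fit a weighted least-squares line with kernel weights centered there, and take the fitted intercept as the estimate. A minimal pure-Python sketch with a Gaussian kernel (all names are illustrative, not the paper's implementation):

```python
import math

def local_linear(x0, xs, ys, bandwidth):
    """Local linear kernel estimate at x0: weighted least squares of
    y on (x - x0) with Gaussian kernel weights; the intercept is the
    fitted value at x0."""
    w = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x in xs]
    d = [x - x0 for x in xs]
    # Weighted moments for the 2x2 normal equations
    s0 = sum(w)
    s1 = sum(wi * di for wi, di in zip(w, d))
    s2 = sum(wi * di * di for wi, di in zip(w, d))
    t0 = sum(wi * yi for wi, yi in zip(w, ys))
    t1 = sum(wi * di * yi for wi, di, yi in zip(w, d, ys))
    denom = s0 * s2 - s1 * s1
    if abs(denom) < 1e-12:          # degenerate design: kernel average
        return t0 / s0
    return (s2 * t0 - s1 * t1) / denom
```

In the two-step setting, an estimate like this is computed at each time point, with the bandwidth controlling the amount of smoothing.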
The bi-level programming problem is to minimize or maximize an objective function while another objective function is optimized within the constraints. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution by simulation using the Monte Carlo method with different small and large sample sizes. The research found that the Branch and Bound algorithm was preferable for solving the non-linear bi-level programming problem, because its results were better.
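The Monte Carlo approach to a bi-level problem can be sketched on a toy instance: for each sampled leader decision, approximate the follower's best response by sampling, then keep the leader decision with the best upper-level objective. The example problem below is illustrative only, not one from the paper:

```python
import random

def monte_carlo_bilevel(n_leader=200, n_follower=200, seed=1):
    """Toy bi-level solve by simulation. Illustrative problem:
      leader:   min F(x, y) = (x - 2)**2 + y**2
      follower: min f(x, y) = (y - x)**2   -> best response y*(x) = x
    True solution: x = 1, F = 2."""
    random.seed(seed)
    best_x, best_F = None, float("inf")
    for _ in range(n_leader):
        x = random.uniform(0, 4)
        # Approximate the follower's best response by sampling
        y = min((random.uniform(0, 4) for _ in range(n_follower)),
                key=lambda yy: (yy - x) ** 2)
        F = (x - 2) ** 2 + y ** 2
        if F < best_F:
            best_x, best_F = x, F
    return best_x, best_F
```

Larger sample sizes tighten the approximation of both levels, which is exactly the trade-off the paper examines; an exact method such as Branch and Bound avoids this sampling error.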
The study investigates the relationship between the volatility of the Iraq Stock Exchange index (ISX) and the volatility of the global oil price benchmarks Brent and West Texas Intermediate (WTI), in addition to the Iraqi oil Basra Light (BSL), which represents the most exported Iraqi crude and the major factor influencing Iraqi government revenues. Using monthly data covering the period 1/2005-12/2015, econometric and technical tools represented by cointegration, the Vector Error Correction Model (VECM), Granger causality, and Bollinger Bands were employed in order to explore the relationship between the variables.
The econometric analysis revealed the impact of oil price volatility on
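Of the technical tools mentioned, Bollinger Bands are the simplest to state: a rolling mean bracketed by a band of k standard deviations on each side. A minimal sketch (the window and k below are conventional defaults, not necessarily those used in the study):

```python
import statistics

def bollinger(prices, window=20, k=2.0):
    """Bollinger Bands: rolling mean plus/minus k rolling standard
    deviations. Returns (middle, upper, lower) lists, one entry per
    complete window."""
    mid, up, lo = [], [], []
    for i in range(window - 1, len(prices)):
        win = prices[i - window + 1:i + 1]
        m = statistics.fmean(win)
        s = statistics.pstdev(win)
        mid.append(m)
        up.append(m + k * s)
        lo.append(m - k * s)
    return mid, up, lo
```

Prices hugging or piercing the outer bands flag unusually high volatility relative to the recent window, which is how the bands complement the econometric volatility measures.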
A resume is the first impression between you and a potential employer; therefore, its importance can never be overstated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, we can use NLTK and Natural Language Processing (NLP) techniques to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the best resume according to the company's requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best. Theref
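A KNN-style resume match can be sketched with bag-of-words vectors and cosine similarity: the k resumes nearest to the job description are returned as the best candidates. This is a minimal stand-in for the paper's pipeline (which would first extract fields with NLTK); all data below is hypothetical:

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words vector: lowercase token counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k_resumes(job_description, resumes, k=3):
    """Rank resumes by similarity to the job description and return
    the k nearest (the KNN-style shortlist)."""
    jd = vectorize(job_description)
    scored = sorted(resumes, key=lambda r: cosine(jd, vectorize(r)),
                    reverse=True)
    return scored[:k]
```

In practice, TF-IDF weighting and NLTK preprocessing (stop-word removal, stemming) would replace the raw token counts.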
The study focused on the treatment of real oilfield produced water from the East Baghdad field, affiliated with the Midland Oil Company (Iraq), using an oil-skimming process followed by a coagulation/flocculation process for zero-liquid-discharge system applications. A belt-type oil skimmer was utilized to evaluate the process efficiency under various operating conditions, such as temperature (17-40 °C) and time (0.5-2.5 hr). A polyaluminum chloride (PAC) coagulant and a polyacrylamide (PAM) flocculant were used to investigate the performance of the coagulation/flocculation process, with PAC dosage (5-90 ppm) and pH (5-10) as operating conditions. In the skimming process, the oil content, COD, turbidity, and TSS decreased with an increase in tempera
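Treatment performance for parameters such as oil content, COD, turbidity, and TSS is conventionally reported as percent removal relative to the feed concentration; a one-line sketch of that calculation (illustrative values, not the study's data):

```python
def removal_efficiency(c_initial, c_final):
    """Percent removal of a contaminant across a treatment step,
    given influent and effluent concentrations in the same units."""
    return 100.0 * (c_initial - c_final) / c_initial

# Hypothetical COD readings (ppm) before and after coagulation
print(removal_efficiency(200.0, 50.0))  # → 75.0
```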