Background. Polyetheretherketone (PEEK) is a biocompatible, high-strength polymer well suited to dental applications due to its unique properties. However, achieving good adhesion between PEEK and hydrophilic materials such as dental adhesives or cements can be challenging, and this hydrophobicity may also limit the use of PEEK as an implant material. Surface treatment or conditioning is therefore often necessary to improve its surface properties; piranha solution is the treatment explored here for this purpose.
Methods. PEEK disks of 10 mm diameter and 2 mm thickness were divided into five groups of five samples each. The first was a control group with no acid treatment; the second was treated with sulfuric acid. The remaining three groups were treated with piranha solution, each at a different concentration (1 : 3, 1 : 5, and 1 : 7 hydrogen peroxide to sulfuric acid, respectively). The treatment period was 60 s for all groups. Wettability and surface roughness were measured for all five groups. For statistical analysis, the Shapiro–Wilk test was used to check the assumption of normality, a one-way analysis of variance was employed to determine statistical significance among groups, and Tukey's honestly significant difference post hoc test was subsequently performed for multiple comparisons.
Results. The piranha-treated groups showed higher wettability than the control group and the sulfuric-acid-treated group. Additionally, the piranha treatment with the highest hydrogen peroxide concentration (1 : 3) produced a greater improvement in surface roughness than the control group and the lower-concentration groups (1 : 5 and 1 : 7), while the sulfuric-acid-treated group showed the highest surface roughness.
Conclusion. These results suggest that piranha solution can be an effective method for improving the surface characteristics of PEEK for use in different dental applications, especially as a dental implant material, owing to the increases in wettability and surface roughness.
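The statistical pipeline described in the Methods (Shapiro–Wilk normality check, one-way ANOVA, then Tukey's HSD) can be sketched as follows. The contact-angle values below are fabricated for illustration; only the five-groups-of-five design mirrors the study.

```python
# Hypothetical sketch of the described statistical pipeline; the
# measurements are made up, only the group design follows the text.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Five groups of five disks each (control, H2SO4, piranha 1:3, 1:5, 1:7)
groups = {
    "control":    rng.normal(80, 3, 5),   # contact angle, degrees (fabricated)
    "H2SO4":      rng.normal(70, 3, 5),
    "piranha_13": rng.normal(55, 3, 5),
    "piranha_15": rng.normal(60, 3, 5),
    "piranha_17": rng.normal(65, 3, 5),
}

# 1. Shapiro-Wilk normality check within each group
for name, vals in groups.items():
    w, p = stats.shapiro(vals)
    print(f"{name}: Shapiro-Wilk p = {p:.3f}")

# 2. One-way ANOVA across the five groups
f, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f:.2f}, p = {p_anova:.4g}")

# 3. Tukey HSD post hoc (requires a recent SciPy; guarded just in case)
try:
    print(stats.tukey_hsd(*groups.values()))
except AttributeError:
    print("tukey_hsd not available in this SciPy version")
```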
In this paper, a reliability analysis using Monte Carlo simulation (MCS) was conducted on the collapse-potential equation predicted by an artificial neural network (ANN) to study its reliability when applied to soil whose properties are uncertain. The prediction equation used in this study was developed previously by the authors. The probabilities of failure were then plotted against a range of uncertainties expressed in terms of the coefficient of variation (COV). The reliability analysis found that the collapse-potential equation showed a high degree of reliability under uncertainty in gypseous sandy soil properties within the specified COV for each property. When t…
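The MCS procedure above can be sketched generically: sample each uncertain property at a given COV, evaluate the prediction equation, and count the fraction of samples exceeding a failure threshold. The actual ANN-derived equation is not reproduced in this abstract, so `collapse_potential` below is a hypothetical stand-in, as are the property means and the threshold.

```python
# Illustrative Monte Carlo reliability sketch; the surrogate equation,
# property values, and failure threshold are all assumptions.
import numpy as np

def collapse_potential(gypsum_content, dry_density):
    # Hypothetical stand-in for the ANN prediction equation.
    return 0.15 * gypsum_content - 4.0 * dry_density + 8.0

def probability_of_failure(cov, n=100_000, threshold=10.0, seed=1):
    rng = np.random.default_rng(seed)
    # Each uncertain soil property sampled as normal with the given COV
    gypsum = rng.normal(30.0, 30.0 * cov, n)    # mean 30 % (assumed)
    density = rng.normal(1.4, 1.4 * cov, n)     # mean 1.4 g/cm^3 (assumed)
    cp = collapse_potential(gypsum, density)
    return np.mean(cp > threshold)              # fraction of "failed" samples

for cov in (0.05, 0.10, 0.20):
    print(f"COV = {cov:.2f} -> Pf = {probability_of_failure(cov):.4f}")
```

Plotting `Pf` against COV reproduces the kind of curve the study reports.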
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual-learning concept. The two networks exchange their outputs (in bits), and the key between the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
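The mutual-learning key exchange is commonly realized with tree parity machines (TPMs): both parties draw the same random inputs, exchange only their one-bit outputs, and update weights when the outputs agree. A minimal sketch, with illustrative parameter choices (K hidden units, N inputs each, weight bound L):

```python
# Minimal tree parity machine sketch of neural key exchange by mutual
# learning; K, N, L and the update rule (Hebbian) are illustrative choices.
import numpy as np

K, N, L = 3, 4, 3
rng = np.random.default_rng(42)

def tpm_output(w, x):
    # sigma: sign of each hidden unit's local field; tau: their product
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def update(w, x, sigma, tau):
    # Hebbian rule: adjust only hidden units that agree with the output
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, (K, N))
wB = rng.integers(-L, L + 1, (K, N))

steps = 0
while steps < 50_000 and not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], (K, N))        # shared public input
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                           # learn only when outputs agree
        update(wA, x, sA, tA)
        update(wB, x, sB, tB)
    steps += 1
print(f"synchronized after {steps} rounds; the weights form the shared key")
```

The attacker's difficulty, as the abstract notes, lies in synchronizing with either party while observing only the exchanged output bits.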
Diabetes is one of the most rapidly increasing chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine-learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbors (KNN), and random forest. We conducted two experiments: the first used all 12 features of the dataset, where random forest outperformed the others with 98.8% accuracy. The second experiment used only five att…
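The three-classifier comparison can be sketched as below. The Iraqi patient records are not public here, so a synthetic stand-in dataset is used; only the 12-feature setup of the first experiment is mirrored, and the accuracies are not the paper's.

```python
# Sketch of the classifier comparison on a fabricated 12-feature dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=12, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "RandomForest": RandomForestClassifier(random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(Xtr, ytr)
    scores[name] = accuracy_score(yte, model.predict(Xte))
    print(f"{name}: {scores[name]:.3f}")
```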
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light: first, a local decrease in the amount of light that reaches a surface, and second, a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis; however, some factors affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
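As a toy illustration of the detect-then-remove idea (not the paper's specific segmentation method), shadowed pixels can be flagged as a local drop in intensity and then rescaled toward the lit-region level. The image and thresholds below are fabricated:

```python
# Toy intensity-based shadow masking and correction on a synthetic image;
# a generic sketch, not the segmentation method described in the paper.
import numpy as np

rng = np.random.default_rng(3)
img = np.full((8, 8), 200.0) + rng.normal(0, 5, (8, 8))   # bright surface
img[2:6, 2:6] *= 0.4                                      # cast a "shadow"

# Detect: pixels well below the global mean are flagged as shadow
mask = img < 0.6 * img.mean()

# Remove: rescale shadowed pixels to match the lit-region mean level
lit_mean = img[~mask].mean()
corrected = img.copy()
corrected[mask] *= lit_mean / img[mask].mean()

print(f"shadow pixels: {mask.sum()}, lit mean: {lit_mean:.1f}")
```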
Eye detection is used in many applications such as pattern recognition, biometrics, surveillance systems, and many other systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, relying on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, an observed geometric shape is perceptually "meaningful" if its number of occurrences in an image with a random distribution is very small. To achieve this goal, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera…
The penalized least squares method is a popular approach for high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection simultaneously. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and hence a robust penalized estimator and…
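The effect of swapping the squared loss for a robust one can be sketched as follows. The abstract does not specify which robust loss is used, so the Huber loss serves as a stand-in here, contrasted with an ordinary Lasso on fabricated data containing gross outliers:

```python
# Contrast ordinary penalized least squares (Lasso) with a robust-loss
# alternative (Huber) on outlier-contaminated data; data are fabricated
# and the Huber loss is an assumed stand-in for the paper's choice.
import numpy as np
from sklearn.linear_model import Lasso, HuberRegressor

rng = np.random.default_rng(7)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]                 # sparse ground truth
y = X @ beta + rng.normal(0, 0.5, n)
y[:5] += 50                                 # gross outlying observations

lasso = Lasso(alpha=0.1).fit(X, y)
huber = HuberRegressor(alpha=0.1).fit(X, y)

err_lasso = np.linalg.norm(lasso.coef_ - beta)
err_huber = np.linalg.norm(huber.coef_ - beta)
print(f"coef error: lasso {err_lasso:.2f}, huber {err_huber:.2f}")
```

With the outliers present, the robust loss bounds their influence on the fit, which is exactly the sensitivity problem the text describes.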
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp…
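The transform-and-threshold idea behind wavelet speech compression can be illustrated with the simplest case, a one-level Haar DWT: transform, zero the small detail coefficients, and reconstruct. The MCT/GHM multiwavelet bases of the paper are not reproduced here; the signal and threshold are fabricated.

```python
# Toy one-level Haar DWT "compression" of a synthetic signal; a stand-in
# illustration, not the paper's MCT/GHM transform.
import numpy as np

t = np.linspace(0, 1, 256, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)

# Forward Haar step: scaled pairwise sums (approximation) and
# differences (detail)
a = (signal[0::2] + signal[1::2]) / np.sqrt(2)
d = (signal[0::2] - signal[1::2]) / np.sqrt(2)

# Compress by zeroing detail coefficients below a threshold
d_c = np.where(np.abs(d) > 0.2, d, 0.0)
kept = np.count_nonzero(d_c) + a.size
print(f"kept {kept}/{signal.size} coefficients")

# Inverse Haar step
rec = np.empty_like(signal)
rec[0::2] = (a + d_c) / np.sqrt(2)
rec[1::2] = (a - d_c) / np.sqrt(2)

snr = 10 * np.log10(np.sum(signal**2) / np.sum((signal - rec)**2))
print(f"reconstruction SNR: {snr:.1f} dB")
```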
The study of validity and probability of failure in solids and structures is considered one of the most prominent fields of study in many science and engineering applications. Design analysts must therefore investigate the points where failing strains may occur, the probabilities with which these strains can cause existing cracks to propagate through the fractured medium considered, and the techniques by which such propagating cracks can be reduced or arrested. In the present study, a theoretical investigation of simply supported thin plates having surface cracks within their structure is carried out, and the applied impact load to the…
This research presents a rapid, automated, and highly accurate CFIA/MZ technique developed for the estimation of phenylephrine hydrochloride (PHE) in pure form, dosage forms, and biological samples. The method is based on the oxidative coupling reaction of 2,4-dinitrophenylhydrazine (DNPH) with PHE in the presence of sodium periodate as an oxidizing agent in alkaline medium to form a red-colored product at λmax (520 nm). With a flow rate of 4.3 mL.min-1 using distilled water as a carrier, the FIA method proved to be a sensitive and economical analytical tool for the estimation of PHE.
Within the concentration range of 5-300 μg.mL-1, the calibration curve was rectilinear, and the detection limit was 3.252 μg.mL-1.
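The rectilinear calibration curve and detection limit above follow the standard pattern of fitting a line over the working range and estimating LOD from the residual scatter (LOD ≈ 3.3·σ/slope). The absorbance readings below are fabricated; only the concentration range mirrors the text.

```python
# Sketch of a linear calibration fit and detection-limit estimate;
# the absorbance values are made up for illustration.
import numpy as np

conc = np.array([5, 25, 50, 100, 200, 300], dtype=float)   # μg/mL
rng = np.random.default_rng(5)
absorb = 0.003 * conc + 0.02 + rng.normal(0, 0.002, conc.size)

slope, intercept = np.polyfit(conc, absorb, 1)
resid = absorb - (slope * conc + intercept)
sigma = resid.std(ddof=2)          # residual standard deviation
lod = 3.3 * sigma / slope          # common LOD estimate
print(f"slope = {slope:.4f}, LOD ~ {lod:.2f} μg/mL")
```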
Free-space optics (FSO) plays a vital role in modern wireless communications due to its advantages over fiber optics and RF techniques, making transmission of huge bandwidth and access to remote places possible. The specific aim of this research is to analyze the bit-error rate (BER) of an FSO communication system when the signal is sent over a turbulent channel, where the fading channel is described by the Gamma-Gamma model. The signal quality is improved by using an optical space-time block code (OSTBC), thereby reducing the BER. The optical 2×2 Alamouti scheme required a 14 dB bit-energy-to-noise ratio (Eb/N0) at a 10-5 bit error rate (BER), which gives a 3.5 dB gain compared to the no-diversity scheme. Th…
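The kind of Monte Carlo BER analysis described can be sketched in simplified form. Simulating the Gamma-Gamma turbulence model is beyond a short example, so the sketch below uses a 2×1 Alamouti scheme with BPSK over Rayleigh fading as a stand-in; the structure (space-time encoding, linear combining, BER vs. Eb/N0) is the same, but the numbers are not the paper's.

```python
# Simplified Monte Carlo BER sketch: 2x1 Alamouti, BPSK, Rayleigh fading.
# A stand-in for the paper's 2x2 optical Alamouti / Gamma-Gamma setup.
import numpy as np

def alamouti_ber(ebn0_db, n_bits=200_000, seed=9):
    rng = np.random.default_rng(seed)
    n = n_bits // 2 * 2
    bits = rng.integers(0, 2, n)
    s = 2.0 * bits - 1.0                       # BPSK mapping
    s0, s1 = s[0::2], s[1::2]
    m = s0.size
    h0 = (rng.normal(size=m) + 1j * rng.normal(size=m)) / np.sqrt(2)
    h1 = (rng.normal(size=m) + 1j * rng.normal(size=m)) / np.sqrt(2)
    n0 = 10 ** (-ebn0_db / 10)
    noise = lambda: np.sqrt(n0 / 2) * (rng.normal(size=m)
                                       + 1j * rng.normal(size=m))
    # Two slots: [s0, s1] then [-s1*, s0*]; power split between antennas
    r0 = (h0 * s0 + h1 * s1) / np.sqrt(2) + noise()
    r1 = (-h0 * np.conj(s1) + h1 * np.conj(s0)) / np.sqrt(2) + noise()
    # Alamouti combining decouples the two streams
    y0 = np.conj(h0) * r0 + h1 * np.conj(r1)
    y1 = np.conj(h1) * r0 - h0 * np.conj(r1)
    est = np.empty(n)
    est[0::2] = np.real(y0)
    est[1::2] = np.real(y1)
    return np.mean((est > 0) != (bits == 1))

for ebn0 in (0, 5, 10):
    print(f"Eb/N0 = {ebn0:2d} dB -> BER = {alamouti_ber(ebn0):.4f}")
```

Sweeping Eb/N0 and comparing against a no-diversity baseline yields the diversity-gain curves the abstract summarizes.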