Background: Many types of instruments and techniques are used in the instrumentation of the root canal system. These instruments and techniques may extrude debris beyond the apical foramen and may cause post-instrumentation complications. The aim of this study was to evaluate the amount of apically extruded debris produced by four types of nickel-titanium instruments (WaveOne, TRUShape 3D Conforming Files, Hyflex CM, and One Shape files) during endodontic instrumentation. Materials and methods: Forty freshly extracted human mandibular second premolars with straight canals and a single apex were collected for this study. All teeth were cut to similar lengths. Pre-weighed glass vials were used as collecting containers. Samples were randomly divided into four groups of 10 samples each: Group A was instrumented with the WaveOne reciprocating file, Group B with TRUShape 3D rotary files, Group C with Hyflex CM rotary files, and Group D with the One Shape rotary file. A total volume of 7 ml of sodium hypochlorite was used for irrigation in each sample. Apical patency was confirmed and maintained with a size #15 K-file. All canals were instrumented up to size #25. After completion of endodontic instrumentation, the vials were stored in an incubator for 5 days at 68°C for drying. The vials were then weighed again, and the pre-instrumentation weight was subtracted from the post-instrumentation weight; the weight difference represented the amount of debris extruded from the apical foramen during root canal instrumentation. The data obtained were statistically analysed using ANOVA and LSD tests. Results: The Hyflex CM group (C) showed a statistically significant lowest amount of apically extruded debris compared with the other groups of this study (P ≤ 0.05), while the TRUShape group (B) showed significantly less apically extruded debris than the One Shape group (D) and the WaveOne group (A); the WaveOne group (A) showed the highest amount of apically extruded debris (P ≤ 0.01). All groups resulted in apical extrusion of debris. Significance: Although all systems caused apical extrusion of debris and irrigant, continuous rotary instrumentation was associated with less extrusion than the reciprocating file system.
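The weighing arithmetic and the statistical analysis described above (weight difference per vial, one-way ANOVA across the four groups, LSD-style pairwise follow-up) can be sketched as follows. This is a minimal illustration only: the vial weights are invented placeholders, not the study's data, and unadjusted pairwise t-tests stand in for Fisher's LSD.

```python
# Minimal sketch of the debris-weight analysis: extruded debris = post-drying
# vial weight minus the pre-weighed empty vial weight, then one-way ANOVA and
# LSD-style pairwise t-tests. All numbers are illustrative, not the study's data.
import numpy as np
from itertools import combinations
from scipy import stats

pre = {
    "A_WaveOne":  np.array([1.2031, 1.1987, 1.2012, 1.1969, 1.2044]),
    "B_TRUShape": np.array([1.2015, 1.2003, 1.1992, 1.2021, 1.1978]),
    "C_HyflexCM": np.array([1.1990, 1.2010, 1.2034, 1.1961, 1.2005]),
    "D_OneShape": np.array([1.2026, 1.1974, 1.2018, 1.1999, 1.2040]),
}
rng = np.random.default_rng(1)
post = {g: w + rng.uniform(0.0002, 0.0015, w.size) for g, w in pre.items()}  # simulated debris gain

debris = {g: post[g] - pre[g] for g in pre}          # apically extruded debris (g)

# One-way ANOVA across the four instrumentation groups.
f_stat, p_val = stats.f_oneway(*debris.values())
print(f"ANOVA: F = {f_stat:.3f}, p = {p_val:.4f}")

# Unadjusted pairwise t-tests (LSD-style follow-up, interpreted only if ANOVA is significant).
for g1, g2 in combinations(debris, 2):
    t, p = stats.ttest_ind(debris[g1], debris[g2])
    print(f"{g1} vs {g2}: p = {p:.4f}")
```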
The Tanuma and Zubair formations are known as the most problematic intervals in the Zubair oilfield, causing wellbore instability due to possible shale-fluid interaction. This results in a considerable loss of time dealing with various downhole problems (e.g., stuck pipe), and the consequences (e.g., fishing and sidetracking) increase the overall well cost. This paper aims to test shale samples with various laboratory tests for shale evaluation and drilling-mud development. The shale's physical properties are described using a stereomicroscope, and its structures are observed with a scanning electron microscope. The shale reactivity and behavior are analyzed using cation exchange capacity testing, and the capillary suction test is
Advanced national election technologies have improved political systems. As electronic voting (e-voting) systems advance, security threats such as impersonation, ballot tampering, and result manipulation increase. These challenges are addressed through a review covering biometric authentication, watermarking, and blockchain technologies, each of which plays a crucial role in improving the security of e-voting systems. More precisely, biometric authentication is examined for its ability to identify voters and reduce the risk of impersonation. The study also explores blockchain technology to decentralize elections, enhance transparency, and ensure the prevention of any unauthorized alteration or
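The tamper-evidence role that blockchain plays in the review above can be illustrated with a minimal hash-chained ballot ledger. This is a simplified sketch with hypothetical field names, not a description of any system the paper reviews: each block commits to the previous block's hash, so altering a recorded ballot is detected on verification.

```python
# Minimal hash-chained ballot ledger (hypothetical, for illustration only).
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_ballot(chain: list, ballot: dict) -> None:
    """Append a new block containing one (already anonymized) ballot."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "timestamp": time.time(),
             "ballot": ballot, "prev_hash": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash and check linkage; returns False if anything was altered."""
    for i, blk in enumerate(chain):
        body = {k: v for k, v in blk.items() if k != "hash"}
        if blk["hash"] != block_hash(body):
            return False
        if i > 0 and blk["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain: list = []
add_ballot(chain, {"voter_token": "anon-7f3a", "choice": "candidate_B"})
add_ballot(chain, {"voter_token": "anon-91c2", "choice": "candidate_A"})
print(verify(chain))                            # True
chain[0]["ballot"]["choice"] = "candidate_A"    # tampering attempt
print(verify(chain))                            # False
```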
Communication of the human brain with its surroundings became a reality through Brain-Computer Interface (BCI) based mechanisms. Electroencephalography (EEG), being non-invasive, has become a popular method for interacting with the brain. Traditionally, such devices were used in clinical applications to detect various brain diseases, but with advances in technology, companies like Emotiv and NeuroSky are producing low-cost, easily portable, consumer-grade EEG devices that can be used in various application domains such as gaming and education, as these devices are also comfortable to wear. This paper reviews the fields where EEG has shown its impact and the way it has p
Semi-parametric model analysis is one of the most interesting subjects in recent studies because it provides efficient model estimation. The problem arises when the response variable takes one of two values, either 0 (no response) or 1 (response), which is described by the logistic regression model.
We compare two methods, the Bayesian method and . The results were then compared using the MSE criterion.
A simulation was used to study the empirical behavior of the logistic model with different sample sizes and variances. The results show that the Bayesian method is better than the at small sample sizes.
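A simulation of this kind can be sketched as below. The abstract does not name the second estimation method, so classical maximum likelihood is assumed for comparison, and a Gaussian-prior MAP estimate (ridge-penalized logistic regression) stands in for the Bayesian estimator; the true coefficients, sample sizes, and replication count are all hypothetical.

```python
# Hypothetical simulation comparing a Bayesian-style (Gaussian-prior / MAP) logistic
# estimator with classical maximum likelihood by MSE of the estimated coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
true_beta = np.array([1.5, -2.0])            # assumed true coefficients

def simulate(n: int, reps: int = 200):
    mse_ml, mse_bayes = [], []
    for _ in range(reps):
        X = rng.normal(size=(n, 2))
        p = 1.0 / (1.0 + np.exp(-X @ true_beta))
        y = rng.binomial(1, p)
        if y.min() == y.max():               # skip degenerate all-0 / all-1 draws
            continue
        ml = LogisticRegression(penalty="l2", C=1e6, fit_intercept=False).fit(X, y)    # ~MLE
        bayes = LogisticRegression(penalty="l2", C=1.0, fit_intercept=False).fit(X, y)  # MAP with N(0, 1) prior
        mse_ml.append(np.mean((ml.coef_.ravel() - true_beta) ** 2))
        mse_bayes.append(np.mean((bayes.coef_.ravel() - true_beta) ** 2))
    return np.mean(mse_ml), np.mean(mse_bayes)

for n in (20, 50, 200):                      # small to moderate sample sizes
    mse_ml, mse_bayes = simulate(n)
    print(f"n={n}: MSE(ML)={mse_ml:.3f}, MSE(Bayes/MAP)={mse_bayes:.3f}")
```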
In this research, the Artificial Neural Network (ANN) technique was applied in an attempt to predict the water levels and some water quality parameters of the Tigris River in Wasit Governorate at five different sites. These predictions are useful in the planning, management, and evaluation of the water resources in the area. Spatial data along a river system or catchment area usually have missing measurements at some locations, hence an accurate prediction model to fill these missing values is essential.
The selected sites for water quality data prediction were the Sewera, Numania, Kut u/s, Kut d/s, and Garaf observation sites. At these five sites, models were built for prediction of the water level and water quality parameters.
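A model of the kind described above can be sketched with a small multilayer-perceptron regressor. The feature names, synthetic data, and network size below are hypothetical placeholders, not the study's dataset or architecture.

```python
# Illustrative ANN regression sketch: predicting one water-quality parameter
# from other measurements at a site. All data here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 300
# Hypothetical inputs: water level (m), temperature (°C), discharge (m^3/s)
X = np.column_stack([
    rng.normal(18.0, 1.5, n),
    rng.normal(24.0, 4.0, n),
    rng.normal(450.0, 60.0, n),
])
# Hypothetical target: total dissolved solids (mg/L), a noisy function of the inputs
y = 900 - 12 * X[:, 0] + 6 * X[:, 1] - 0.4 * X[:, 2] + rng.normal(0, 15, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=3000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

pred = model.predict(scaler.transform(X_test))
print("R^2 on held-out data:", round(r2_score(y_test, pred), 3))
```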
Reservoir characterization is an important component of hydrocarbon exploration and production, requiring the integration of different disciplines for accurate subsurface modeling. This comprehensive research paper delves into the complex interplay of rock properties, rock classification techniques, and geological modeling techniques for improving reservoir characterization. The research emphasizes petrophysical factors such as porosity, shale volume, water content, and permeability as important indicators of reservoir properties, fluid behavior, and hydrocarbon potential. It examines various rock cataloging techniques, focusing on rock clustering methods and self-organizing maps (SOMs) to identify specific and
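The SOM-based grouping mentioned above can be illustrated with a minimal clustering sketch. It assumes the third-party `minisom` package and synthetic log values; the paper's actual inputs, grid size, and workflow are not specified here.

```python
# Sketch: grouping samples by petrophysical logs (porosity, shale volume,
# water saturation, log-permeability) into rock classes with a self-organizing map.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(7)
# Synthetic log samples: [porosity (frac), Vshale (frac), Sw (frac), log10 perm (mD)]
good  = rng.normal([0.22, 0.10, 0.25,  2.0], [0.03, 0.04, 0.05, 0.3], (100, 4))
tight = rng.normal([0.08, 0.45, 0.70, -0.5], [0.02, 0.08, 0.08, 0.4], (100, 4))
data = np.vstack([good, tight])
data = (data - data.mean(axis=0)) / data.std(axis=0)   # normalize each log

som = MiniSom(4, 4, input_len=4, sigma=1.0, learning_rate=0.5, random_seed=7)
som.random_weights_init(data)
som.train_random(data, num_iteration=2000)

# Each sample is assigned to its best-matching unit; units act as rock-type bins.
labels = [som.winner(x) for x in data]
units, counts = np.unique(labels, axis=0, return_counts=True)
for u, c in zip(units, counts):
    print(f"SOM unit {tuple(u)}: {c} samples")
```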
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity/matching between them. Several research studies have utilized different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet, these approaches still suffer from imprecise articulation of the biometrics' distinctive patterns. Deep learning architectures such as the Convolutional Neural Network (CNN) have been extensively used for image processing and object detection tasks and have shown outstanding performance compare
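The pairwise template comparison described above can be sketched as a Siamese-style CNN that embeds both fingerprint images and scores their similarity. PyTorch is assumed, and the architecture, image size, and threshold are illustrative choices, not the paper's method.

```python
# Minimal Siamese-style CNN sketch for fingerprint pair matching (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FingerprintEncoder(nn.Module):
    """Small CNN that maps a grayscale fingerprint image to an embedding vector."""
    def __init__(self, embed_dim: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return F.normalize(self.fc(h), dim=1)   # unit-length embeddings

def match_score(encoder: nn.Module, img_a: torch.Tensor, img_b: torch.Tensor) -> torch.Tensor:
    """Cosine similarity between two fingerprint embeddings (1.0 = identical)."""
    return F.cosine_similarity(encoder(img_a), encoder(img_b))

# Usage with random stand-in images (one 96x96 grayscale fingerprint pair).
encoder = FingerprintEncoder()
a, b = torch.rand(1, 1, 96, 96), torch.rand(1, 1, 96, 96)
print(match_score(encoder, a, b).item())   # threshold this score to decide a match
```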
The objective of this research paper is two-fold. The first is a precise reading of the theoretical underpinnings of each of the strategic approaches: the "market approach" of M. Porter and the alternative resource-based view (RBV), advocating the idea that the two approaches are complementary. Second, we discuss the possibility of combining the two competitive strategies: cost leadership and differentiation. Finally, we propose a consensual approach that we call "dual domination".
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have only small or inadequate datasets for training DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework usually needs to be fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data generates a better DL model, and performance is also application dependent. This issue is the main barrier for