Background: techniques of image analysis have been used extensively to minimize interobserver variation in immunohistochemical scoring, yet image acquisition procedures are often demanding, expensive and laborious. This study aims to assess the validity of image analysis in predicting human observers' scores with a simplified image acquisition technique. Materials and methods: formalin-fixed, paraffin-embedded tissue sections of ameloblastomas and basal cell carcinomas were immunohistochemically stained with monoclonal antibodies to MMP-2 and MMP-9. The extent of antibody positivity was quantified using an ImageJ®-based application on low-power photomicrographs obtained with a conventional camera. The software output was used to predict human visual scoring results by stepwise multiple regression analysis. Results: the overall prediction of the epithelial score, expressed as an r-square value, was 0.26 (p<0.001), markedly higher than that of the stromal score (0.10; p<0.01). Epithelial and stromal MMP-2 score prediction was generally higher than that of MMP-9. Collectively, ameloblastomas showed more efficient score prediction than basal cell carcinomas. Conclusion: there is considerable variability in the predictive capacity of the technique with respect to different antibodies, different tumors, and cellular versus stromal scores.
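The prediction step can be illustrated with a minimal least-squares sketch. The study used stepwise multiple regression on several predictors; this sketch, with invented positivity fractions and visual scores and a single predictor, only shows how an r-square value arises from such a fit:

```python
# Minimal illustration of predicting observer scores from software output.
# The positivity fractions and visual scores below are invented; the study
# itself used stepwise multiple regression, not this single-predictor fit.

def ols_r_square(x, y):
    """Fit y = a + b*x by ordinary least squares; return (a, b, r_square)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Invented data: ImageJ positivity fraction vs. visual score (0-3 scale)
fraction = [0.05, 0.20, 0.35, 0.50, 0.70, 0.90]
score = [0, 1, 1, 2, 2, 3]
a, b, r2 = ols_r_square(fraction, score)
```

An r-square near 1 would mean the software output almost fully explains the visual score; the modest values reported in the abstract (0.26 epithelial, 0.10 stromal) indicate much weaker agreement.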
Correlation equations expressing the boiling temperature as a direct function of liquid composition have been tested successfully and applied to predict the azeotropic behavior of multicomponent mixtures and the kind of azeotrope (minimum, maximum or saddle type) using a modified correlation based on the Gibbs-Konovalov theorem. The binary and ternary azeotropic points have also been located experimentally by graphical determination from experimental binary and ternary vapor-liquid equilibrium data.
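The detection idea can be sketched numerically: by the Gibbs-Konovalov theorem, a binary azeotrope lies where the boiling-temperature correlation T(x1) is stationary, and a minimum of T(x1) corresponds to a minimum-boiling azeotrope. The polynomial coefficients below are invented for illustration and are not the paper's fitted correlation:

```python
# Sketch of azeotrope detection from a boiling-temperature correlation T(x1)
# for a binary mixture. An azeotrope sits where dT/dx1 = 0; an interior
# minimum of T(x1) marks a minimum-boiling azeotrope. Coefficients invented.

def find_stationary(T, lo=0.0, hi=1.0, n=10000):
    """Locate the interior minimum of T on [lo, hi] by a simple grid scan."""
    best_x, best_T = None, None
    step = (hi - lo) / n
    for i in range(1, n):
        x = lo + i * step
        t = T(x)
        if best_T is None or t < best_T:
            best_x, best_T = x, t
    kind = "minimum-boiling" if best_T < min(T(lo), T(hi)) else "none"
    return best_x, best_T, kind

# Invented correlation: T/K = 371.6 - 30*x1 + 25*x1**2 (stationary at x1 = 0.6)
T = lambda x1: 371.6 - 30.0 * x1 + 25.0 * x1 ** 2
x_az, T_az, kind = find_stationary(T)
```

For a ternary system the same stationarity condition applies in both independent composition variables, and a saddle point of the temperature surface gives the saddle-type azeotrope mentioned above.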
In this study, isobaric vapor-liquid equilibrium for two ternary systems: “1-Propanol – Hexane – Benzene” and its binaries “1-Propanol –
Machine learning offers significant advantages for many problems in the oil and gas industry, especially in resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. The workflow methodology is clarified and comprehensive models are presented in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies of the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they present are outdated and do not adequately address the real difficulty of the permeability computation. To
Abstract
In this paper, fatigue damage accumulation was studied using several methods: the Corten-Dolan (CD) and Corten-Dolan-Marsh (CDM) models, a new non-linear model, and experiment. The fatigue-lifetime predictions of the two classical methods, Corten-Dolan (CD) and Corten-Dolan-Marsh (CDM), are uneconomic and non-conservative, respectively. Satisfactory predictions were, however, obtained by applying the proposed non-linear model (present model) to medium carbon steel, as compared with experimental work. Many shortcomings of the two classical methods stem from their inability to take into account surface-treatment effects such as shot peening. It is clear that the new model shows that a much better and cons
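For reference, the standard Corten-Dolan damage-accumulation form predicts total life under multi-level loading from the life N1 at the highest stress level s1, the cycle fraction at each level, and a material exponent d. The numeric values below are invented and are not the paper's data:

```python
# Standard Corten-Dolan life estimate for multi-level loading:
#   N = N1 / sum_i alpha_i * (s_i / s1)**d
# N1: life at the highest stress s1; alpha_i: fraction of cycles at level i;
# d: material exponent. All numbers below are invented for illustration.

def corten_dolan_life(N1, s1, levels, d):
    """levels: list of (stress, cycle_fraction). Returns predicted total life."""
    denom = sum(alpha * (s / s1) ** d for s, alpha in levels)
    return N1 / denom

life = corten_dolan_life(
    N1=1.0e5, s1=400.0,
    levels=[(400.0, 0.3), (300.0, 0.7)],   # MPa, fraction of cycles
    d=6.0,
)
```

Because the exponent d is fitted to the material and its surface condition, effects such as shot peening that the classical models ignore must enter through separately calibrated constants, which is the gap the proposed non-linear model addresses.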
Increasing material prices coupled with the emission of hazardous gases during the production and construction of Hot Mix Asphalt (HMA) have driven a strong movement toward the adoption of sustainable construction technology. Warm Mix Asphalt (WMA) is a relatively new technology that enables the production and compaction of asphalt concrete mixtures at temperatures 15-40 °C lower than those of traditional hot mix asphalt. The resilient modulus (Mr), defined as the ratio of the axial pulsating stress to the corresponding recoverable strain, is used to evaluate the relative quality of materials as well as to generate input for pavement design, evaluation and analysis. Based on the aforementioned preface, it is
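The resilient modulus definition given above is a direct ratio, which can be stated as a one-line computation; the stress and strain values here are invented for illustration:

```python
# Resilient modulus as defined in the text: the ratio of the axial pulsating
# (deviator) stress to the corresponding recoverable strain. Values invented.

def resilient_modulus(pulsating_stress_kpa, recoverable_strain):
    """Mr = sigma_d / epsilon_r; units follow the stress input (kPa here)."""
    return pulsating_stress_kpa / recoverable_strain

# 200 kPa pulse with 1e-4 recoverable strain -> Mr of 2e6 kPa (2 GPa)
Mr = resilient_modulus(pulsating_stress_kpa=200.0, recoverable_strain=1.0e-4)
```

Since strain is dimensionless, Mr carries the units of the applied stress, so consistent units matter when Mr feeds pavement design software.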
Project suspensions are among the most pressing problems confronting the construction field, owing to the sector's complexity and the interdependence of its essential delay-risk sources. Machine learning provides a suitable set of techniques for attacking such complex systems. The study aimed to identify delay sources and to develop a well-organized predictive data tool that examines and learns from them based on preceding data of construction projects, using decision trees and naïve Bayesian classification algorithms. An intensive review of available data was conducted to explore the real reasons and causes of construction project delays. The results show that the postpo
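The naïve Bayesian part of the approach can be sketched in a few lines: each project is a tuple of categorical risk factors, and the classifier scores each label by the product of Laplace-smoothed conditional probabilities. The feature names and training records below are invented, standing in for the documented delay causes a real model would be trained on:

```python
# Tiny categorical naive-Bayes sketch of the delay-classification idea.
# Feature names and training records are invented for illustration.
from collections import Counter, defaultdict

def train_nb(records):
    """records: list of (features_tuple, label). Returns a predict function."""
    n = len(records)
    label_counts = Counter(lbl for _, lbl in records)
    value_counts = defaultdict(Counter)     # (position, label) -> value counts
    values_at = defaultdict(set)            # position -> set of seen values
    for feats, lbl in records:
        for i, v in enumerate(feats):
            value_counts[(i, lbl)][v] += 1
            values_at[i].add(v)

    def predict(feats):
        best, best_p = None, -1.0
        for lbl, c in label_counts.items():
            p = c / n                       # prior
            for i, v in enumerate(feats):
                # Laplace-smoothed conditional probability P(value | label)
                p *= (value_counts[(i, lbl)][v] + 1) / (c + len(values_at[i]))
            if p > best_p:
                best, best_p = lbl, p
        return best

    return predict

data = [
    (("late-payments", "design-change"), "Delayed"),
    (("late-payments", "no-change"), "Delayed"),
    (("on-schedule-payments", "no-change"), "On-time"),
    (("on-schedule-payments", "design-change"), "Delayed"),
    (("on-schedule-payments", "no-change"), "On-time"),
]
predict = train_nb(data)
```

A decision tree trained on the same records would instead split on the single most informative factor first, which is why the two algorithms are often compared on delay data.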
Lost circulation, or the loss of drilling fluid, is one of the most important problems in the oil and gas industry; it has existed since the beginning of the industry and causes many difficulties during drilling, which may lead to closing the well and stopping the drilling process. Drilling muds are relatively expensive, especially oil-based muds and muds containing special additives, so wasting and losing them is not economically acceptable. Treating drilling fluid losses is also somewhat expensive, both because of the time wasted and because of the high cost of the materials used in treatment, such as heavy materials, cement, and others. The best way to deal with drilling fluid losses
Abstract Additive manufacturing has recently emerged as an adaptable production process that could fundamentally affect traditional manufacturing in the future. Owing to its manufacturing strategy, selective laser melting (SLM) is suitable for complicated configurations. The goal of this work is to investigate the potential effects of scanning speed and laser power on the porosity, corrosion resistance and hardness of AISI 316L stainless steel produced by SLM. When compared to rolled stainless steel, the improvement is noticeable. Optical microscopy (OM), scanning electron microscopy (SEM) and EDX were utilized to examine the microstructure of the samples. Hardness and tensile strength were us
The current study uses the flame fragment deposition (FFD) method to synthesize carbon nanotubes (CNTs) from Iraqi liquefied petroleum gas (LPG), used as the carbon source. A homemade reactor was used to carry out the synthesis steps. To eliminate amorphous impurities, the CNTs were sonicated in a 30 percent hydrogen peroxide (H2O2) solution at ambient temperature, and sonication in an acetone bath was used to remove the polycyclic aromatic hydrocarbons (PAHs) generated during LPG combustion. The products were characterized and compared with standard multi-walled carbon nanotubes (MWCNTs, 95%, Sigma-Aldrich) using X-ray diffraction (XRD), thermogravimetric analysis (TGA), Raman spectroscopy, scanning el
A simple and rapid spectrophotometric method for the determination of sulphite (SO₃²⁻) is described. The method is based on the rapid reduction of a known amount of chromate (CrO₄²⁻) by sulphite in an acidic medium of 2 N H2SO4. The excess chromate is measured after its reaction with 1,5-diphenylcarbazide, which gives a pink-violet, water-soluble and stable complex exhibiting maximum absorption at 542 nm. Beer's law was obeyed over the range 0.004-6.0 µg of sulphite in a final volume of 25 ml, with a molar absorptivity of 4.64×10⁴ L·mol⁻¹·cm⁻¹, a Sandell's sensitivity index of 0.001724 µg·cm⁻² and a relative standard deviation of ±0.55 to ±0.83 depending on the concentration level. The present
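A quick Beer's-law check shows how the reported molar absorptivity converts a measured absorbance into a sulphite mass. The absorbance value and the assumed 1 cm cell below are invented for illustration; only the molar absorptivity comes from the abstract:

```python
# Worked Beer's-law conversion using the molar absorptivity reported above.
# A = epsilon * c * l  ->  c = A / (epsilon * l). Absorbance and cell path
# are invented; epsilon is the value quoted in the abstract.

EPSILON = 4.64e4   # L mol^-1 cm^-1, from the abstract
PATH_CM = 1.0      # assumed standard 1 cm cuvette

def molar_conc(absorbance):
    """Concentration in mol/L from absorbance via Beer's law."""
    return absorbance / (EPSILON * PATH_CM)

c = molar_conc(0.116)                    # -> 2.5e-6 mol/L
# Mass of sulphite in the 25 ml final volume (M(SO3^2-) ~ 80.06 g/mol)
ug_per_25ml = c * 80.06 * 1e6 * 0.025    # ~5.0 ug, inside the 0.004-6.0 ug range
```

The high molar absorptivity is what makes microgram-level sulphite measurable in a 25 ml final volume.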
A heart sound is an electrical signal affected by several factors during the recording process, which add unwanted information to the signal. Many recent studies have addressed noise removal and signal recovery problems; noise removal is the first step in signal processing, and many filters have been used and proposed for this task. Here, a Hankel matrix is constructed from a given signal, and the signal is cleaned by removing unwanted information from that matrix. The first step is to detect the unwanted information by defining a binary operator under some threshold; the unwanted information is replaced by zero, while the wanted information is kept in the estimated matrix. The resulting matrix
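The Hankel-based cleaning step described above can be sketched as: embed the signal in a Hankel matrix, zero the entries whose magnitude falls below a threshold (the binary operator), and recover a signal by averaging the anti-diagonals. The threshold value and test signal below are invented; the paper's actual operator and threshold choice may differ:

```python
# Sketch of Hankel-matrix signal cleaning: embed, threshold, and reconstruct
# by anti-diagonal averaging. Threshold and test signal are invented.

def hankel(signal, rows):
    """Hankel embedding: H[i][j] = signal[i + j]."""
    cols = len(signal) - rows + 1
    return [[signal[i + j] for j in range(cols)] for i in range(rows)]

def threshold_zero(H, tau):
    """Binary operator: keep entries with |v| >= tau, zero the rest."""
    return [[v if abs(v) >= tau else 0.0 for v in row] for row in H]

def dehankel(H):
    """Recover a signal by averaging each anti-diagonal of H."""
    rows, cols = len(H), len(H[0])
    out = []
    for k in range(rows + cols - 1):
        vals = [H[i][k - i] for i in range(rows) if 0 <= k - i < cols]
        out.append(sum(vals) / len(vals))
    return out

sig = [0.02, 1.0, -0.9, 0.03, 0.8, -0.01]        # small entries mimic noise
clean = dehankel(threshold_zero(hankel(sig, rows=3), tau=0.1))
```

In practice this family of methods usually thresholds singular values of the Hankel matrix rather than raw entries; the entry-wise version above only illustrates the embed-modify-reconstruct pattern.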