Objective: To develop a standardized method for cavity preparation on the palatal surface of rat maxillary molars, and to introduce a standardized method for correct tooth alignment within the specimen during wax embedding, so that the cavity position can be detected more easily in the examined slides. Materials and methods: Six male Wistar rats, aged 4-6 weeks, were used. The maxillary molars of three animals were sectioned in the frontal plane to determine the thickness of hard tissue on the palatal surface of the first molar, which measured 250-300 µm. An end-cutting bur with a cutting-head diameter of 0.2 mm was suitable for preparing a dentinal cavity 70-80 µm deep. Cavity preparation was then performed with the same bur on the tooth surface in the other three animals. The rats were then euthanized before the teeth were dissected, fixed, and demineralized. For better alignment of the tooth samples during the waxing procedure, a size #8 K-file endodontic instrument was dipped in India ink, and its tip was inserted into the jaw bone on the buccal side of the tooth, opposite the prepared cavity on the palatal side. In addition, a small Dycal applicator was used to mark the jaw bone on the mesial side of the tooth samples as an orientation mark for the cutting surface. Results: Well-defined sections were obtained, with a clear cavity extension within dentin and no signs of pulp exposure in any sample. Conclusion: This pilot study established an easy procedure for cavity preparation in rat molar teeth that yields clear histopathological sections.
Phenomena often suffer from disturbances in their data as well as difficulty of formulation, especially when the response is unclear or when there are large essential differences among the experimental units from which the data were taken. Hence the need arose for an implicit rating (classification) of these experimental units, either by discriminant methods or by creating blocks for each category of units, in the hope of controlling their responses and making them more homogeneous. With the development of computing, and following the principle of integrating the sciences, modern algorithms from computer science, such as the genetic algorithm and ant colony
One of the most important features of the Amazon Web Services (AWS) cloud is that a program can be run and accessed from any location; results can be accessed and monitored remotely, many images can be stored, and computation is faster. This work proposes a face-detection classification model based on the AWS cloud that aims to classify faces into two classes, a non-permission class and a permission class, by training on a real data set collected from our cameras. The proposed cloud-based Convolutional Neural Network (CNN) system shares computational resources for Artificial Neural Networks (ANN) to reduce redundant computation. The test system uses Internet of Things (IoT) services through our ca
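The abstract does not include the network architecture, so the following is only a minimal illustrative sketch of the kind of two-class CNN classifier described: one convolution layer, ReLU, max-pooling, and a dense softmax head, written in plain NumPy with untrained random weights. All sizes and parameters are assumptions for illustration, not the authors' AWS pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2-D convolution of a single-channel image with a bank of kernels."""
    kh, kw = kernels.shape[1:]
    h, w = img.shape
    out = np.empty((kernels.shape[0], h - kh + 1, w - kw + 1))
    for k, ker in enumerate(kernels):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def max_pool(x, size=2):
    c, h, w = x.shape
    h2, w2 = h // size, w // size
    return x[:, :h2 * size, :w2 * size].reshape(c, h2, size, w2, size).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(img, kernels, W, b):
    """conv -> ReLU -> max-pool -> flatten -> dense -> softmax."""
    feat = max_pool(np.maximum(conv2d(img, kernels), 0.0)).ravel()
    return softmax(W @ feat + b)  # probabilities for [non-permission, permission]

# Toy 16x16 "face" image and untrained random weights (illustration only).
img = rng.random((16, 16))
kernels = rng.standard_normal((4, 3, 3)) * 0.1
W = rng.standard_normal((2, 4 * 7 * 7)) * 0.1  # 4 maps of (16-3+1)//2 = 7
b = np.zeros(2)
probs = forward(img, kernels, W, b)
```

In a trained system the kernels and dense weights would be learned from the camera data set; here the forward pass only demonstrates the two-class output.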
There is confusion between the concepts of honesty and credibility, with some arguing that they mean the same thing. 'Credibility' is derived from truth, which implies evidence of honesty, while 'honesty' means not lying and matching reality. The study of credibility began globally at the end of the 1950s, in response to the decline in newspaper readership, while in the Arab world it was first studied in 1987. Global studies identify several meanings of 'credibility', such as accuracy, completeness, transfer of facts, impartiality, balance, justice, objectivity, trust, honesty, respect for the freedom of individuals and the community, and consideration of traditions and norms.
Credibility has two dimensions
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), one of whose assumptions is homogeneity of variance. The dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and ridge regression were used to estimate the binary-response logistic regression model by adopting the Jackknife
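As a hedged illustration of the estimation approach the abstract describes (ridge-penalized likelihood for a binary logistic model combined with Jackknife resampling), here is a minimal NumPy sketch on synthetic data; the penalty value, learning rate, and data-generating coefficients are arbitrary assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_ridge_logistic(X, y, lam=1.0, lr=0.1, n_iter=1500):
    """Ridge-penalized (L2) logistic regression fitted by gradient descent
    on the negative log-likelihood (a stand-in for penalized maximum likelihood)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        p_hat = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (p_hat - y) / n + lam * beta
        beta -= lr * grad
    return beta

def jackknife(X, y, lam=1.0):
    """Leave-one-out (Jackknife) mean and standard error of the coefficients."""
    n = len(y)
    betas = np.array([
        fit_ridge_logistic(np.delete(X, i, axis=0), np.delete(y, i), lam)
        for i in range(n)
    ])
    mean = betas.mean(axis=0)
    se = np.sqrt((n - 1) / n * ((betas - mean) ** 2).sum(axis=0))
    return mean, se

# Synthetic binary-response data (e.g. injured = 1 / uninjured = 0).
n = 40
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
true_beta = np.array([0.3, 1.2, -0.8])  # invented for the demo
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)

beta_hat = fit_ridge_logistic(X, y)
beta_jk, se_jk = jackknife(X, y)
```

The ridge penalty `lam` shrinks the coefficients, which is how the method mitigates the multicollinearity the abstract mentions, at the cost of some bias; the Jackknife then provides resampling-based standard errors.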
In the present paper, an eco-epidemiological model consisting of diseased prey consumed by a predator with fear cost and hunting-cooperation behavior is formulated and studied. It is assumed that the predator does not distinguish between healthy and diseased prey and hence consumes both. The solution's properties, such as existence, uniqueness, positivity, and boundedness, are discussed. The existence and stability conditions of all possible equilibrium points are studied. The persistence requirements of the proposed system are established. The bifurcation analysis near the non-hyperbolic equilibrium points is investigated. Numerically, some simulations are carried out to validate the main findings and obtain the critical values of the
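The abstract does not state the model equations, so the following is only a hypothetical sketch of an eco-epidemiological system with the named ingredients: susceptible prey S, infected prey I, and a predator P, with a fear factor 1/(1 + kP) reducing prey reproduction and a hunting-cooperation term (a + cP) raising the attack rate. Every equation and parameter value below is invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters -- invented for illustration.
r, K = 1.0, 10.0          # prey birth rate and carrying capacity
beta_i = 0.4              # disease transmission among prey
k = 0.5                   # fear cost: births scaled by 1/(1 + k*P)
a, c = 0.3, 0.1           # base attack rate and hunting-cooperation strength
e, d, m = 0.5, 0.2, 0.3   # conversion efficiency, predator death, disease death

def rhs(t, u):
    S, I, P = u
    attack = a + c * P  # cooperative hunting: attack rate grows with predator density
    dS = r * S * (1 - (S + I) / K) / (1 + k * P) - beta_i * S * I - attack * S * P
    dI = beta_i * S * I - attack * I * P - m * I
    dP = e * attack * (S + I) * P - d * P  # predator consumes both healthy and sick prey
    return [dS, dI, dP]

sol = solve_ivp(rhs, (0.0, 50.0), [5.0, 1.0, 1.0], max_step=0.1)
S_end, I_end, P_end = sol.y[:, -1]
```

Sweeping a parameter such as `k` or `c` over a grid and re-running the solver is the numerical route to the kind of critical (bifurcation) values the abstract refers to.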
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including money, equipment, and manpower, which may become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study considers dealing with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing the total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the finance of activities at any time w
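As a hedged sketch of the idea (not the paper's actual model), the following toy genetic algorithm schedules a few single-activity "projects" under a shared credit limit, maximizing profit minus makespan. Every instance value, operator choice, and parameter is an assumption made for the demonstration.

```python
import random

random.seed(42)

# Hypothetical mini-instance: each "project" is one activity with a duration,
# an up-front cost, and a payment received at completion; projects share one
# credit line.  All numbers are invented for the sketch.
projects = [  # (duration, cost, payment)
    (3, 40, 60), (2, 30, 45), (4, 50, 80), (2, 20, 35),
]
CREDIT_LIMIT = 70
HORIZON = 12  # latest allowed start time

def cash_feasible(starts):
    """True if outstanding spending never exceeds the credit line."""
    events = []
    for (dur, cost, pay), s in zip(projects, starts):
        events.append((s, -cost))        # cost paid at the start
        events.append((s + dur, pay))    # payment received at the finish
    bal = 0
    for _, delta in sorted(events):      # same-time costs settle before payments
        bal += delta
        if bal < -CREDIT_LIMIT:
            return False
    return True

def fitness(starts):
    """Profit minus makespan; infeasible schedules get a huge penalty."""
    if not cash_feasible(starts):
        return -1e9
    makespan = max(s + dur for (dur, _, _), s in zip(projects, starts))
    profit = sum(pay - cost for _, cost, pay in projects)
    return profit - makespan

def ga(pop_size=40, gens=60, mut=0.2):
    pop = [[random.randint(0, HORIZON) for _ in projects] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, len(projects))
            child = p1[:cut] + p2[cut:]           # one-point crossover
            if random.random() < mut:             # random-reset mutation
                child[random.randrange(len(projects))] = random.randint(0, HORIZON)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
```

Staggering the start times is what keeps the outstanding balance within the credit line here, which mirrors the finance-based scheduling idea: feasibility is defined by cash flow, not only by resource availability.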