Brachytherapy is used primarily to treat certain kinds of cancerous tumors. Radionuclides have been used in tumor treatment for a very long time, but the introduction of mathematical and radiobiological models has made treatment planning far more tractable. Such models are used to compute the survival probabilities of irradiated tissues and cancer cells. With the growing use of high dose rate (HDR) and low dose rate (LDR) brachytherapy for cancer treatment, fractionated dose treatment plans are required to irradiate the tumor. In this paper, the authors discuss the dose calculation algorithms used in brachytherapy treatment planning. Accurate and fast computation of the 3D dose distribution for the patient is one of the key requirements of modern radiation oncology, and this in turn requires accurate algorithms in the treatment planning system (TPS). The algorithms currently used for dose calculation have certain limitations. This work evaluates the correctness of the algorithms presently employed for treatment planning, including pencil beam convolution (PBC), superposition (SP), the anisotropic analytical algorithm (AAA), Monte Carlo (MC), the Clarkson method, the fast Fourier transform, and the convolution method. The algorithms used in radiotherapy treatment planning are categorized as correction-based and model-based.
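The survival-probability models mentioned above are commonly expressed through the linear-quadratic (LQ) formalism. The following is a minimal illustrative sketch of that model under a fractionated schedule, not the specific model the paper evaluates; the alpha, beta, and dose values are chosen purely as examples.

```python
import math

def lq_survival_fraction(dose_gy: float, alpha: float, beta: float) -> float:
    """Linear-quadratic cell survival for a single dose: S = exp(-(alpha*D + beta*D^2))."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def fractionated_survival(dose_per_fraction: float, n_fractions: int,
                          alpha: float, beta: float) -> float:
    """Survival after n equal fractions, assuming full repair between fractions."""
    return lq_survival_fraction(dose_per_fraction, alpha, beta) ** n_fractions

# Hypothetical example values: alpha = 0.3 Gy^-1, beta = 0.03 Gy^-2,
# 2 Gy per fraction delivered in 10 fractions.
if __name__ == "__main__":
    s = fractionated_survival(dose_per_fraction=2.0, n_fractions=10, alpha=0.3, beta=0.03)
    print(f"Surviving fraction after 10 x 2 Gy: {s:.3e}")
```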
Document clustering is the process of organizing an electronic corpus of documents into subgroups with similar text features. Traditionally, a number of conventional algorithms have been applied to perform document clustering, and there are ongoing efforts to enhance clustering performance by employing evolutionary algorithms, an emerging topic that has gained increasing attention in recent years. The aim of this paper is to present an up-to-date, self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts, and then presents and analyzes the principal research works in the field.
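As a rough illustration of what document clustering via an evolutionary algorithm can look like, the sketch below evolves candidate cluster centroids over a TF-IDF representation of a tiny invented corpus. It is an assumed toy formulation (truncation selection with Gaussian mutation and a k-means-style fitness), not a method taken from the reviewed literature.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "stock market trading prices", "market shares and stock prices",
    "football match final score", "score of the football league match",
]
X = TfidfVectorizer().fit_transform(docs).toarray()
k, pop_size, n_gen = 2, 20, 50
rng = np.random.default_rng(0)

def fitness(centroids):
    # Negative sum of distances from each document to its nearest centroid.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return -d.min(axis=1).sum()

# Each individual is a set of k centroids, initialized from randomly chosen documents.
population = [X[rng.choice(len(docs), k, replace=False)] for _ in range(pop_size)]

for _ in range(n_gen):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[: pop_size // 2]                               # truncation selection
    children = [p + rng.normal(0, 0.05, p.shape) for p in parents]  # Gaussian mutation
    population = parents + children

best = max(population, key=fitness)
labels = np.linalg.norm(X[:, None, :] - best[None, :, :], axis=2).argmin(axis=1)
print("cluster labels:", labels)
```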
The growing use of digital technologies across various sectors and in daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still need to convert handwritten copies into digital versions that can be stored and shared. Handwriting recognition refers to a computer's ability to identify and understand legible handwritten input from various sources, including documents, photographs, and others. Handwriting recognition poses a complex challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition.
Genetic algorithms (GA) are a helpful instrument for planning and controlling the activities of a project. They are based on the principles of natural selection and survival of the fittest. GA has been used in different sectors of construction and building, but this use is rarely documented. This research aimed to examine the utilisation of genetic algorithms in construction project management. For this purpose, the research focused on the benefits and challenges of genetic algorithms and the extent to which they are utilised in construction project management. Results showed that GA can generate near-optimal solutions, which can be adopted to reduce complexity in project management and resolve difficult problems.
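To make the selection, crossover, and mutation loop concrete, here is a minimal assumed example of a GA applied to a toy time-cost trade-off: each activity has a few hypothetical crew options, and the fitness penalizes exceeding a deadline. It illustrates only the mechanics, not the specific formulation examined in the research.

```python
import random

# Hypothetical activities: each option is (duration_days, cost).
ACTIVITIES = [
    [(10, 5000), (7, 8000), (5, 12000)],
    [(8, 4000), (6, 6500)],
    [(12, 7000), (9, 9500), (6, 14000)],
]
DEADLINE = 24           # days; assumed project deadline
PENALTY_PER_DAY = 3000  # assumed cost penalty for each day past the deadline

def fitness(chromosome):
    # Chromosome = one option index per activity; lower total (penalized) cost is better.
    duration = sum(ACTIVITIES[i][g][0] for i, g in enumerate(chromosome))
    cost = sum(ACTIVITIES[i][g][1] for i, g in enumerate(chromosome))
    cost += max(0, duration - DEADLINE) * PENALTY_PER_DAY
    return -cost

def random_chromosome():
    return [random.randrange(len(opts)) for opts in ACTIVITIES]

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(c, rate=0.1):
    return [random.randrange(len(ACTIVITIES[i])) if random.random() < rate else g
            for i, g in enumerate(c)]

population = [random_chromosome() for _ in range(30)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                       # survival of the fittest
    children = [mutate(crossover(*random.sample(survivors, 2))) for _ in range(20)]
    population = survivors + children

best = max(population, key=fitness)
print("best options:", best, "total cost:", -fitness(best))
```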
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes finding topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter has many features that capture the interaction between users, and tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned.
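One common way to exploit hashtags with LDA is to pool tweets that share a hashtag into a single pseudo-document before fitting the model. The sketch below assumes that pooling strategy (the paper's exact approach is not spelled out in this excerpt) and uses scikit-learn's LDA implementation on a tiny invented sample.

```python
from collections import defaultdict
import re

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "great goal tonight #football", "what a save by the keeper #football",
    "new phone camera is amazing #tech", "battery life review #tech #phone",
]

# Pool tweets sharing a hashtag into one pseudo-document each.
pools = defaultdict(list)
for t in tweets:
    for tag in re.findall(r"#(\w+)", t):
        pools[tag].append(re.sub(r"#\w+", "", t).strip())
pseudo_docs = [" ".join(v) for v in pools.values()]

# Fit LDA on the hashtag-pooled pseudo-documents.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(pseudo_docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-3:][::-1]]
    print(f"topic {i}: {top}")
```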
New algorithms for enhancing the quality of auto-focus image fusion are proposed. The first algorithm combines two images based on their standard deviation. The second algorithm uses the contrast at edge points together with a correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves them 10 pixels within the same region; the statistical properties of each block are examined and the next step is decided automatically. The resulting combined image is better in contrast.
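A minimal sketch of standard-deviation-based focus fusion, assuming a simple block-wise rule: for each block, keep the source image whose block has the higher standard deviation, i.e., more in-focus detail. The block size and the random inputs are illustrative, not the paper's exact parameters.

```python
import numpy as np

def fuse_by_std(img_a: np.ndarray, img_b: np.ndarray, block: int = 16) -> np.ndarray:
    """Block-wise fusion: for each block, keep the block with the larger standard deviation."""
    assert img_a.shape == img_b.shape, "inputs must be the same size"
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            fused[y:y + block, x:x + block] = a if a.std() >= b.std() else b
    return fused

# Toy usage with random grayscale arrays standing in for two focus captures.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    left_focus = rng.random((128, 128))
    right_focus = rng.random((128, 128))
    result = fuse_by_std(left_focus, right_focus)
    print(result.shape)
```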
The analysis of semi-parametric models is one of the most interesting subjects in recent studies because it yields efficient model estimates. The problem considered here is one in which the response variable takes one of two values, either 0 (no response) or 1 (response), which leads to the logistic regression model.
We compare two estimation methods, the Bayesian method and . The results were then compared using the MSE criterion.
A simulation study was used to examine the empirical behavior of the logistic model with different sample sizes and variances. The results indicate that the Bayesian method is better than the other method at small sample sizes.
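The excerpt does not name the second estimation method, so the sketch below only illustrates the simulation setup described: generating binary responses from a logistic model at several sample sizes and scoring a fitted estimator by the MSE of its coefficients. An ordinary maximum-likelihood-style fit via scikit-learn stands in here as a placeholder estimator, and the true coefficients are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
true_beta = np.array([1.5, -2.0])   # hypothetical true coefficients

def simulate_mse(n: int, reps: int = 200) -> float:
    """Average MSE of estimated coefficients over repeated simulated data sets of size n."""
    errors = []
    for _ in range(reps):
        X = rng.normal(size=(n, 2))
        p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))   # logistic link
        y = rng.binomial(1, p)
        # Large C makes the default L2 penalty negligible, approximating a plain MLE fit.
        beta_hat = LogisticRegression(C=1e6).fit(X, y).coef_.ravel()
        errors.append(np.mean((beta_hat - true_beta) ** 2))
    return float(np.mean(errors))

for n in (25, 50, 200):
    print(f"n={n:4d}  coefficient MSE approx. {simulate_mse(n):.4f}")
```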
RESRAD is a computer model designed to estimate risks and radiation doses from residual radioactive materials in soil. Thirty-seven soil samples were collected from the area around the berms of the Al-Tuwaitha site, and two background samples were taken from an area about 3 km north of the site. The samples were measured with a gamma-ray spectrometry system using a high-purity germanium (HPGe) detector. The measurements showed three areas contaminated with 238U and 235U in the study area. Two scenarios were applied to each contaminated area to estimate the dose using the RESRAD (onsite) version 7.0 code. The total doses for the resident farmer scenario in areas A, B, and C are 0.854, 0.033, and 2.15×10⁻³ mSv·yr⁻¹, respectively.
The current research aims to train students to benefit from their studies in analyzing and appreciating artistic works, one of the most important components of the academic program for students specializing in visual arts, and then to activate this while training them in teaching methods. To this end, mind maps were employed as a tool, freeing each student to analyze a model artwork and reflect on his own analytical principles according to what he knows. A new stage then revolves around the possibility of transforming this analysis into a teaching style by thinking about how the student would proceed: the same person who undertook the technical analysis should present this work.
The electrochemical corrosion performance of hydroxyapatite (HAP) coatings depends on parameters such as applied potential, time, thickness, and sintering temperature. The optimum parameters required for developing stable HAP coatings were therefore determined using the electrophoretic deposition (EPD) technique. This study discusses the results obtained from open-circuit potential-time (OCP-time) measurements, potentiodynamic polarisation, and immersion tests for all alloy samples under varying experimental conditions, so that the optimum coating parameters can be established. Ageing studies, carried out by immersing the coated samples in Ringer's solution for a period of 30 days, indicate the importance of stable HAP coatings.
