Brachytherapy is primarily used to treat certain kinds of cancerous tumors. Radionuclides have long been studied for tumor treatment, but the introduction of mathematical and radiobiological models has simplified treatment planning. Mathematical models help compute the survival probabilities of irradiated tissues and cancer cells. With the expanding use of high-dose-rate (HDR) and low-dose-rate (LDR) brachytherapy for cancer treatment, fractionated dose treatment plans are required to irradiate the tumor. In this paper, the authors discuss dose calculation algorithms used in brachytherapy treatment planning. Precise and fast calculation of the 3D dose distribution for each patient is one of the important requirements in modern radiation oncology, and it depends on accurate algorithms within the treatment planning system (TPS). The algorithms currently used for dose calculation have certain limitations. This work evaluates the accuracy of several algorithms presently employed for treatment planning, including pencil beam convolution (PBC), superposition (SP), the anisotropic analytical algorithm (AAA), Monte Carlo (MC), the Clarkson method, the fast Fourier transform (FFT), and the convolution method. Algorithms used in radiotherapy treatment planning are categorized as correction-based or model-based.
In high-dimensional semiparametric regression, balancing accuracy and interpretability often requires combining dimension reduction with variable selection. This study introduces two novel methods for dimension reduction in additive partial linear models: (i) minimum average variance estimation (MAVE) combined with the adaptive least absolute shrinkage and selection operator (MAVE-ALASSO) and (ii) MAVE with smoothly clipped absolute deviation (MAVE-SCAD). These methods leverage the flexibility of MAVE for sufficient dimension reduction while incorporating adaptive penalties to ensure sparse and interpretable models. The performance of both methods is evaluated through simulations using the mean squared error and variable selection criteria
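The SCAD penalty named above has a standard closed form (Fan and Li, 2001). As a hedged sketch of the penalty alone (the MAVE-SCAD estimator itself is not reproduced here), the function below evaluates it elementwise; the tuning value `lam` and the shape parameter `a = 3.7` are conventional illustrative choices, not values from this study:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """Elementwise SCAD penalty p_lam(|theta|) of Fan and Li (2001)."""
    t = np.abs(np.asarray(theta, dtype=float))
    small = t <= lam                      # linear (LASSO-like) zone
    mid = (t > lam) & (t <= a * lam)      # quadratic transition zone
    return np.where(small, lam * t,
           np.where(mid, (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
                    (a + 1) * lam**2 / 2))  # constant beyond a*lam

lam = 0.5
print(scad_penalty([0.0, lam, 3.7 * lam, 10.0], lam))
```

The penalty is continuous at `lam` and `a * lam` and flat beyond `a * lam`, which is what makes SCAD nearly unbiased for large coefficients while still shrinking small ones to zero.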
Research Summary:
Seeking happiness has been among the priorities of mankind since the beginning of creation and will remain so until the end of this world; even in the next life, man seeks happiness. The difference is that a person can work in this world to obtain it, whereas in the next life he receives what he has done in this world. Among these means are practical actions that a person undertakes with the intention of drawing close to God Almighty; they lead him to attain his desired perfection and his goals and objectives, namely the minimum of happiness in this life, and ultimate happiness after the soul separates from the body and on the day of judgment, Amon
The DC electrical conductivity of Ge60Se40−xTex alloys (x = 0, 5, 10, 15 and 20) was studied. The samples were formed as discs with a thickness of 0.25–0.30 cm and a diameter of 1.5 cm, pressed under a pressure of 6 tons per cm² using a hydraulic press. The samples were prepared by the melting technique. DC electrical conductivity was recorded from room temperature up to 475 K. The experimental data indicate that the glass containing 15% Te has the highest electrical conductivity, allowing the maximum current through the sample compared with the other samples. Therefore, it is found that the DC conductivity
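The abstract does not state the conduction model, but DC conductivity in chalcogenide glasses of this kind is commonly fitted with an Arrhenius relation, σ(T) = σ₀ exp(−Ea / kB·T). A minimal sketch of that relation, with illustrative (not measured) values of σ₀ and Ea:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_sigma(T, sigma0, Ea):
    """DC conductivity sigma(T) = sigma0 * exp(-Ea / (k_B * T)).

    T in kelvin, activation energy Ea in eV; sigma0 sets the units.
    """
    return sigma0 * math.exp(-Ea / (K_B * T))

# Illustrative parameters only, not the fitted values from this study:
for T in (300, 400, 475):
    print(T, arrhenius_sigma(T, sigma0=1.0, Ea=0.5))
```

Conductivity rises steeply with temperature, which is why measurements are taken over a range such as room temperature to 475 K.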
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter text is short, unstructured, and messy, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter has many features that represent interaction between users, and tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
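One common way to make LDA usable on tweets, in the spirit of the hashtag idea above, is to pool tweets sharing a hashtag into longer pseudo-documents before fitting the model. A minimal sketch with toy tweets; the pooling scheme, scikit-learn usage, and parameters are illustrative assumptions, not necessarily this paper's exact method:

```python
from collections import defaultdict
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy tweets; hashtags act as user-generated topic keywords.
tweets = [
    "new phone camera review #tech",
    "laptop battery lasts forever #tech",
    "gpu benchmark results posted #tech",
    "great goal in the match #sports",
    "amazing dribble and score #sports",
    "final whistle, what a game #sports",
]

# Pool tweets sharing a hashtag into longer pseudo-documents,
# giving LDA enough word co-occurrence signal to work with.
pools = defaultdict(list)
for text in tweets:
    tags = [w for w in text.split() if w.startswith("#")] or ["#none"]
    for tag in tags:
        pools[tag].append(text)
docs = [" ".join(texts) for texts in pools.values()]

X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(len(docs), lda.components_.shape)
```

Fitting on two hashtag pools instead of six fragmentary tweets is exactly the kind of aggregation that helps LDA on short text.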
Document clustering is the process of organizing a particular electronic corpus of documents into subgroups with similar text features. Formerly, a number of conventional algorithms were applied to perform document clustering. There are current endeavors to enhance clustering performance by employing evolutionary algorithms, and such endeavors have become an emerging topic gaining more attention in recent years. The aim of this paper is to present an up-to-date and self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. It then presents and analyzes the principal research work
Genetic algorithms (GA) are a helpful instrument for planning and controlling the activities of a project. GA is based on the principles of survival of the fittest and natural selection. GA has been used in different sectors of construction and building; however, this is rarely documented. This research aimed to examine the utilisation of genetic algorithms in construction project management. For this purpose, the research focused on the benefits and challenges of genetic algorithms and the extent to which genetic algorithms are utilised in construction project management. Results showed that GA provides the ability to generate near-optimal solutions, which can be adopted to reduce complexity in project management and resolve difficult problems
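As a minimal illustration of the survival-of-the-fittest mechanism described above (a toy problem, not a construction-management application), the GA below evolves bit-strings toward all ones using tournament selection, one-point crossover, and bit-flip mutation; all parameter values are illustrative:

```python
import random

def one_max_ga(n_bits=20, pop_size=30, generations=60, p_mut=0.02, seed=1):
    """Toy GA: evolve bit-strings toward all ones (fitness = sum of bits)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():
        # Tournament selection: the fitter of two random individuals survives.
        a, b = rng.sample(pop, 2)
        return a if sum(a) >= sum(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(sum(ind) for ind in pop)

print(one_max_ga())
```

The same loop structure carries over to scheduling and resource-levelling problems once the bit-string encoding and fitness function are replaced by project-specific ones.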
Abstract— The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still require the conversion of handwritten copies into digital versions that can be stored and shared electronically. Handwriting recognition refers to a computer's ability to identify and understand legible handwritten input from various sources, including documents, photographs and others. Handwriting recognition poses a complex challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition
New algorithms for auto-focus image fusion are proposed to enhance fusion quality. The first algorithm combines two images based on the standard deviation. The second algorithm relies on the contrast at edge points and the correlation method as the criteria for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves them 10 pixels within the same region; it examines the statistical properties of each block and automatically decides the next step. The resulting combined image has better contrast
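The standard-deviation criterion of the first algorithm can be sketched as a block-wise rule: for each block, keep the source image whose block has the higher local standard deviation, a common proxy for focus. The block size, tie-breaking, and synthetic inputs below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def fuse_by_std(img_a, img_b, block=8):
    """Block-wise auto-focus fusion: keep the block with higher std."""
    out = np.empty_like(img_a, dtype=float)
    h, w = img_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            pa = img_a[i:i + block, j:j + block]
            pb = img_b[i:i + block, j:j + block]
            # Higher local std is used as a proxy for sharper focus.
            out[i:i + block, j:j + block] = pa if pa.std() >= pb.std() else pb
    return out

# Synthetic example: each input is sharp (high variance) in a different half.
rng = np.random.default_rng(0)
a = np.zeros((16, 16)); a[:8, :] = rng.normal(size=(8, 16))
b = np.zeros((16, 16)); b[8:, :] = rng.normal(size=(8, 16))
fused = fuse_by_std(a, b)
```

The fused result takes the top half from `a` and the bottom half from `b`, i.e. the in-focus region of each source.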
The classification of soil in Iraq for industrial purposes is an important topic that needs extensive and specialized study in order to advance the service and industrial sectors of our country. Much scientific research has addressed soil classification in agricultural, commercial and other fields, but no source or research could be found that addresses the classification of land for industrial purposes directly. In this research, specialized programs such as geographic information system (GIS) software have been used. GIS permits the study of the local distribution of phenomena and activities, and the aims that can be determined in the loca