Brachytherapy is primarily used to treat certain kinds of cancerous tumors. The use of radionuclides to treat tumors has been studied for a very long time, but the introduction of mathematical (radiobiological) models has simplified treatment planning. Such models make it possible to compute the survival probabilities of irradiated tissues and cancer cells. With the expanding use of high-dose-rate (HDR) and low-dose-rate (LDR) brachytherapy for cancer treatment, fractionated dose treatment plans are required to irradiate the tumor. In this paper, the authors discuss the dose calculation algorithms used in brachytherapy treatment planning. Accurate and fast calculation of the 3D dose distribution for the patient is one of the key requirements of modern radiation oncology, and it in turn requires accurate algorithms in the treatment planning system (TPS). The algorithms currently used for dose calculation have certain limitations. This work evaluates the accuracy of several algorithms presently employed for treatment planning, including pencil beam convolution (PBC), superposition (SP), the anisotropic analytical algorithm (AAA), Monte Carlo (MC), the Clarkson method, the fast Fourier transform, and the convolution method. The algorithms used in radiotherapy treatment planning are categorized as correction-based and model-based.
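The abstract refers to radiobiological models for computing survival probabilities without naming one; the linear-quadratic (LQ) model is the usual choice for comparing fractionated dose schedules. A minimal sketch, assuming the LQ model and purely illustrative parameter values (not taken from the paper):

```python
import numpy as np

def lq_survival_fraction(dose_per_fraction, n_fractions, alpha, beta):
    """Surviving fraction under the linear-quadratic (LQ) model:
    S = exp(-n * (alpha * d + beta * d**2))."""
    d = dose_per_fraction
    return np.exp(-n_fractions * (alpha * d + beta * d ** 2))

def biologically_effective_dose(dose_per_fraction, n_fractions, alpha_beta_ratio):
    """BED = n * d * (1 + d / (alpha/beta)), used to compare fractionation schemes."""
    d = dose_per_fraction
    return n_fractions * d * (1 + d / alpha_beta_ratio)

# Illustrative values only (not from the paper): an HDR schedule of 3 fractions x 7 Gy,
# tumor alpha = 0.15 Gy^-1, beta = 0.015 Gy^-2, alpha/beta = 10 Gy.
sf = lq_survival_fraction(dose_per_fraction=7.0, n_fractions=3, alpha=0.15, beta=0.015)
bed = biologically_effective_dose(dose_per_fraction=7.0, n_fractions=3, alpha_beta_ratio=10.0)
print(f"surviving fraction = {sf:.3e}, BED = {bed:.1f} Gy")
```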
One of the main difficulties facing certified-document archiving systems is stamp verification, since stamps may contain a complex background and be surrounded by unwanted data. The main objective of this paper is therefore to isolate the background and remove the noise that may surround the stamp. The proposed method comprises four phases. First, the k-means algorithm is applied to cluster the stamp image into a number of clusters, which are then merged using the ISODATA algorithm. Second, the mean and standard deviation of each remaining cluster are computed to separate the background cluster from the stamp cluster. Third, a region growing algorithm is applied to segment the image, and the connected regions are then chosen …
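The abstract describes the clustering and background-separation phases without giving their parameters; a minimal sketch of the first two steps, assuming RGB pixel features, a hypothetical cluster count of k = 4, and scikit-learn's KMeans (the ISODATA merging and region growing phases are omitted):

```python
import numpy as np
from sklearn.cluster import KMeans

def separate_background(image_rgb, k=4):
    """Cluster pixels with k-means, then flag the cluster whose colour statistics
    look most background-like (high mean intensity, low standard deviation).
    k=4 is an illustrative choice, not a value from the paper."""
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)

    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)

    # Per-cluster mean and standard deviation of pixel intensities.
    stats = []
    for c in range(k):
        members = pixels[labels == c]
        stats.append((members.mean(), members.std()))

    # Heuristic: the background of a scanned document is bright and uniform.
    background = max(range(k), key=lambda c: stats[c][0] - stats[c][1])

    # Boolean mask: True where a pixel belongs to the stamp (non-background) clusters.
    return (labels != background).reshape(h, w)
```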
Anyone who follows the needs of the users of accounting information notices the necessity of adopting faithful representation of information. Hence, the IASB adopted the economic-substance approach as the basis for the formulation and development of international accounting standards. This research therefore discusses the reflection of an economic phenomenon, in terms of its economic substance, on measurement, and argues that the measurement method should be consistent with it. The research problem is that economic units operating in the local environment treat events and economic phenomena according to their legal form, in line with the requirements of the unified accounting system …
The research considers the spatial autoregressive (SAR) model and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on regression models that include spatial dependence, whose presence can be tested with Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the power of the statistical estimation, since these models are the link between the usual regression models and time-series models. The spatial analysis was applied to the Iraq Household Socio-Economic Survey: IHS…
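The abstract names Moran's test for spatial dependence without giving its form; a minimal sketch of the global Moran's I statistic, assuming a row-standardized contiguity weight matrix and illustrative data (not the survey data):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a variable observed at n spatial units.
    `weights` is an n x n spatial weight matrix (here row-standardized);
    values near 0 suggest no spatial autocorrelation."""
    values = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = values.size
    z = values - values.mean()
    return (n * (z @ w @ z)) / (w.sum() * (z @ z))

# Illustrative 4-unit example with a simple contiguity structure.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
w = w / w.sum(axis=1, keepdims=True)          # row-standardize
print(morans_i([3.1, 2.9, 7.4, 7.8], w))      # positive value: similar neighbours cluster
```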
Nowadays, cloud computing has attracted the attention of large companies due to its high potential, flexibility, and profitability in providing multiple hardware and software resources to serve connected users. Given the scale of modern data centers and the dynamic nature of their resource provisioning, effective scheduling techniques are needed to manage these resources while satisfying the goals of both cloud providers and cloud users. Task scheduling in cloud computing is an NP-hard problem that cannot easily be solved by classical optimization methods, so both heuristic and meta-heuristic techniques have been used to provide optimal or near-optimal solutions to such problems within an acceptable time frame. …
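The abstract does not identify the specific scheduling technique it uses; a minimal sketch of a common greedy baseline (assign each task to the virtual machine that finishes it earliest), with hypothetical task lengths and VM speeds:

```python
def greedy_schedule(task_lengths, vm_speeds):
    """Assign each task to the VM with the earliest finish time (a simple
    heuristic baseline, not the paper's algorithm). Returns the assignment
    and the makespan."""
    ready = [0.0] * len(vm_speeds)                       # time at which each VM becomes free
    assignment = []
    for length in sorted(task_lengths, reverse=True):    # longest tasks first
        finish = [ready[v] + length / vm_speeds[v] for v in range(len(vm_speeds))]
        best = min(range(len(vm_speeds)), key=finish.__getitem__)
        ready[best] = finish[best]
        assignment.append(best)
    return assignment, max(ready)

# Hypothetical workload: task lengths in MI, VM speeds in MIPS.
assignment, makespan = greedy_schedule([400, 250, 900, 120, 600], [100, 250])
print(assignment, round(makespan, 2))
```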
Mixed-effects conditional logistic regression is evidently more effective for studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and establish an equilibrium between bias and variance.
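The abstract gives no estimation details; a minimal sketch of the conditional logistic likelihood for 1:1 matched strata, maximized with SciPy. This is a fixed-effects simplification that omits the random effects of the mixed-effects model discussed in the paper:

```python
import numpy as np
from scipy.optimize import minimize

def neg_conditional_loglik(beta, x_case, x_control):
    """Negative conditional log-likelihood for 1:1 matched strata.
    Within each stratum the intercept (and any stratum-level effect) cancels,
    so only the covariate difference between case and control matters."""
    diff = (x_case - x_control) @ beta
    return np.sum(np.log1p(np.exp(-diff)))      # -sum log(1 / (1 + exp(-diff)))

# Toy data (illustrative only): cases have shifted covariates relative to controls.
rng = np.random.default_rng(0)
x_case = rng.normal(size=(200, 2)) + np.array([0.5, -0.25])
x_control = rng.normal(size=(200, 2))

fit = minimize(neg_conditional_loglik, x0=np.zeros(2), args=(x_case, x_control))
print("estimated beta:", fit.x)
```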
The subject of dumping is considered today one of the subjects that form an obstacle to the growth of some countries. The study of dumping therefore receives great attention from specialists because of its large role and effect in the growth of national economies, and dumping has become a field around which many measures and laws revolve … Many states of the world may resort to anti-dumping as a deterrent approach that limits the impact of dumping and gives the national agricultural sector the opportunity to rise and grow. This branch of international economics therefore has a special importance and at the same time represents an important …
Accurate prediction of river water quality parameters is essential for environmental protection and sustainable management of agricultural resources. This study presents a novel framework for estimating the potential salinity of river water in arid and semi-arid regions by integrating a kernel extreme learning machine (KELM) with a boosted salp swarm algorithm based on differential evolution (KELM-BSSADE). A dataset of 336 samples, including bicarbonate, calcium, pH, total dissolved solids, and sodium adsorption ratio, was collected from the Idenak station in Iran and used for the modelling. Results demonstrated that KELM-BSSADE outperformed models such as the deep random vector functional link …
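The abstract does not spell out the KELM formulation; a minimal sketch of kernel extreme learning machine regression with an RBF kernel, using illustrative hyper-parameters and omitting the BSSADE tuning described in the paper:

```python
import numpy as np

def rbf_kernel(a, b, gamma):
    """Radial basis function kernel matrix between row-wise samples a and b."""
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

class KELM:
    """Kernel extreme learning machine regression:
    beta = (K + I/C)^-1 y,  prediction = K(x_new, X) beta.
    gamma and C would normally be tuned (e.g. by BSSADE in the paper)."""
    def __init__(self, gamma=0.5, C=10.0):
        self.gamma, self.C = gamma, C

    def fit(self, X, y):
        self.X = np.asarray(X, float)
        K = rbf_kernel(self.X, self.X, self.gamma)
        self.beta = np.linalg.solve(K + np.eye(K.shape[0]) / self.C, np.asarray(y, float))
        return self

    def predict(self, X_new):
        return rbf_kernel(np.asarray(X_new, float), self.X, self.gamma) @ self.beta

# Toy usage with synthetic data (not the Idenak dataset).
rng = np.random.default_rng(1)
X = rng.uniform(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.normal(size=50)
model = KELM(gamma=1.0, C=100.0).fit(X[:40], y[:40])
print(np.round(model.predict(X[40:]) - y[40:], 2))   # residuals on held-out points
```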