Brachytherapy is primarily used to treat certain kinds of cancerous tumors. Radionuclides have long been used in the study and treatment of tumors, but the introduction of mathematical (radiobiological) models has simplified treatment planning. These models are used to compute the survival probabilities of irradiated tissues and cancer cells. With the growing use of high dose rate (HDR) and low dose rate (LDR) brachytherapy for cancer treatment, fractionated dose treatment plans are required to irradiate the tumor. In this paper, the authors discuss the dose calculation algorithms used in brachytherapy treatment planning. Accurate and fast calculation of the 3D dose distribution for each patient is one of the key requirements of modern radiation oncology, which in turn demands accurate algorithms in the treatment planning system (TPS). The algorithms currently used for dose calculation have certain limitations. This work evaluates the correctness of seven algorithms presently employed for treatment planning: pencil beam convolution (PBC), superposition (SP), the anisotropic analytical algorithm (AAA), Monte Carlo (MC), the Clarkson method, the fast Fourier transform (FFT) method, and the convolution method. Algorithms used in radiotherapy treatment planning are categorized as correction-based or model-based.
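The survival probabilities mentioned above are conventionally computed with the linear-quadratic (LQ) radiobiological model; a minimal sketch follows, with illustrative alpha/beta values that are assumptions, not parameters from the paper:

```python
import math

def lq_survival(dose_gy, alpha=0.35, beta=0.035):
    """Surviving fraction under the linear-quadratic model:
    S = exp(-(alpha*D + beta*D^2)).
    alpha (Gy^-1) and beta (Gy^-2) here are illustrative values."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def fractionated_survival(dose_per_fraction, n_fractions, alpha=0.35, beta=0.035):
    """Assuming full repair between fractions, survival multiplies
    per fraction: S_total = S(d)^n."""
    return lq_survival(dose_per_fraction, alpha, beta) ** n_fractions
```

Because the quadratic term acts within each fraction, delivering the same total dose in fractions spares more cells than a single exposure, which is the rationale for fractionated plans.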
Background: The majority of statin-treated patients in whom low-density lipoprotein cholesterol (LDL-C) targets have been achieved still have recurrent cardiovascular events (CVE), with an absolute event rate that remains even higher among patients with disorders of insulin resistance, metabolic syndrome (MetS), and type 2 diabetes mellitus (T2DM) than among patients without these conditions. Objectives: To provide updated key messages on lipid and lipoprotein abnormalities as indicators of cardiovascular disease (CVD) risk in patients with T2DM and obesity, as well as the current evidence-based treatment targets and interventions to reduce this risk. Key messages: The Residual Risk Reduction Initiative (R3I) emphasized atherogenic dyslipidemia (AD)
In this paper, a split-plate airlift electrochemical reactor is presented as an apparatus with a new configuration for wastewater treatment. Two aluminum plates were fixed inside the reactor and serve two functions: first, they act as split plates for internal-loop generation in the airlift system (the zone between the two plates acts as the riser while the other two zones act as downcomers); second, they act as the two electrodes for the electrocoagulation process. Simulated wastewater contaminated with zinc ions was used to test the performance of this apparatus for zinc removal by studying the effect of different experimental variables such as the initial concentration of zinc (50-800 ppm), electrical current density (2.67-21.4 mA/cm2), init
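In electrocoagulation with aluminum electrodes, the theoretical coagulant dose released at the anode is commonly estimated from Faraday's law; a minimal sketch (a general relation, not a calculation taken from the paper):

```python
def electrode_metal_dissolved(current_a, time_s, molar_mass=26.98, z=3):
    """Faraday's law: m = I*t*M/(z*F), mass in grams of aluminum
    dissolved at the anode (Al -> Al3+ + 3e-), with F = 96485 C/mol.
    molar_mass is for aluminum; z is the electrons per ion."""
    F = 96485.0
    return current_a * time_s * molar_mass / (z * F)
```

Actual dissolution can exceed this estimate (super-faradaic behavior from chemical attack on the cathode), so the formula is a lower-bound design guide.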
Myrtle plant material was washed, dried, and powdered after harvesting to produce a fine powder used in water treatment. An alcoholic extract was prepared from the myrtle plant using ethanol and then analyzed by GC-MS, Fourier transform infrared (FT-IR) spectroscopy, and ultraviolet-visible (UV-Vis) spectroscopy to identify the active components. Zinc nanoparticles were synthesized using the alcoholic extract and characterized by FT-IR, UV-Vis, SEM, EDX, and TEM. Using a continuous processing procedure, the zinc nanoparticles, together with the myrtle extract and powder, were employed to clean polluted water containing pesticides and antibiotics. First, 2 g of zinc nanoparticles was mixed with 20 ml of polluted water and the result was (Tetra 44%, Levo 32%),
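Removal percentages like those quoted (e.g. Tetra 44%) are conventionally computed from the initial and residual contaminant concentrations; a minimal sketch (the concentration values in the example are hypothetical):

```python
def removal_percent(c_initial, c_final):
    """Removal efficiency: R% = (C0 - Ce) / C0 * 100,
    where C0 is the initial and Ce the residual concentration."""
    return (c_initial - c_final) / c_initial * 100.0

# e.g. a hypothetical 100 mg/L antibiotic solution reduced to 56 mg/L
r = removal_percent(100.0, 56.0)  # 44.0 %
```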
Grey system theory is a multidisciplinary scientific approach that deals with systems having partially unknown information (small samples and uncertain information). Grey modeling, an important component of the theory, gives successful results with a limited amount of data. Grey models are divided into two types: univariate and multivariate. The univariate grey model with a first-order differential equation, GM(1,1), is the cornerstone of the theory; it is considered a time series prediction model, but it does not take relevant factors into account. The traditional multivariate grey model GM(1,M) takes those factors into account, but it has a complex structure and some defects in its "modeling mechanism", "parameter estimation" and "m
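The GM(1,1) procedure can be sketched compactly: accumulate the series (AGO), fit the grey differential equation by least squares, solve the whitening equation, then difference back (inverse AGO). A minimal sketch, assuming a plain NumPy implementation rather than the paper's own:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey forecasting. Whitening equation: dx1/dt + a*x1 = b.
    Returns fitted values for the input series plus `steps` forecasts."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                      # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])           # background values
    B = np.column_stack((-z1, np.ones(len(z1))))
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]   # least-squares parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate(([x1_hat[0]], np.diff(x1_hat)))  # inverse AGO

# a near-exponential toy series; GM(1,1) tracks it closely with only 5 points
series = [1.0, 1.1, 1.21, 1.331, 1.4641]
pred = gm11_forecast(series, steps=1)
```

The example illustrates the theory's claim above: with only five observations the model recovers the growth trend, something classical time series methods struggle with at that sample size.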
The current study examined the effect of different sample sizes on the detection of differential item functioning (DIF). Three sample sizes (300, 500, 1000) were used, along with a test composed of twenty polytomous items, each with five response categories. The Graded Response Model was used as the polytomous item response theory model to estimate item and person parameters, and the Mantel-Haenszel (MH) method was used to detect DIF in each condition across the different samples. The results showed an inverse relationship between the sample size and the number of items flagged as showing differential functioning.
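The Mantel-Haenszel statistic at the heart of the method pools 2x2 tables across matched score strata; a minimal sketch for the dichotomous case (the polytomous extension used in the study is more involved, and the counts below are hypothetical):

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel common odds ratio across score strata.
    Each stratum is (a, b, c, d): reference group correct/incorrect,
    focal group correct/incorrect. alpha_MH = sum(a*d/n) / sum(b*c/n).
    A value near 1 suggests no DIF for the item."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# two hypothetical score strata with identical odds in both groups -> no DIF
alpha = mantel_haenszel_or([(10, 10, 10, 10), (20, 10, 20, 10)])
```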
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called the learning data, is used to find the decision rule sets that are then used in solving the incomplete-data problem. The intelligent swarm algorithm is used for feature selection: the bees algorithm serves as a heuristic search combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. A comparison is made between the two approaches in their performance for null values estima
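A rough-set evaluation function for feature selection typically scores an attribute subset by its dependency degree, gamma(R) = |positive region| / |U|; a minimal sketch on a toy decision table (the data and attribute names are hypothetical, not from the paper):

```python
def dependency_degree(rows, attrs, decision):
    """gamma(attrs): fraction of rows whose equivalence class
    (identical values on attrs) is pure with respect to the decision;
    such rows form the positive region of the rough approximation."""
    classes = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(row[decision])
    pos = sum(1 for row in rows
              if len(classes[tuple(row[a] for a in attrs)]) == 1)
    return pos / len(rows)

table = [{"a": 0, "b": 0, "d": 0}, {"a": 0, "b": 1, "d": 1},
         {"a": 1, "b": 0, "d": 1}, {"a": 1, "b": 1, "d": 1}]
```

A search algorithm such as the bees algorithm would use this score as fitness, seeking the smallest attribute subset with a dependency degree matching the full set.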
Metaheuristics in the swarm intelligence (SI) class have proven to be efficient and have become popular methods for solving different optimization problems. Based on their usage of memory, metaheuristics can be classified into algorithms with memory and algorithms without memory (memory-less). The absence of memory in some metaheuristics leads to the loss of information gained in previous iterations, causing the metaheuristic to drift away from promising areas of the solution search space and toward non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. The investigation covers SI metaheuristics, memory usage and memory-less metaheuristics, memory char
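Particle swarm optimization is the standard illustration of memory in SI: each particle remembers its personal best and the swarm shares a global best, and both pull the search back toward promising regions. A minimal sketch minimizing a toy sphere function (the benchmark and parameter values are illustrative assumptions):

```python
import random

def pso_minimize(f, dim=2, n_particles=20, iters=200, seed=0):
    """Basic PSO: pbest/gbest are the algorithm's 'memory' of the best
    solutions found so far; velocities are pulled toward both."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5    # inertia, cognitive, social coefficients
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:          # update personal memory
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:         # update shared swarm memory
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# sphere function: global minimum 0 at the origin
best, best_val = pso_minimize(lambda x: sum(v * v for v in x))
```

Deleting the pbest/gbest terms would make the update memory-less, and the swarm would wander: exactly the degradation the review discusses.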
The optimization of artificial gas lift techniques plays a crucial role in the advancement of oil field development. This study investigates the impact of gas lift design and optimization on production outcomes in the Mishrif formation of the Halfaya oil field. A comprehensive production network nodal analysis model was formulated using a PIPESIM Optimizer-based Genetic Algorithm and carefully calibrated with field data collected from a network of seven wells. This well group comprises three directional wells currently on gas lift and four naturally producing vertical wells. To augment productivity and optimize network performance, a novel gas lift design strategy was proposed. The optimization of
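Genetic-algorithm gas lift optimization of this kind searches for the injection-gas split across wells that maximizes total oil rate under a gas budget; a minimal sketch with a hypothetical concave lift-performance curve (PIPESIM's internal nodal model and the field data are not reproduced here):

```python
import random

def lift_response(q_gas):
    """Hypothetical lift-performance curve: oil rate vs injected gas,
    concave with diminishing returns (a stand-in for a nodal model)."""
    return 100.0 * q_gas / (q_gas + 2.0)

def ga_allocate(n_wells=7, total_gas=10.0, pop=40, gens=100, seed=1):
    """Simple GA: each chromosome is a gas split across wells; fitness is
    total oil; allocations are normalized to honor the gas budget."""
    rng = random.Random(seed)
    def normalize(x):
        s = sum(x)
        return [total_gas * v / s for v in x]
    def fitness(x):
        return sum(lift_response(q) for q in x)
    population = [normalize([rng.random() for _ in range(n_wells)])
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]          # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # crossover
            i = rng.randrange(n_wells)
            child[i] = max(1e-6, child[i] + rng.gauss(0, 0.5))   # mutation
            children.append(normalize(child))
        population = parents + children
    best = max(population, key=fitness)
    return best, fitness(best)

allocation, total_oil = ga_allocate()
```

With identical concave well responses the optimum is an equal split, so the GA's result can be sanity-checked against that known answer; with calibrated per-well curves, as in the study, the split becomes non-uniform.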
Maximum power point tracking (MPPT) is used in photovoltaic (PV) systems to enhance efficiency and maximize the output power of a PV module, regardless of variations in temperature, irradiation, and the electrical characteristics of the load. A new MPPT system is presented in this research, consisting of a synchronous DC-DC step-down buck converter controlled by an Arduino microcontroller-based unit. The MPPT process, using the Perturb and Observe method, is performed with the DC-DC converter circuit to overcome the problem of voltage mismatch between the PV modules and the loads. The proposed system has high efficiency and low cost and can easily be modified to handle more energy sources. The test results indicate that the u
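The Perturb and Observe logic itself is compact: nudge the operating voltage, observe whether power rose, and keep or reverse the direction accordingly. A minimal sketch against a toy PV power curve (the curve and step size are illustrative, not the paper's hardware values):

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.1):
    """One P&O iteration: if the last perturbation increased power,
    keep moving the reference voltage in the same direction;
    otherwise reverse. Returns the next reference voltage."""
    if p >= p_prev:
        direction = 1 if v >= v_prev else -1
    else:
        direction = -1 if v >= v_prev else 1
    return v + direction * step

# toy PV curve with its maximum power point at 17 V
pv_power = lambda v: max(0.0, 60.0 - 0.5 * (v - 17.0) ** 2)

v_prev, v = 12.0, 12.1
p_prev = pv_power(v_prev)
for _ in range(200):
    p = pv_power(v)
    v, v_prev, p_prev = perturb_and_observe(v, p, v_prev, p_prev), v, p
```

After convergence the voltage oscillates around the maximum power point within one step, which is the characteristic steady-state ripple of P&O; in the hardware system this reference drives the buck converter's duty cycle.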
Software testing is a vital part of the software development life cycle. In many cases the system under test has more than one input, making exhaustive testing of every combination infeasible (i.e., the execution time of the test suite can be outrageously long). Combinatorial testing offers an alternative to exhaustive testing by considering the interactions of input values for every t-way combination of parameters. Combinatorial testing can be divided into three types: uniform strength interaction, variable strength interaction, and input-output based relation (IOR). IOR combinatorial testing tests only the important combinations selected by the tester. Most of the research in combinatorial testing appli
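The t-way idea can be made concrete by enumerating the interactions a suite must cover; a minimal sketch (parameter names and values are hypothetical; constructing a minimal covering suite from these targets is the harder step and is omitted):

```python
from itertools import combinations, product

def t_way_interactions(parameters, t=2):
    """All t-way interactions to cover: for each choice of t parameters,
    every combination of their values. parameters: dict name -> values."""
    names = sorted(parameters)
    targets = []
    for group in combinations(names, t):
        for values in product(*(parameters[p] for p in group)):
            targets.append(tuple(zip(group, values)))
    return targets

params = {"os": ["linux", "windows"], "browser": ["ff", "chrome"], "net": ["wifi", "lte"]}
pairs = t_way_interactions(params, t=2)  # 3 parameter pairs x 4 value pairs = 12
```

Exhaustive testing of this toy system needs 2x2x2 = 8 test cases, while a pairwise covering array needs only 4 to hit all 12 interactions; the gap widens rapidly with more parameters, which is the motivation stated above.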