Background: Hypothyroidism is the most common thyroid disorder worldwide. For decades, levothyroxine has been the main pharmacological treatment for hypothyroidism. A variety of factors can influence the levothyroxine dose, including genetic variation. Interest in the impact of genetic polymorphisms on drug dosing has risen remarkably, and several genetic variants that might affect levothyroxine dose requirements have been investigated, especially in the deiodinase enzymes. The impact of deiodinase type 2 genetic polymorphisms on levothyroxine dose has been studied in different populations.
Objective: To examine the association of two single nucleotide polymorphisms (SNPs) of deiodinase t
Convergence speed is the most important feature of the Back-Propagation (BP) algorithm. Many improvements have been proposed for this algorithm since its introduction in order to speed up the convergence phase. In this paper, a new modified BP algorithm called Speeding up Back-Propagation Learning (SUBPL) is proposed and compared to standard BP. Different data sets were implemented and tested to verify the improvement achieved by SUBPL.
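The abstract does not spell out the specific modification SUBPL introduces, so the following is only a minimal sketch of the standard BP baseline it is compared against; the network shape, activation, learning rate, and XOR data are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch of standard back-propagation for one hidden layer, assuming
# a sigmoid activation and squared-error loss; this is the plain BP baseline,
# not the SUBPL variant proposed in the paper.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, y, hidden=4, lr=0.5, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))  # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(hidden, 1))           # hidden -> output weights
    for _ in range(epochs):
        # forward pass
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # backward pass: error signals for output and hidden layers
        d_out = (out - y) * out * (1.0 - out)
        d_hid = (d_out @ W2.T) * h * (1.0 - h)
        # standard gradient-descent weight updates
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_hid
    return W1, W2

# Example: learn XOR, a classic benchmark for BP convergence speed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_bp(X, y)
print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))
```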
Improving" Jackknife Instrumental Variable Estimation method" using A class of immun algorithm with practical application
Artificial intelligence algorithms have been used in many scientific fields in recent years. We suggest employing an artificial TABU algorithm to find the best estimate of the semi-parametric regression function when there are measurement errors in the explanatory variables and the dependent variable. Measurement errors, rather than exact measurements, appear frequently in fields such as sport, chemistry, the biological sciences, medicine, and epidemiological studies.
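The abstract does not give the paper's objective function or neighbourhood structure, so the sketch below is only a generic tabu search loop; the objective, step set, and tabu tenure are hypothetical placeholders used to illustrate the metaheuristic, not the paper's estimator.

```python
# Generic tabu search skeleton; the objective function, neighbourhood, and
# tabu tenure here are illustrative assumptions, not the paper's method.
def tabu_search(objective, start, neighbours, iters=200, tenure=10):
    current = best = start
    best_val = objective(best)
    tabu = []                                      # recently visited solutions
    for _ in range(iters):
        candidates = [s for s in neighbours(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=objective)   # best admissible neighbour
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                            # forget the oldest move
        if objective(current) < best_val:
            best, best_val = current, objective(current)
    return best, best_val

# Toy usage: minimise a one-dimensional quadratic over a discrete neighbourhood.
obj = lambda x: (x - 3.0) ** 2
step = lambda x: [round(x + d, 2) for d in (-0.5, -0.1, 0.1, 0.5)]
print(tabu_search(obj, start=0.0, neighbours=step))
```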
There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name in a mobile phone's contacts. Certain methods of organizing data make the search process more efficient; the objective of these methods is to find the element with the least cost (least time). The binary search algorithm is faster than sequential and other commonly used search algorithms. This research develops the binary search algorithm using a new structure called Triple, in which data are represented as triples, each consisting of three locations (1-Top, 2-Left, and 3-Right). The binary search algorithm divides the search interval in half; this process makes the maximum number of comparisons (average case com
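The Triple-based variant is cut off in the abstract, so the sketch below shows only the standard halving step it builds on; the sorted array and target are illustrative, and the comparison counter just makes the roughly log2(n) bound visible.

```python
# Standard binary search; the Triple (Top/Left/Right) variant described above
# is truncated in the abstract, so only the classic halving step is sketched.
def binary_search(data, target):
    low, high = 0, len(data) - 1
    comparisons = 0
    while low <= high:
        mid = (low + high) // 2
        comparisons += 1
        if data[mid] == target:
            return mid, comparisons
        if data[mid] < target:
            low = mid + 1           # discard the lower half
        else:
            high = mid - 1          # discard the upper half
    return -1, comparisons          # not found

# Example: a sorted list of 1000 items needs at most about 10 comparisons.
data = list(range(0, 2000, 2))
print(binary_search(data, 1234))
```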