A skip list is a data structure that is really just a probabilistic simulation of a binary search tree. Skip list algorithms are simpler and faster, and they use less space. Conceptually, the structure consists of several parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert and delete operations run in O(log n) expected time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
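The search and insert procedures described above can be sketched in Python as follows. This is a minimal illustrative implementation, not the paper's code: the maximum level, the promotion probability p = 0.5, and the single `forward` pointer array per node (rather than four pointers per node) are simplifying assumptions.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one successor pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node to the next level up

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel head node
        self.level = 0

    def _random_level(self):
        # Flip coins: each success lifts the new node one level higher.
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        update = [None] * (self.MAX_LEVEL + 1)
        node = self.head
        # Record, on each level, the last node strictly before the key.
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        if lvl > self.level:
            for i in range(self.level + 1, lvl + 1):
                update[i] = self.head
            self.level = lvl
        new = Node(key, lvl)
        for i in range(lvl + 1):  # splice the new node into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):      # start on the sparsest list
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]           # skip ahead on this level
        node = node.forward[0]                   # drop to the bottom list
        return node is not None and node.key == key
```

Each search descends from the sparsest list to the densest, which is what yields the O(log n) expected time.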
Thyroid hemiagenesis (THA) is a rare congenital anomaly in which one lobe of the thyroid gland fails to develop during the embryonic stage. Agenesis may be unilateral, total or limited to the isthmus. The left lobe is involved more often than the right, with left-sided hemiagenesis predominating at a left-to-right ratio of 4:1; agenesis of the isthmus is seen in 50% of cases. Clinically, patients can be euthyroid, hypothyroid or hyperthyroid. The condition is often an incidental finding during ultrasonography (USG) of the neck, which can readily diagnose it.
The actual incidence of THA is unknown; most cases are diagnosed in patients admitted for a thyroid scan or thyroid surgery because
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function and the reliability function of the compound distribution in both the natural and the contaminated data cases.
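The Downhill Simplex (Nelder-Mead) step of such an estimation can be sketched as follows. For brevity the sketch maximises the likelihood of a plain two-parameter Weibull distribution, not the four-parameter compound exponential Weibull-Poisson distribution studied in the paper; the simplex routine is a simplified variant (inside contraction only), and the sample data and starting values are illustrative.

```python
import math

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimal Downhill Simplex minimiser (simplified: no outside contraction)."""
    n = len(x0)
    # Initial simplex: x0 plus one point perturbed along each axis.
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)                      # best first, worst last
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2 * centroid[i] - worst[i] for i in range(n)]   # reflect worst
        if f(refl) < f(best):
            exp = [centroid[i] + 2 * (centroid[i] - worst[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl     # expand
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl                                  # accept reflection
        else:
            contr = [centroid[i] + 0.5 * (worst[i] - centroid[i]) for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr                             # contract
            else:                                               # shrink toward best
                simplex = [best] + [[(p[i] + best[i]) / 2 for i in range(n)]
                                    for p in simplex[1:]]
    return min(simplex, key=f)

def weibull_nll(params, data):
    """Negative log-likelihood of a two-parameter Weibull(k, lam) sample."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return math.inf  # invalid parameters: reject this simplex vertex
    return -sum(math.log(k / lam) + (k - 1) * math.log(x / lam)
                - (x / lam) ** k for x in data)

data = [0.4, 0.9, 1.1, 1.6, 2.0, 0.7, 1.3, 1.8]   # illustrative sample
k_hat, lam_hat = nelder_mead(lambda p: weibull_nll(p, data), [1.0, 1.0])
```

The simplex method's appeal here is exactly what the abstract reports: it needs only likelihood evaluations, no derivatives, which matters when the compound density is awkward to differentiate.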
The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers one major drawback: decision makers consume considerable time transferring data from survey sheets into analytical programs. This paper therefore proposes a method called 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets inside the R environment by treating the arrangement of the data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey syste
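The core idea of recasting survey sheets as relational data can be illustrated with a small sketch. It is written in Python rather than the paper's R, and the record shape (an `id` column plus one column per question) is an assumption, not SABR's actual input format.

```python
# Reshape "wide" survey responses (one row per respondent, one column per
# question) into a "long" relational form: (respondent, question, answer).
# In this shape, each fact is a single row, which is what makes grouping,
# filtering and joining by question or respondent straightforward.
def to_relational(rows):
    long_rows = []
    for row in rows:
        rid = row['id']
        for question, answer in row.items():
            if question != 'id':
                long_rows.append((rid, question, answer))
    return long_rows

wide = [
    {'id': 'r1', 'q1': 4, 'q2': 5},   # hypothetical Likert-scale answers
    {'id': 'r2', 'q1': 3, 'q2': 2},
]
relational = to_relational(wide)
```

In R the equivalent reshaping is commonly done with functions such as `tidyr::pivot_longer`, which is presumably close in spirit to what SABR automates.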
Wireless sensor applications are susceptible to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are two widely used strategies for reducing energy usage and extending the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as selection of an appropriate clustering algorithm may reflect positive results in the data aggregati
The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured and noisy. Given the scalability and efficiency requirements of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used in the preprocessing of web server log data are comprehensively evaluated and examined, with an emphasis on sub-phases such as session identification, user identification and data cleansing.
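The three sub-phases named above chain together naturally, and a minimal sketch of that pipeline is shown below. The log-entry shape, the noise filters, and the 30-minute inactivity threshold for session splitting are common heuristics assumed for illustration, not the specific methodology evaluated in the study.

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)          # common inactivity heuristic
NOISE_SUFFIXES = ('.css', '.js', '.png', '.gif', '.ico')

def preprocess(entries):
    """entries: list of dicts with 'ip', 'agent', 'url' and 'time' (datetime).
    Returns a list of sessions, each a time-ordered list of requests."""
    # 1. Data cleansing: drop embedded-resource requests and robot traffic.
    cleaned = [e for e in entries
               if not e['url'].lower().endswith(NOISE_SUFFIXES)
               and 'bot' not in e['agent'].lower()]
    # 2. User identification: heuristic pairing of IP address and user agent.
    users = {}
    for e in cleaned:
        users.setdefault((e['ip'], e['agent']), []).append(e)
    # 3. Session identification: split each user's requests on long gaps.
    sessions = []
    for requests in users.values():
        requests.sort(key=lambda e: e['time'])
        current = [requests[0]]
        for prev, cur in zip(requests, requests[1:]):
            if cur['time'] - prev['time'] > SESSION_TIMEOUT:
                sessions.append(current)
                current = []
            current.append(cur)
        sessions.append(current)
    return sessions
```

Real log preprocessing must also handle proxy users sharing one IP and path completion from referrer fields, which this sketch omits.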
This is a summary of my research on the judge's ruling based on his own knowledge. I dealt with the definition of the judiciary, linguistically and idiomatically, and with the importance and legitimacy of the judiciary, as the judiciary is one of the most important pillars of Islam, through which justice is achieved and the truth is proved to its owners.
The investment environment is the incubator for all types of domestic and foreign investments: when its determinants are encouraging, investment flows increase, and vice versa, as there is a relationship between the nature of the investment environment and the level of investment flows. The determinants of the investment environment are numerous; the most important are security and political stability, and economic and financial factors that include relative stability in the exchange rate and inflation rates, the availability and development of banks, transparency and integrity in administrative dealings, the absence of widespread administrative and financial corruption, and the clari
The Sonic Scanner is a multifunctional instrument designed to log wells, assess elastic characteristics, and support reservoir characterisation. It also facilitates the understanding of rock mechanics, gas detection and well positioning, while furnishing data for geomechanical computations and sand management. The present work applied the Sonic Scanner to both basic and advanced processing of an oil well penetrating carbonate media. The study aimed to characterise the compressional, shear and Stoneley slowness, the rock mechanical properties, and the shear anisotropy of the formation. Except for intervals where significant washouts are encountered, the data quality of the monopole, dipole and Stoneley modes is gen
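The geomechanical computations mentioned above conventionally derive dynamic elastic moduli from the compressional and shear velocities (the reciprocals of the logged slownesses, after unit conversion) and the bulk density. The sketch below applies the standard isotropic elasticity relations; it illustrates the type of computation involved, not the paper's specific workflow, and the input values are illustrative.

```python
def dynamic_elastic_properties(vp, vs, rho):
    """Dynamic elastic moduli from compressional velocity vp (m/s), shear
    velocity vs (m/s) and bulk density rho (kg/m^3), via the standard
    isotropic relations. Returns moduli in Pa and a dimensionless ratio."""
    g = rho * vs ** 2                                   # shear modulus
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))    # Poisson's ratio
    e = 2 * g * (1 + nu)                                # Young's modulus
    k = rho * (vp**2 - 4 * vs**2 / 3)                   # bulk modulus
    return {'G': g, 'nu': nu, 'E': e, 'K': k}

# Illustrative carbonate-like values, not measurements from the study.
props = dynamic_elastic_properties(vp=4000.0, vs=2200.0, rho=2400.0)
```

A rising Vp/Vs ratio drives Poisson's ratio toward 0.5, which is one reason the compressional-to-shear slowness comparison is diagnostic for lithology and gas effects.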
The current research discusses the topic of formal data within a methodological framework by defining the research problem, limits and objectives, and the most important terms mentioned in the research. The first section of the theoretical framework addressed the concept of the Bauhaus school, its philosophy and its logical bases. The second section dealt with the school's most important elements and structural bases, which constitute its most important formal data, and their implications for fabric and costume design. The research came up with the most important indicators resulting from the theoretical framework.
Chapter three defined the