Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multi-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of this sparse distribution, a two-stage approach has been proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled as a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes may produce a proportion of false haplotypes, which hamper the detection of rare but true haplotypes. To address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared with the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
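The abstract only outlines Stage 1, so the following is a minimal, hypothetical sketch in Python: a two-component binomial mixture over per-genotype case counts fitted by EM, seeded with a simple data partition-based initialization (splitting genotypes by their observed case fraction). The model, likelihood, and initialization shown here are illustrative assumptions; the paper's actual formulation may differ.

```python
# Illustrative sketch only: a generic two-component binomial mixture fitted by EM,
# seeded by a simple data-partition-based initialization. Not the paper's exact model.
import numpy as np

def em_binomial_mixture(case_counts, totals, n_iter=200, tol=1e-8):
    """case_counts[g]: case carriers of genotype g; totals[g]: all carriers of genotype g."""
    case_counts = np.asarray(case_counts, float)
    totals = np.asarray(totals, float)
    frac = case_counts / totals

    # Partition-based initialization (assumption): split genotypes into a low and a high
    # case-fraction group and seed each component from one partition.
    order = np.argsort(frac)
    half = len(frac) // 2
    low, high = order[:half], order[half:]
    p = np.clip(np.array([frac[low].mean(), frac[high].mean()]), 1e-6, 1 - 1e-6)
    w = np.array([len(low), len(high)], float) / len(frac)

    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: posterior probability that each genotype belongs to each component
        logpmf = (case_counts[:, None] * np.log(p) +
                  (totals - case_counts)[:, None] * np.log(1 - p))
        log_resp = np.log(w) + logpmf
        log_norm = np.logaddexp.reduce(log_resp, axis=1, keepdims=True)
        resp = np.exp(log_resp - log_norm)

        # M-step: update mixing proportions and component-specific case probabilities
        w = resp.mean(axis=0)
        p = (resp * case_counts[:, None]).sum(0) / (resp * totals[:, None]).sum(0)
        p = np.clip(p, 1e-6, 1 - 1e-6)

        ll = log_norm.sum()
        if ll - ll_old < tol:
            break
        ll_old = ll
    # resp[:, 1] can be read as a "risk genotype" posterior when p[1] > p[0]
    return w, p, resp
```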
Cancer is one of the most dangerous diseases; it afflicts a person through injury to the cells and tissues of the body, can strike any age group, is difficult to control, and multiplies between cells and spreads through the body. Despite great progress in medical studies devoted to this field, the options for those with this disease are few and difficult, as they require significant financial costs for health services and for treatment that is difficult to provide.
This study examined the determinants of liver cancer, relying on cancerous-tumour data taken from the Iraqi Center for Oncology at the Ministry of Health, 2017. Survival analysis has been used as a m
The effect of high-energy radiation on the energy gap of the compound semiconductor silicon carbide (SiC) is examined, with emphasis placed on those effects which can be interpreted in terms of energy levels. The goal is to develop semiconductors operating at high temperature with low energy gaps by inducing permanent damage in SiC irradiated by gamma sources. A TEA-CO2 laser was used to produce the SiC thin films, and a Lambda UV-Visible spectrophotometer was used to determine the energy gap (Eg). Co-60, Cs-137, and Sr-90 sources were used to irradiate the SiC samples for different irradiation times. A possible interpretation of the change in Eg values with irradiation time is discussed.
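The abstract does not give the computational details of the Eg determination; a common workflow, used here purely as an assumption, is a Tauc-type analysis of the UV-Visible transmittance data: compute the absorption coefficient from transmittance and film thickness, plot (αhν)^r against hν, and extrapolate the linear edge to zero. The function name, arguments, and the fit window are illustrative.

```python
# Assumed, generic Tauc-type estimate of the optical energy gap from transmittance data;
# not taken from the paper itself.
import numpy as np

H_EV = 4.135667e-15   # Planck constant, eV*s
C = 2.99792458e8      # speed of light, m/s

def tauc_gap(wavelength_nm, transmittance, thickness_m, r=2.0, fit_window=None):
    """Estimate Eg (eV) by extrapolating the linear part of (alpha*h*nu)^r vs h*nu.

    r = 2 for allowed direct transitions, r = 1/2 for allowed indirect transitions;
    the appropriate exponent depends on the material and transition type.
    """
    E = H_EV * C / (np.asarray(wavelength_nm) * 1e-9)          # photon energy, eV
    alpha = -np.log(np.asarray(transmittance)) / thickness_m   # absorption coefficient, 1/m
    y = (alpha * E) ** r

    # Straight-line fit over the chosen linear region of the absorption edge
    if fit_window is None:
        fit_window = (E.min(), E.max())
    m = (E >= fit_window[0]) & (E <= fit_window[1])
    slope, intercept = np.polyfit(E[m], y[m], 1)
    return -intercept / slope      # Eg = x-intercept of the extrapolated line
```

Repeating such an estimate for samples irradiated for different times would give the Eg-versus-irradiation-time trend that the abstract discusses.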
In this study, pure Co3O4 nanostructures and Co3O4 doped with 4% and 6% yttrium were successfully synthesized by a hydrothermal method. The XRD, optical, electrical, and photosensing properties were studied for the pure and doped Co3O4 thin films. The X-ray diffraction (XRD) analysis shows that all films are polycrystalline in nature, with a cubic structure. The optical properties indicate that the optical energy gap, calculated using the Tauc equation, follows an allowed direct electronic transition and increases for doped Co3O4. The photosensing properties of the thin films were studied as a function of time at different wavelengths to determine the sensitivity to these lights. High photo sensitivity dope
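The abstract does not define how sensitivity is computed; a common convention, assumed here, is the relative change in current under illumination. The snippet below is a minimal sketch of that calculation per wavelength, with all numbers hypothetical.

```python
# Minimal, assumed photosensitivity calculation: S(t) = (I_light - I_dark) / I_dark * 100.
# The exact definition used for the Co3O4 films in the paper may differ.
import numpy as np

def photosensitivity(i_light, i_dark):
    """Percentage photosensitivity from illuminated and dark currents (same units)."""
    i_light = np.asarray(i_light, float)
    return (i_light - i_dark) / i_dark * 100.0

# Hypothetical example: one wavelength, current sampled while the light is on
i_dark = 2.0e-6                                  # A, assumed dark current
i_light = np.array([2.1e-6, 3.4e-6, 4.0e-6])     # A, assumed currents vs time
print(photosensitivity(i_light, i_dark))         # sensitivity (%) at each time point
```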
Big data of different types, such as text and images, is rapidly generated by the internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool, because only meaningful information is analyzed and extracted, which makes it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have only small or inadequate datasets with which to train DL frameworks. Manual labeling is usually needed to provide labeled data, and it typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed a significant amount of labeled data from which to automatically learn representations; ultimately, a larger amount of data yields a better DL model, and performance is also application dependent. This issue is the main barrier for
A database is an organized and distributed collection of data that allows the user to access the stored data in a simple and convenient way. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
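The abstract does not show the Map-Reduce job itself, so the following is a minimal Hadoop-Streaming-style sketch of the pattern: a mapper emitting (channel, value) pairs and a reducer averaging values per channel. The record format, field names, and the per-channel averaging task are assumptions, not the paper's actual job.

```python
# Hadoop Streaming-style sketch (assumed record format "channel<TAB>value"):
# the mapper emits (channel, value) pairs and the reducer averages values per channel.
import sys
from itertools import groupby

def mapper(lines):
    for line in lines:
        channel, value = line.rstrip("\n").split("\t")
        yield channel, float(value)

def reducer(pairs):
    # Hadoop delivers mapper output grouped by key; here we sort to simulate that locally
    for channel, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        values = [v for _, v in group]
        yield channel, sum(values) / len(values)

if __name__ == "__main__":
    # Local, single-process simulation of the map and reduce phases
    pairs = list(mapper(sys.stdin))
    for channel, mean in reducer(pairs):
        print(f"{channel}\t{mean:.4f}")
```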
As a result of rapid industrialization and population growth, toxic chemicals have been introduced into water systems in recent decades. Because of its excellent efficiency and simple design, the three-dimensional (3D) electro-Fenton method has been used for the treatment of wastewater. The goal of the current study is to explore the efficiency of phenol removal by the 3D electro-Fenton process, one of the advanced oxidation processes (AOPs). In the present work, the effect of adding granular activated carbon (GAC) particles to the electro-Fenton system as the third electrode is investigated in the presence of graphite as the anode and nickel foam as the cathode, which is the source of electro-generated hydrogen
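Removal efficiency is not spelled out in the truncated abstract; a standard definition, used here as an assumption, is the fractional drop in phenol concentration over the treatment time.

```python
# Assumed definition of phenol removal efficiency for the 3D electro-Fenton runs:
# R(%) = (C0 - Ct) / C0 * 100, where C0 is the initial concentration and Ct the
# concentration after treatment time t.
def removal_efficiency(c0: float, ct: float) -> float:
    return (c0 - ct) / c0 * 100.0

# Hypothetical example: 50 mg/L phenol reduced to 7.5 mg/L after treatment
print(removal_efficiency(50.0, 7.5))  # 85.0 % removal
```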
The present study employed the NAG-4SX3-3D analyzer to precisely measure the energy response of the sensor, with the goal of enhancing the understanding of this technology by providing expert information about the device. This technology offers an economical, quick, accurate, and sensitive approach. Using the turbidity method, cyproheptadine hydrochloride (CPH) was quantified in pharmaceutical samples without the need for additional substances. CPH is expected to undergo a direct reaction with calcium hexacyanoferrate, resulting in the formation of white precipitates. The linear range for CPH measurement is 0.008–30 mM. The relative standard deviation (RSD) for six repetitions at concentrations of (6 and
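The quoted precision figure (RSD over six repetitions) and the linear range suggest a standard calibration treatment. The sketch below computes the relative standard deviation of replicate responses and a least-squares calibration line; all numbers are purely illustrative and not taken from the paper.

```python
# Illustrative only: RSD of replicate turbidity readings and a least-squares calibration
# line over the linear range; the values below are hypothetical.
import numpy as np

def rsd_percent(replicates):
    x = np.asarray(replicates, float)
    return x.std(ddof=1) / x.mean() * 100.0      # sample SD relative to the mean

# Hypothetical six replicate responses at one CPH concentration
print(rsd_percent([0.412, 0.409, 0.415, 0.411, 0.414, 0.410]))

# Hypothetical calibration: detector response vs CPH concentration (mM)
conc = np.array([0.008, 0.1, 1.0, 5.0, 10.0, 30.0])
resp = np.array([0.004, 0.05, 0.51, 2.48, 5.02, 14.9])
slope, intercept = np.polyfit(conc, resp, 1)
print(slope, intercept)                          # calibration-line parameters
```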