The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that partitions data with unknown cluster structure into clusters; the k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for this purpose. Clustering thus divides an input space into several homogeneous zones and can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor gene expression dataset. The first model uses FCM to cluster genes. FCM allows an object to belong to two or more clusters with a membership grade between zero and one, with the memberships of each gene across all clusters summing to one; this paradigm is useful when dealing with microarray data. The total time required to implement the first model is 22.2589 s. The second model combines FCM with particle swarm optimization (PSO) to obtain better results; the hybrid FCM–PSO algorithm uses the Davies–Bouldin (DB) index as its objective function. The experimental results show that the proposed hybrid FCM–PSO method is effective, with a total implementation time of 89.6087 s. The third model combines FCM with a genetic algorithm (GA); this hybrid algorithm also uses the DB index as its objective function. The experimental results show that the proposed hybrid FCM–GA method is effective, with a total implementation time of 50.8021 s. In addition, this study uses cluster validity indexes to determine the best partitioning of the underlying data. The internal validity indexes include the Jaccard, Davies–Bouldin, Dunn, Xie–Beni, and silhouette indexes, while the external validity indexes include the Minkowski score, the adjusted Rand index, and the percentage of correctly categorized pairs. Experiments conducted on brain tumor gene expression data demonstrate that the techniques used in this study outperform traditional models in terms of stability and biological significance.
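The abstract does not include the authors' implementation, but the core FCM step it describes (soft memberships that sum to one per gene, evaluated with the Davies–Bouldin index) can be sketched as follows. This is a minimal illustration in NumPy/scikit-learn, assuming the expression matrix is already loaded as a rows-by-features array; the fuzzifier `m`, the cluster count, and the random data are placeholders, not the study's settings.

```python
import numpy as np
from sklearn.metrics import davies_bouldin_score

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)              # memberships of each row sum to 1
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = dist ** (-2.0 / (m - 1.0))           # standard FCM membership update
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy stand-in for a gene expression matrix (genes x samples).
X = np.random.default_rng(1).random((200, 10))
centers, U = fuzzy_c_means(X, c=4)
labels = U.argmax(axis=1)                          # hard labels for index computation
print("Davies-Bouldin index:", davies_bouldin_score(X, labels))
```

In the FCM–PSO and FCM–GA hybrids described above, a score of this kind would serve as the fitness that PSO or the GA minimizes over candidate partitions.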
This study investigates the impact of spatial resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques leveraging Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Component Analysis, Nearest Neighbour Diffusion) were applied, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed utilizing Support Vector Machine and Maximum Likelihood algorithms. Results reveal that specifi…
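The abstract names weighted averages based on correlation coefficients and standard deviation as ways to synthesize a PAN band from Sentinel-2, but does not give the exact formulas. Below is a hedged sketch of one such weighting (weights proportional to each band's standard deviation over a co-registered band stack); the band stack, its ordering, and the weighting rule are illustrative assumptions, not the study's method.

```python
import numpy as np

def synthesize_pan(bands):
    """Weighted-average synthetic panchromatic band.

    bands: float array of shape (k, H, W) with co-registered Sentinel-2 bands
    (e.g. blue, green, red, NIR). Weights here are proportional to each band's
    standard deviation; a correlation-based variant would replace `stds` with
    each band's correlation coefficient against a reference band.
    """
    stds = bands.reshape(bands.shape[0], -1).std(axis=1)
    weights = stds / stds.sum()
    return np.tensordot(weights, bands, axes=1)    # (H, W) synthetic PAN

# Toy example with random reflectance values standing in for real imagery.
stack = np.random.default_rng(0).random((4, 128, 128))
pan = synthesize_pan(stack)
print(pan.shape)
```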
This research applies a fuzzy goal programming model to aggregate production planning at the General Company for Hydraulic Industries / plastic factory, aiming to obtain an optimal production plan that copes with fluctuations in demand while employing all available resources. Two strategies are used: an inventory strategy and a workforce-level change strategy, whose costs are usually imprecise (fuzzy). The plant administration seeks to minimize total production costs, minimize carrying costs, and minimize changes in labour levels, depending on the data gained from th…
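The abstract does not give the model's goals or tolerances, but a max–min fuzzy goal programming formulation of the kind it describes can be sketched as a small linear program: each fuzzy cost goal gets a linear membership function, and a common satisfaction level λ is maximized. The decision variables, cost coefficients, targets, and tolerances below are invented placeholders, not the plant's data.

```python
from scipy.optimize import linprog

# Decision variables: x1 = units produced in regular time, x2 = units drawn from inventory.
# Fuzzy goals (target, tolerance): production cost 5*x1 <= 1000 (+200),
# carrying cost 2*x2 <= 300 (+100). Hard constraint: x1 + x2 >= 400 (demand).
# Max-min form: maximize lambda subject to goal_k(x) + tol_k*lambda <= target_k + tol_k.
c = [0, 0, -1]                     # minimize -lambda  <=>  maximize lambda
A_ub = [
    [5, 0, 200],                   # 5*x1 + 200*lam <= 1200
    [0, 2, 100],                   # 2*x2 + 100*lam <= 400
    [-1, -1, 0],                   # x1 + x2 >= 400
]
b_ub = [1200, 400, -400]
bounds = [(0, None), (0, None), (0, 1)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"plan: regular={x1:.1f}, inventory={x2:.1f}, satisfaction={lam:.2f}")
```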
This work presents the use of a laser diode in fiber distributed data interface (FDDI) networks. FDDI uses optical fiber as the transmission medium, which solves the problems resulting from EMI and noise and, in addition, increases the security of transmission. A network with a ring topology consisting of three computers was designed and implemented. The timed token protocol was used to achieve and control the communication process over the ring. Non-return-to-zero inverted (NRZI) modulation was carried out as part of the physical (PHY) sublayer. The optical system consists of a laser diode with a wavelength of 820 nm and a maximum output power of 2.5 mW as the source, an optical fiber as the channel, and a positive-intrinsic-negative (PIN) photodiode…
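The NRZI line coding mentioned above is simple to state: a binary 1 is sent as a transition of the line level and a 0 as no transition. A minimal encoder/decoder sketch follows; the initial line level is an assumption, and the 4B/5B block coding that FDDI applies before NRZI is not shown.

```python
def nrzi_encode(bits, initial_level=0):
    """NRZI: a 1 toggles the line level, a 0 leaves it unchanged."""
    level, levels = initial_level, []
    for b in bits:
        if b:
            level ^= 1
        levels.append(level)
    return levels

def nrzi_decode(levels, initial_level=0):
    """Recover bits by comparing each received level with the previous one."""
    prev, bits = initial_level, []
    for lv in levels:
        bits.append(1 if lv != prev else 0)
        prev = lv
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
assert nrzi_decode(nrzi_encode(data)) == data
```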
The Hartha Formation is an overburden horizon in the X-oilfield that generates considerable non-productive time (NPT) associated with drilling mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.
The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia…
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyze one of the nonlinear stream cipher cryptosystems that depends on the linear feedback shift register (LFSR) unit, using a ciphertext-only attack. A genetic algorithm (GA) and ant colony optimization (ACO) are used to attack one of the nonlinear cryptosystems, called the "shrinking generator", with different lengths of ciphertext and different lengths of the combined LFSRs. GA and ACO proved their good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points, which may be f…
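For readers unfamiliar with the target cipher, the shrinking generator combines two LFSRs: a selection register decides which output bits of the data register are kept as keystream. The sketch below illustrates that mechanism together with a simple bit-match fitness of the kind an evolutionary search could use to score candidate initial states; the register lengths, tap positions, and toy seeds are illustrative assumptions, not the attack parameters from the paper.

```python
def lfsr_step(state, taps):
    """One clock of a Fibonacci LFSR; returns (output bit, next state)."""
    out = state[-1]
    fb = 0
    for t in taps:
        fb ^= state[t]
    return out, [fb] + state[:-1]

def shrinking_generator(a_state, a_taps, s_state, s_taps, n):
    """Keep a bit of data register A only when selection register S outputs 1."""
    a, s, keystream = list(a_state), list(s_state), []
    while len(keystream) < n:
        a_bit, a = lfsr_step(a, a_taps)
        s_bit, s = lfsr_step(s, s_taps)
        if s_bit:
            keystream.append(a_bit)
    return keystream

def fitness(candidate_ks, reference_ks):
    """Fraction of matching bits: a simple fitness for GA/ACO candidate seeds."""
    return sum(c == r for c, r in zip(candidate_ks, reference_ks)) / len(reference_ks)

# Toy 4-bit and 5-bit registers standing in for the combined LFSRs of the paper.
ks = shrinking_generator([1, 0, 1, 1], [0, 3], [0, 1, 1, 0, 1], [2, 4], 16)
print(ks)
```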
In this paper, we used four classification methods to classify objects and compared them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classification and detection of objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification metho…
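The pipeline this abstract describes (flattened 20 x 20 grayscale features, PCA, then the four classifiers compared on a 7:3 split) can be sketched with scikit-learn as below. The random feature matrix stands in for the preprocessed image crops; dataset loading, histogram equalization, and the exact PCA dimensionality are not specified in the abstract, so those details are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier

# Placeholder features: flattened 20x20 grayscale crops (400 dims) with toy labels,
# standing in for the histogram-equalized image data described in the abstract.
rng = np.random.default_rng(0)
X = rng.random((600, 400))
y = rng.integers(0, 3, 600)

# 7:3 train/test split, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SGD": SGDClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
}
for name, clf in models.items():
    pipe = make_pipeline(StandardScaler(), PCA(n_components=50), clf)
    pipe.fit(X_tr, y_tr)
    print(name, pipe.score(X_te, y_te))
```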
Background: Diabetes mellitus (DM) is a group of metabolic disorders characterized by hyperglycemia. Its heterogeneous etiology includes defects in insulin secretion, in insulin action, or both, in addition to disturbances in carbohydrate, fat, and protein metabolism. Inflammatory reactions caused by many pro-inflammatory cytokines play a central role in the pathogenesis of T2DM; these cytokines can enhance insulin resistance, which leads to impaired glucose homeostasis. Subjects: The study included 75 patients (38 males and 37 females) suffering from T2DM with a mean age ± SE of 52.30 ± 1.60 years, and 70 healthy controls (35 males and 35 females) with a mean age ± SE of 48.88 ± 0.64 years. Evaluation of immunological marke…