Community detection model for dynamic networks based on hidden Markov model and evolutionary algorithm

Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding different real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities. However, detecting accurate community structures that evolve over time adds additional challenges. Almost all state-of-the-art algorithms are designed around seemingly the same principle, treating the problem as a coupled optimization model that simultaneously identifies community structures and their evolution over time. Unlike these studies, the current work considers three measures individually, i.e. the intra-community score, the inter-community score, and the evolution of communities over time. Here, we adopt a new perspective towards detecting the evolution of community structures. The proposed method decomposes the problem into three essential components: searching intra-community connections, searching inter-community connections, and tracking community evolution. A multi-objective optimization problem is defined to account for the different intra- and inter-community structures. Further, we formulate the community evolution problem as a Hidden Markov Model in order to track the most likely sequence of communities. The new model, called Hidden Markov Model-based Multi-Objective evolutionary algorithm for Dynamic Community Detection (HMM-MODCD), uses a multi-objective evolutionary algorithm and the Viterbi algorithm to formulate objective functions and provide temporal smoothness over time when clustering dynamic networks. The performance of the proposed algorithm is evaluated on synthetic and real-world dynamic networks and compared against several state-of-the-art algorithms. The results clearly demonstrate that the proposed algorithm outperforms the other algorithms.
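
The Viterbi step named in this abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the candidate partitions, transition scores, and emission scores below are hypothetical stand-ins for the temporal-smoothness and community-quality terms the abstract refers to.

```python
# Minimal Viterbi sketch (not the authors' code): states are hypothetical
# candidate community partitions per snapshot, and the transition/emission
# scores stand in for the smoothness and quality terms described above.
import numpy as np

def viterbi(start_p, trans_p, emit_scores):
    """Return the most likely state sequence.

    start_p:     (K,) initial log-probabilities over K candidate partitions
    trans_p:     (K, K) log-probabilities of moving between partitions
    emit_scores: (T, K) log-likelihood of each snapshot under each partition
    """
    T, K = emit_scores.shape
    dp = np.full((T, K), -np.inf)        # best log-score ending in state k at time t
    back = np.zeros((T, K), dtype=int)   # backpointers for path recovery
    dp[0] = start_p + emit_scores[0]
    for t in range(1, T):
        for k in range(K):
            cand = dp[t - 1] + trans_p[:, k]
            back[t, k] = np.argmax(cand)
            dp[t, k] = cand[back[t, k]] + emit_scores[t, k]
    # backtrack the most likely sequence of candidate partitions
    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```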

Publication Date
Mon Dec 24 2018
Journal Name
Civil Engineering Journal
Artificial Neural Network Model for the Prediction of Groundwater Quality

The present article examines groundwater quality for drinking purposes in Baghdad City based on the Water Quality Index (WQI). The data were collected from the Ministry of Water Resources of Baghdad and represent water samples drawn from 114 wells on the Al-Karkh and Al-Rusafa sides of the city. To determine the WQI, four water parameters were taken into consideration: (i) pH, (ii) chloride (Cl), (iii) sulfate (SO4), and (iv) total dissolved solids (TDS). According to the computed WQI, the distribution of the groundwater samples with respect to their quality classes, such as excellent, good, poor, very poor, and unfit for human drinking purposes, was found to be…

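As a rough illustration of how a WQI of this kind is typically computed from such parameters, the sketch below uses a common weighted-arithmetic formulation; the weights, permissible limits, ideal values, and sample concentrations are placeholders, not the values used in the paper.

```python
# Illustrative weighted-arithmetic WQI sketch (the paper's exact weights and
# standards are not given here; the limits below are placeholder values).
def wqi(measured, standards, ideal=None):
    """measured/standards: dicts of parameter -> concentration / permissible limit."""
    ideal = ideal or {p: 0.0 for p in measured}          # default ideal value 0.0 (use 7.0 for pH)
    weights = {p: 1.0 / standards[p] for p in measured}  # unit weight ~ 1/standard
    q = {p: 100.0 * (measured[p] - ideal[p]) / (standards[p] - ideal[p])
         for p in measured}                              # quality rating per parameter
    return sum(weights[p] * q[p] for p in measured) / sum(weights.values())

# Hypothetical sample with placeholder drinking-water limits (mg/L, pH unitless)
sample    = {"pH": 7.6, "Cl": 250.0, "SO4": 400.0, "TDS": 1200.0}
standards = {"pH": 8.5, "Cl": 250.0, "SO4": 250.0, "TDS": 500.0}
ideal     = {"pH": 7.0, "Cl": 0.0,  "SO4": 0.0,   "TDS": 0.0}
print(round(wqi(sample, standards, ideal), 1))
```
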
Publication Date
Sun Jun 01 2014
Journal Name
2014 International Conference on Computer and Information Sciences (ICCOINS)
Proposed conceptual model for E-service quality in Malaysian universities

Publication Date
Thu Jun 01 2023
Journal Name
International Journal of Electrical and Computer Engineering (IJECE)
An optimized deep learning model for optical character recognition applications

Convolutional neural networks (CNN) are among the most widely used neural networks in various applications, including deep learning. In recent years, the continuing extension of CNN into increasingly complicated domains has made its training process more difficult. Thus, researchers have adopted optimized hybrid algorithms to address this problem. In this work, a novel chaotic black hole algorithm-based approach was created for the training of CNN to optimize its performance by avoiding entrapment in local minima. The logistic chaotic map was used to initialize the population instead of the uniform distribution. The proposed training algorithm was developed based on a specific benchmark problem for optical character recognition…

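The population-initialization idea mentioned above can be sketched as follows. This is an assumption-laden illustration, not the paper's code: the parameter names (r, x0, burn_in), bounds, and sizes are invented, and the chaotic black hole update itself is not shown.

```python
# Sketch of logistic-map population initialization (hypothetical parameters;
# the paper's chaotic black hole update rules are not reproduced here).
import numpy as np

def logistic_map_population(pop_size, dim, lower, upper, r=4.0, x0=0.7, burn_in=100):
    """Generate pop_size candidate weight vectors in [lower, upper] using
    the logistic map x_{k+1} = r * x_k * (1 - x_k) instead of uniform sampling."""
    x = x0
    for _ in range(burn_in):             # discard transients
        x = r * x * (1.0 - x)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        for j in range(dim):
            x = r * x * (1.0 - x)        # chaotic sequence in (0, 1)
            pop[i, j] = lower + x * (upper - lower)
    return pop

# Example: 30 candidate solutions for a 500-weight CNN slice in [-1, 1]
population = logistic_map_population(pop_size=30, dim=500, lower=-1.0, upper=1.0)
```
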
Publication Date
Fri Jan 01 2021
Journal Name
Annals Of Pure And Applied Mathematics
Linear Regression Model Using Bayesian Approach for Iraqi Unemployment Rate

In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters were estimated using the ordinary least squares method for the frequentist approach and the Markov Chain Monte Carlo (MCMC) method for the Bayesian approach. Calculations were done using the R program. The analysis showed that the linear regression model using the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase in the next two decades…

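Although the paper carries out its calculations in R, the idea of sampling a linear regression posterior with MCMC can be sketched in a few lines. The data, priors, and random-walk proposal below are purely illustrative assumptions, not the paper's series or model settings.

```python
# Minimal Metropolis sketch of Bayesian simple linear regression (illustrative
# only: synthetic data, weak priors, and a random-walk proposal).
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(20, dtype=float)                    # time index (hypothetical)
y = 8.0 + 0.3 * t + rng.normal(0, 0.5, t.size)    # synthetic "unemployment rate"

def log_post(a, b, log_s):
    s = np.exp(log_s)
    resid = y - (a + b * t)
    loglik = -t.size * np.log(s) - 0.5 * np.sum(resid**2) / s**2
    logprior = -0.5 * (a**2 + b**2) / 100.0       # weak N(0, 10^2) priors on a, b
    return loglik + logprior

theta = np.array([0.0, 0.0, 0.0])                 # (intercept, slope, log sigma)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05, 3)         # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
        theta = prop
    samples.append(theta.copy())
post = np.array(samples[5000:])                   # drop burn-in
print(post[:, :2].mean(axis=0))                   # posterior means of (intercept, slope)
```
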
Publication Date
Tue May 16 2023
Journal Name
Journal Of Engineering
Statistical Model for Predicting the Optimum Gypsum Content in Concrete

The problem of internal sulfate attack in concrete is widespread in Iraq and neighboring countries. This is because of the high sulfate content usually present in the sand and gravel used in concrete. In the present study, the total effective sulfate in concrete was used to calculate the optimum SO3 content. Regression models were developed based on linear regression analysis to predict the optimum SO3 content, usually referred to as the optimum gypsum content (O.G.C.), in concrete. The data were separated into 155 records for the development of the models and 37 for checking the models. Eight models were built for the 28-day age. Then a late-age (greater than 28 days) model was developed based on the predicted optimum SO3 content at 28 days and at the late age. Eight developed models were built for all…

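To make the modelling step concrete, the snippet below fits an ordinary least-squares model for the optimum SO3 content; the predictor variables and the handful of data rows are invented for illustration and are not the paper's 155/37 data split or its eight models.

```python
# Hedged sketch: fitting a linear model for optimum SO3 content (O.G.C.) with
# ordinary least squares; predictors and data are purely hypothetical.
import numpy as np

# X columns: [1, total effective SO3 (%), cement content (kg/m3)] -- assumed features
X = np.array([[1.0, 3.2, 350.0],
              [1.0, 4.1, 400.0],
              [1.0, 2.8, 320.0],
              [1.0, 3.7, 380.0]])
y = np.array([5.1, 5.8, 4.9, 5.5])            # observed optimum SO3 content (%), invented

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares coefficients
ogc_pred = X @ beta                           # fitted O.G.C. for the training rows
print(beta, ogc_pred)
```
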
Publication Date
Sun Jun 30 2002
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
A Phase Behavior Compositional Model for Jambour Cretaceous Oil Reservoir

Publication Date
Tue Nov 01 2022
Journal Name
Journal Of Engineering
Artificial Neural Network Model for Wastewater Projects Maintenance Management Plan

Wastewater projects are among the most important infrastructure projects and require strategic plans to manage them. Most of the wastewater projects in Iraq do not have a maintenance plan. This research aims to prepare a maintenance management plan (MMP) for wastewater projects. The objective of the research is to predict the cost and time of maintenance projects by building a model using an ANN. The research sample included 15 completed projects in Wasit Governorate, whose data the researcher was able to obtain from the historical records of the Wasit Sewage Directorate. In this research, the artificial neural network (ANN) technique was used to build two models (cost and time)…

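A minimal sketch of an ANN cost model of this kind is shown below; the input features, toy data, network size, and scikit-learn setup are all assumptions for illustration rather than the models built in the paper.

```python
# Minimal sketch of an ANN cost model (hypothetical features and toy data;
# the paper's actual inputs, network size, and training setup are not shown).
import numpy as np
from sklearn.neural_network import MLPRegressor

# Assumed inputs per project: [pipe length (km), number of manholes, network age (yr)]
X = np.array([[2.5, 40, 10],
              [1.2, 18,  6],
              [3.8, 65, 14],
              [0.9, 12,  4],
              [2.1, 33,  9]], dtype=float)
cost = np.array([310.0, 150.0, 520.0, 95.0, 260.0])  # maintenance cost (invented units)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, cost)                                    # train the cost model
print(model.predict([[1.8, 25, 8]]))                  # predict cost for a new project
```
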
Publication Date
Mon Oct 23 2023
Journal Name
Journal Of Optics
Single mode optical fiber sensor based on surface plasmon resonance for the detection of oil aging in electrical transformers

This work presents a novel technique for the detection of oil aging in electrical transformers using a single mode optical fiber sensor based on surface plasmon resonance (SPR). The aging of insulating oil is a critical issue in the maintenance and performance of electrical transformers, as it can reduce insulation properties, increase the risk of electrical breakdown, and decrease operational lifespan. Several parameters were calculated in this study to examine the efficiency of this sensor, such as the sensitivity (S), signal-to-noise ratio (SNR), resolution (in refractive index units), and figure of merit (FOM); the figure of merit is 11.05, the signal-to-noise ratio is 20.3, the sensitivity is 6.63, and the resolution is 3…

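For context, the sketch below computes these figures of merit using commonly used SPR sensor definitions; both the definitions and the placeholder inputs are assumptions, and the paper's exact formulas and units are not reproduced here.

```python
# Common SPR performance metrics (hedged sketch; the numbers are placeholders,
# not the values reported in the paper).
def spr_metrics(d_lambda_res, d_n, fwhm, min_detectable_shift=0.01):
    """d_lambda_res and fwhm in the same wavelength units (e.g. nm);
    d_n: refractive-index change in refractive index units (RIU)."""
    sensitivity = d_lambda_res / d_n                 # e.g. nm per RIU
    snr = d_lambda_res / fwhm                        # shift relative to dip width
    fom = sensitivity / fwhm                         # figure of merit
    resolution = min_detectable_shift / sensitivity  # smallest resolvable index change (RIU)
    return sensitivity, snr, fom, resolution

print(spr_metrics(d_lambda_res=6.0, d_n=0.9, fwhm=0.3))
```
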
Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
A Comparison Between the Theoretical Cross Section Based on the Partial Level Density Formulae Calculated by the Exciton Model with the Experimental Data for the $^{197}_{79}$Au Nucleus

In this paper, the theoretical cross section in a pre-equilibrium nuclear reaction has been studied for the reaction at an energy of 22.4 MeV. Ericson's formula of the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the $^{197}_{79}$Au nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental value; there is little agreement with the experimental cross section, and only at the high values of the energy range. The theoretical cross section that depends on the one-component Williams formula and the one-component formula corrected for spin…

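For reference, Ericson's one-component partial level density takes the standard textbook form sketched below; the Williams and spin corrections applied in the paper, and the level-density parameter it actually uses, are not reproduced, so g = A/13 is only a common estimate.

```python
# Ericson's one-component partial level density (standard form; the Williams
# and spin corrections used in the paper are not included here).
from math import factorial

def ericson_pld(p, h, E, g):
    """rho(p, h, E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!),  n = p + h
    p, h: particle and hole numbers; E: excitation energy (MeV);
    g: single-particle state density (MeV^-1), often estimated as A/13."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# Example for 197Au with the common estimate g = A/13 at E = 22.4 MeV
print(ericson_pld(p=2, h=1, E=22.4, g=197 / 13))
```
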
Publication Date
Wed May 04 2022
Journal Name
Int. J. Nonlinear Anal. Appl.
Knee Meniscus Segmentation and Tear Detection Based on Magnetic Resonance Images: A Review of Literature

The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (MRI) plays an essential role in meniscus assessment. It is difficult to identify cartilage lesions using typical image processing approaches because MRI data are so diverse. An MRI data sequence comprises numerous images, and the attribute region we are searching for may differ from one image to another in the series. Therefore, feature extraction becomes more complicated, and traditional image processing in particular becomes very complex. In traditional image processing, a human tells a computer what should be there, whereas a deep learning (DL) algorithm automatically extracts the features of what is already there. The surface changes become valuable when…
