In recent years, Wireless Sensor Networks (WSNs) have attracted growing attention in many fields, as they are used in a wide range of applications such as environment monitoring, the Internet of Things, industrial operation control, electric power distribution, and the oil industry. One of the major concerns in these networks is their limited energy supply. Clustering and routing algorithms are among the critical factors that directly affect power consumption in WSNs. Therefore, optimization techniques and routing protocols for such networks have to be studied and developed. This paper focuses on the most recent studies and algorithms that handle energy-efficient clustering and routing in WSNs. In addition, the main issues in these networks are discussed and summarized in comparison tables covering key features, limitations, and the simulation tools used. Comparing the energy efficiency of several techniques shows that, for the distributed clustering mode with uniform CH distribution, HEED and EECS perform best, while for non-uniform clustering both DDAR and THC are efficient. For the centralized clustering mode with uniform CH distribution, the LEACH-C protocol is more effective.
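The abstract above compares LEACH-family protocols without restating how cluster heads (CHs) are elected. For context, the classic distributed LEACH election rule (standard background, not taken from this paper, and simpler than LEACH-C's centralized selection) can be sketched as:

```python
import random

def leach_threshold(p, r):
    # LEACH election threshold T(n) for round r, given desired CH fraction p:
    # T = p / (1 - p * (r mod 1/p)); rises each round so every node
    # eventually serves as CH once per 1/p rounds
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(node_ids, p, r, eligible):
    # each eligible node draws a uniform number and self-elects as CH
    # when the draw falls below the round's threshold (fully distributed)
    t = leach_threshold(p, r)
    return [n for n in node_ids if n in eligible and random.random() < t]

# illustrative run: 100 nodes, 10% desired CHs, round 0
heads = elect_cluster_heads(list(range(100)), 0.1, 0, set(range(100)))
```

Node counts and the CH fraction here are illustrative; the distributed/centralized distinction drawn in the abstract is exactly whether this self-election runs on the nodes (LEACH, HEED, EECS) or at the base station (LEACH-C).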
Abstract— The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Although handwriting remains in common use, people still need to convert handwritten copies into digital versions that can be stored and shared electronically. Handwriting recognition involves the computer's ability to identify and interpret legible handwritten input from various sources, including documents, photographs, and others. Handwriting recognition poses a significant challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition
Dust is a common cause of health risks and also a contributor to climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine learning framework with five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) achieves a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c
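The abstract ranks regressors by mean square error. As a self-contained illustration of the gradient boosting idea it credits to GBR (a from-scratch sketch on synthetic data, since the IMOS dataset and the paper's exact model configuration are not given here):

```python
import numpy as np

def fit_stump(x, residual):
    # exhaustive search for the best 1-D regression stump on the residuals
    best = None
    for s in np.unique(x):
        left, right = residual[x <= s], residual[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= s, left.mean(), right.mean())
        err = np.mean((residual - pred) ** 2)
        if best is None or err < best[0]:
            best = (err, s, left.mean(), right.mean())
    return best[1], best[2], best[3]

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    # gradient boosting for squared loss: each weak learner (a stump)
    # fits the current residuals, and predictions accumulate with shrinkage
    pred = np.full_like(y, y.mean(), dtype=float)
    for _ in range(n_rounds):
        s, lv, rv = fit_stump(x, y - pred)
        pred += lr * np.where(x <= s, lv, rv)
    return pred

# synthetic stand-in for the dust data (1 feature, noisy nonlinear target)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = np.sin(x) + rng.normal(0, 0.1, 200)
pred = gradient_boost(x, y)
mse = float(np.mean((y - pred) ** 2))
```

Production work would use a library implementation with many features and tuned depth; this sketch only shows why boosting drives the MSE below that of a constant predictor.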
The purpose of this paper is to solve the unbalanced transportation problem with stochastic demand using heuristic algorithms to obtain the optimum solution, minimizing the costs of transporting the gasoline product for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion reached is that the stochastic programming model can solve the random transportation problem when demand is uncertain. The most obvious finding to emerge from this work is that the genetic algorithm was able to address the unbalanced transportation problem, and that the approved model can be applied by the oil products distribution company in the Iraqi Ministry of Oil to m
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes finding topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, so they are often less effective when applied to short-text content like Twitter. Luckily, Twitter has many features that represent the interaction between users; in particular, tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve topics learned
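The standard way to exploit hashtags for short-text topic modeling is to pool tweets sharing a hashtag into one pseudo-document before running LDA, so the model sees enough word co-occurrence. A minimal sketch of that pooling step (the function name and sample tweets are illustrative; the paper's exact pipeline is not shown in the abstract):

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    # aggregate short tweets into longer pseudo-documents, one per hashtag,
    # so document-level topic models like LDA get usable co-occurrence signal
    pools = defaultdict(list)
    for tweet in tweets:
        tags = re.findall(r"#(\w+)", tweet.lower())
        text = re.sub(r"#\w+", "", tweet.lower()).strip()
        for tag in tags or ["_untagged"]:   # keep tag-less tweets together
            pools[tag].append(text)
    return {tag: " ".join(texts) for tag, texts in pools.items()}

tweets = [
    "Huge win tonight #nba",
    "What a final quarter #nba #basketball",
    "New transfer rumours #football",
]
docs = pool_by_hashtag(tweets)   # 3 pseudo-documents: nba, basketball, football
```

The resulting `docs` values would then be vectorized and fed to an LDA implementation in place of the raw tweets.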
Abstract
For sparse system identification, recently suggested algorithms are the ℓ0-norm Least Mean Square (ℓ0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a coefficient-sparsity constraint. Accordingly, the proposed algorithms are named p-ZA-LMS,
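For reference, the ZA-LMS algorithm named above adds a zero-attractor term −ρ·sgn(w), the gradient of an ℓ1 penalty, to the conventional LMS coefficient update. A minimal sketch on a synthetic sparse channel (step size, attractor strength, and filter length are illustrative choices, not the paper's):

```python
import numpy as np

def za_lms(x, d, n_taps=16, mu=0.01, rho=1e-4):
    # Zero-Attracting LMS: conventional LMS update plus a sign term that
    # pulls small coefficients toward zero (gradient of an l1 penalty)
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]    # newest input sample first
        e = d[n] - w @ u                     # a-priori estimation error
        w += mu * e * u - rho * np.sign(w)   # LMS step + zero attractor
    return w

# identify a sparse FIR system from noisy observations
rng = np.random.default_rng(1)
h = np.zeros(16)
h[[2, 9]] = [1.0, -0.5]                      # sparse true impulse response
x = rng.normal(size=5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
w = za_lms(x, d)                             # w should recover h's two taps
```

RZA-LMS replaces `np.sign(w)` with `np.sign(w) / (1 + eps * np.abs(w))` so that large, genuinely active taps feel almost no attraction.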
This literature review focuses on the role of stem cells as an approach to periodontal regeneration.
Ketoprofen has recently been shown to offer therapeutic potential in preventing cancers such as colorectal and lung tumors, as well as in treating neurological illnesses. The goal of this review is to survey the methods that have been used for determining ketoprofen in pharmaceutical formulations. Precise product quality control is crucial to confirm the composition of drugs in pharmaceutical use. Several analytical techniques, including chromatographic and spectroscopic methods, have been used for determining ketoprofen in different sample forms such as tablets, capsules, ampoules, gels, and human plasma. The limit of detection of ketoprofen was 0.1 ng/ml using liquid chromatography with tandem mass spectrometry, while it was 0
Securing multimedia data has grown in importance over the last few decades as a means of safeguarding multimedia content from unauthorized users. A number of methods have been employed to hide important visual data from eavesdroppers, one of which is chaotic encryption. This review article examines chaotic encryption methods currently in use, highlighting their benefits and drawbacks in terms of their applicability to image security.
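As a concrete illustration of the class of methods such reviews survey, a toy logistic-map stream cipher is sketched below. This is a generic textbook construction, not any specific scheme from the article, and it is not cryptographically vetted; `x0` and `r` are arbitrary choices inside the map's chaotic regime:

```python
import numpy as np

def logistic_keystream(n, x0=0.6, r=3.99):
    # iterate the logistic map x <- r*x*(1-x) in its chaotic regime
    # and quantize each state to one keystream byte
    ks = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        ks[i] = int(x * 256) % 256
    return ks

def chaotic_xor(img_bytes, x0=0.6, r=3.99):
    # XOR stream cipher: the same call both encrypts and decrypts,
    # since XOR with the identical keystream is its own inverse
    ks = logistic_keystream(img_bytes.size, x0, r)
    return img_bytes ^ ks

img = np.arange(64, dtype=np.uint8)   # stand-in for flattened pixel data
enc = chaotic_xor(img)                # ciphertext
dec = chaotic_xor(enc)                # recovers the original pixels
```

Here (x0, r) plays the role of the secret key; the sensitivity of the map to x0 is what the surveyed schemes rely on, usually combined with pixel permutation stages omitted in this sketch.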
The coming age demands progress beyond manual work toward machine assistance, and the Brain-Computer Interface (BCI) provides the necessary bridge. As the name suggests, it is a pathway between the signals created by human brain activity and a computer, which can translate the transmitted signals into actions. BCI-processed brain activity is typically measured using EEG. This article intends to provide an accessible and up-to-date review of EEG-based BCI, concentrating on its technical aspects. In particular, we present essential neuroscience background describing how to build an EEG-based BCI, including evaluating which signal processing, software, and hardware techniques to use. Individu