In low-latitude areas (latitude angle below 10°), the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of condensed water that falls back into the solar-still basin also increases in this case. Consequently, the solar-still yield is significantly decreased, and the accuracy of the prediction method is affected. The yield and the accuracy of the prediction method decrease as the time during which the condensed water remains on the inner side of the condensing cover without being collected increases, because more drops fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), namely 1, 2, 3, 4, 5, 6, and 7, are implemented to increase the hourly yield of the solar still (HYSS) of a double-slope solar still hybrid with rubber scrapers (DSSSHS) in low-latitude areas and to develop an accurate model for forecasting the HYSS. The proposed model is developed by determining the best values of the constant factors associated with NSM, together with the optimal values of the exponent (n) and the unknown constant (C) in the Nusselt number (Nu) expression. These variables are used in formulating the models for estimating the HYSS. The particle swarm optimization (PSO) algorithm is used to solve the optimization problem and thereby determine the optimal yields. The water that condenses and accumulates on the inside of the condensing glass cover of the DSSSHS is collected by increasing NSM. This process increases the specific productivity of the DSSSHS and the accuracy of the HYSS prediction model. Results show that the proposed model can consistently and accurately estimate the HYSS. Based on the relative root mean square error (RRMSE), the proposed PSO–HYSS model attained the minimum value (2.81), whereas the validation models, Dunkle's and Kumar and Tiwari's, attained 78.68 and 141.37, respectively.
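As a rough, self-contained illustration of the kind of fitting described above, the sketch below uses a small particle swarm optimizer to pick an unknown constant C and exponent n of a Nusselt-type correlation by minimizing the RRMSE against measured hourly yields. The correlation form, parameter bounds, and all data are placeholder assumptions, not the paper's actual model or measurements.

```python
# Minimal sketch (not the paper's implementation): PSO used to fit the unknown
# constant C and exponent n in a Nusselt-type correlation Nu = C * (Gr*Pr)^n by
# minimizing the relative root mean square error (RRMSE) against measurements.
# All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measurements: Rayleigh-like numbers (Gr*Pr) and observed yield
gr_pr = rng.uniform(1e5, 1e7, size=24)
measured = 0.06 * gr_pr ** 0.27 * (1 + 0.05 * rng.standard_normal(24))

def rrmse(params):
    C, n = params
    predicted = C * gr_pr ** n                 # stand-in for the yield model
    return np.sqrt(np.mean((predicted - measured) ** 2)) / np.mean(measured) * 100

def pso(objective, bounds, particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

(best_C, best_n), best_rrmse = pso(rrmse, bounds=[(1e-3, 1.0), (0.1, 0.5)])
print(f"C={best_C:.4f}, n={best_n:.3f}, RRMSE={best_rrmse:.2f}%")
```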
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy texts, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, so they are often less effective when applied to short-text content like Twitter. Fortunately, Twitter offers many features that represent the interaction between users, and tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned from tweets.
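A minimal sketch of one common way to exploit hashtags when topic modeling tweets is shown below: tweets sharing a hashtag are pooled into a single pseudo-document before running LDA with gensim. The pooling scheme and the example tweets are illustrative assumptions, not necessarily the scheme proposed in the paper.

```python
# Illustrative sketch only: hashtag pooling before LDA. Tweets that share a
# hashtag are merged into one pseudo-document, giving LDA longer documents to
# learn from. The tweets below are made-up examples.
import re
from collections import defaultdict
from gensim import corpora, models

tweets = [
    "new phone camera is amazing #tech #gadgets",
    "loving the new update #tech",
    "great match last night #football",
    "what a goal #football #sports",
]

def tokenize(text):
    return re.findall(r"#?\w+", text.lower())

# Pool tweets by hashtag: every hashtag collects the word tokens of its tweets
pools = defaultdict(list)
for tweet in tweets:
    tokens = tokenize(tweet)
    hashtags = [t for t in tokens if t.startswith("#")] or ["#none"]
    for tag in hashtags:
        pools[tag].extend(t for t in tokens if not t.startswith("#"))

docs = list(pools.values())
dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      passes=10, random_state=0)
for topic_id, words in lda.print_topics(num_words=4):
    print(topic_id, words)
```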
Speech is the essential means of interaction between humans, and between humans and machines. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new, efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by taking advantage of Laplacian speech and noise modeling based on the distribution of orthogonal transform (discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Kra
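The sketch below is a heavily simplified stand-in, not the proposed estimator: it applies a Wiener-type suppression gain to the coefficients of a generic orthogonal transform (the DCT, standing in here for the discrete Krawtchouk-Tchebichef transform) on a toy signal with a known noise variance, just to illustrate transform-domain noise suppression.

```python
# Simplified stand-in (assumption, not the paper's Laplacian-MMSE estimator):
# frame-wise Wiener-type gain in an orthogonal transform domain (DCT).
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
fs = 8000
t = np.arange(fs) / fs
clean = 0.6 * np.sin(2 * np.pi * 220 * t)      # toy stand-in for a speech signal
noise_std = 0.2
noisy = clean + noise_std * rng.standard_normal(fs)

frame = 256
noise_var = noise_std ** 2                      # assumed known in this toy example
enhanced = np.zeros_like(noisy)
for start in range(0, len(noisy), frame):
    seg = noisy[start:start + frame]
    coeffs = dct(seg, norm="ortho")
    signal_var = np.maximum(coeffs ** 2 - noise_var, 1e-12)
    gain = signal_var / (signal_var + noise_var)   # Wiener-type suppression gain
    enhanced[start:start + frame] = idct(gain * coeffs, norm="ortho")

def snr_db(ref, sig):
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((sig - ref) ** 2))

print(f"input SNR: {snr_db(clean, noisy):.1f} dB, "
      f"output SNR: {snr_db(clean, enhanced):.1f} dB")
```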
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study therefore aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, that improves the quality criteria of GNSS networks. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, which is based on the A-optimality and E-optimality design criteria. These design criteria were selected as objective functions of precision, whic
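To illustrate the two precision criteria named above, the following sketch computes the A-optimality (trace) and E-optimality (largest eigenvalue) scores of the coordinate cofactor matrix Qxx = (AᵀPA)⁻¹ for a hypothetical set of baselines; the design and weight matrices are random placeholders, not the study's network.

```python
# Minimal numpy sketch (assumed, not the study's code) of the A- and
# E-optimality precision criteria for a GNSS network configuration.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical network: 8 baseline observations, 4 unknown coordinate parameters
A = rng.standard_normal((8, 4))              # design (configuration) matrix
P = np.diag(rng.uniform(0.5, 2.0, 8))        # observation weight matrix

Qxx = np.linalg.inv(A.T @ P @ A)             # cofactor matrix of the unknowns

a_criterion = np.trace(Qxx)                  # A-optimality score (smaller is better)
e_criterion = np.linalg.eigvalsh(Qxx).max()  # E-optimality score (smaller is better)

print(f"A-optimality (trace): {a_criterion:.4f}")
print(f"E-optimality (max eigenvalue): {e_criterion:.4f}")
```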
Leishmaniasis is one of the important parasitic diseases, affecting mainly people of low social class in developing countries, and is more prevalent and endemic in the tropical and subtropical regions of the Old World and the New World. Despite its broad distribution in Iraq, little is known about the genetic characteristics of the causative agents. This study therefore aimed to evaluate the genetic variety of two Iraqi Leishmania tropica isolates based on the heat shock protein 70 (HSP70) gene sequence, in comparison with sequence data recorded for universal isolates. After amplification and sequencing of the HSP70 gene, the obtained sequences were aligned with homologous Leishmania sequences retrieved from NCBI using BLAST. The analysis results showed the presence of particular g
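As an illustration of the alignment step, the sketch below shows how a sequenced fragment could be queried against NCBI with BLAST through Biopython and the top hits inspected; the query sequence is a made-up placeholder, not the study's data, and the call needs network access to NCBI.

```python
# Illustrative sketch only: querying a sequence fragment against NCBI BLAST
# with Biopython and listing the best hits. The sequence is a placeholder.
from Bio.Blast import NCBIWWW, NCBIXML

query_seq = "ATGGCAAAAGCTCCAGCCGTTGGTATCGACCTGGGTACCACCTACTCC"  # placeholder fragment

result_handle = NCBIWWW.qblast("blastn", "nt", query_seq)   # requires internet access
record = NCBIXML.read(result_handle)

for alignment in record.alignments[:5]:
    best_hsp = alignment.hsps[0]
    identity = 100.0 * best_hsp.identities / best_hsp.align_length
    print(f"{alignment.title[:60]}  identity={identity:.1f}%  e={best_hsp.expect:.2g}")
```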
Biometrics represent the most practical method for swiftly and reliably verifying and identifying individuals based on their unique biological traits. This study addresses the increasing demand for dependable biometric identification systems by introducing an efficient approach for automatically recognizing ear patterns using Convolutional Neural Networks (CNNs). Despite the widespread adoption of facial recognition technologies, the distinct features and consistency inherent in ear patterns provide a compelling alternative for biometric applications. Employing CNNs in our research automates the identification process, enhancing accuracy and adaptability across various ear shapes and orientations. The ear, being visible and easily captured in
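For illustration, the sketch below defines a small Keras CNN of the kind that could classify cropped ear images into subject identities; the input size, layer widths, and number of enrolled subjects are placeholder choices, not the architecture used in this study.

```python
# Hedged sketch (not the paper's architecture): a small CNN classifier for
# cropped ear images. All sizes below are placeholder assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_SUBJECTS = 50                                   # hypothetical number of identities

model = models.Sequential([
    layers.Input(shape=(96, 96, 1)),                # grayscale ear crop
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_SUBJECTS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels), epochs=20)
```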
Facial emotion recognition has many real-life applications, such as human-robot interaction, e-learning, healthcare, and customer services. The task of facial emotion recognition is not easy because of the difficulty of determining an effective feature set that can accurately recognize the emotion conveyed by a facial expression. Graph mining techniques are exploited in this paper to solve the facial emotion recognition problem. After determining the positions of facial landmarks in the face region, twelve different graphs are constructed using four facial components to serve as the source for a sub-graph mining stage using the gSpan algorithm. In each group, the discriminative set of sub-graphs is selected and fed to a Deep Belief Network (DBN) f
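As a sketch of the graph-construction step, the example below builds one facial-component graph with networkx, with landmarks as nodes and edges between spatially close landmarks; the landmark coordinates and distance threshold are made-up assumptions rather than the paper's exact construction.

```python
# Illustrative sketch (not the paper's construction): one facial-component
# graph whose nodes are landmarks and whose edges connect close landmarks.
import itertools
import networkx as nx

# Hypothetical (x, y) positions of mouth landmarks detected in the face region
mouth_landmarks = {0: (48, 60), 1: (54, 58), 2: (60, 57), 3: (66, 58),
                   4: (72, 60), 5: (66, 64), 6: (60, 65), 7: (54, 64)}

def build_component_graph(landmarks, max_dist=8.0):
    g = nx.Graph()
    for idx, pos in landmarks.items():
        g.add_node(idx, pos=pos)
    for (i, pi), (j, pj) in itertools.combinations(landmarks.items(), 2):
        dist = ((pi[0] - pj[0]) ** 2 + (pi[1] - pj[1]) ** 2) ** 0.5
        if dist <= max_dist:
            g.add_edge(i, j, weight=round(dist, 2))
    return g

mouth_graph = build_component_graph(mouth_landmarks)
print(mouth_graph.number_of_nodes(), "nodes,", mouth_graph.number_of_edges(), "edges")
# Graphs like this one would then be mined for frequent sub-graphs (e.g. with gSpan)
```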
The DFT method at the B3LYP/6-31G level was applied in a hypothetical study of the design of six carbon nanotube materials based on [8]circulene, through the cyclic polymerization of two and three [8]circulene molecules. The optimized structure of [8]circulene is saddle-shaped. The design reactions of the six carbon nanotubes were evaluated by calculating the thermodynamic quantities (ΔS, ΔG, and ΔH), and the stability of these hypothetical nanotubes was assessed from the value of the HOMO energy level. The nanotubes obtained have the most efficient energy gap, making them potentially useful for solar cell applications.
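As an illustration of how such a gap is extracted, the sketch below runs a B3LYP/6-31G calculation in PySCF and reports the HOMO, LUMO, and gap; a small placeholder molecule (water) stands in for the [8]circulene-based structures, whose geometries are not reproduced here.

```python
# Hedged sketch, not the paper's calculation: obtaining a HOMO-LUMO gap from a
# B3LYP/6-31G run in PySCF. Water is a placeholder for the nanotube structures.
from pyscf import gto, dft

HARTREE_TO_EV = 27.2114

mol = gto.M(atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587",
            basis="6-31g")
mf = dft.RKS(mol)
mf.xc = "b3lyp"
mf.kernel()

occupied = mf.mo_energy[mf.mo_occ > 0]
virtual = mf.mo_energy[mf.mo_occ == 0]
homo, lumo = occupied.max(), virtual.min()
print(f"HOMO = {homo * HARTREE_TO_EV:.2f} eV, LUMO = {lumo * HARTREE_TO_EV:.2f} eV, "
      f"gap = {(lumo - homo) * HARTREE_TO_EV:.2f} eV")
```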