The complexity and variety of language used in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) difficult. This study presents a complete deep learning pipeline that combines Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures and uses both pre-trained and contextual word embeddings to increase semantic understanding, with the primary aim of improving the accuracy and interpretability of SDG text classification and thereby enabling more effective policy monitoring and research evaluation. Our approach comprises thorough preprocessing, including stemming, stopword removal, and measures to address class imbalance, followed by document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Transformers (BERT), and FastText embeddings. The hybrid BiLSTM-CNN model is trained and evaluated on several benchmark datasets, including SDG-labeled corpora and related external datasets such as GoEmotions and Ohsumed, to provide a complete assessment of the model's generalizability. The study also performs zero-shot prompt-based classification with GPT-3.5/4 and Flan-T5, providing a comprehensive benchmark against current approaches, and runs comparative tests with leading models such as the Robustly Optimized BERT Pretraining Approach (RoBERTa) and Decoding-enhanced BERT with Disentangled Attention (DeBERTa). Experimental results show that the proposed hybrid model achieves competitive performance, with contextual embeddings greatly improving classification accuracy. The study explains model decision processes and improves transparency using interpretability techniques, including SHapley Additive exPlanations (SHAP) analysis and attention visualization. These results emphasize the value of combining prompt engineering techniques with deep learning architectures for effective and interpretable SDG text classification. With potential applications in policy analysis and scientific literature mining more broadly, this work offers a scalable and transparent solution for automating the evaluation of SDG research.
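The abstract describes the architecture but not its implementation, so the following is only a minimal sketch of how a hybrid BiLSTM-CNN text classifier over frozen pre-trained word embeddings (e.g., GloVe or FastText vectors) might be assembled in Keras. The vocabulary size, sequence length, layer widths, and the random `embedding_matrix` placeholder are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (not the authors' code): a hybrid BiLSTM-CNN classifier over
# frozen pre-trained word embeddings such as GloVe or FastText vectors.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000   # assumed vocabulary size
EMBED_DIM = 300      # typical GloVe/FastText dimensionality
MAX_LEN = 256        # assumed padded document length
NUM_CLASSES = 17     # one class per SDG

# Placeholder standing in for a matrix loaded from GloVe/FastText files.
embedding_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(
    VOCAB_SIZE, EMBED_DIM,
    embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
    trainable=False,                       # keep the pre-trained vectors frozen
)(inputs)
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)  # sequence context
x = layers.Conv1D(128, kernel_size=5, activation="relu")(x)           # local n-gram features
x = layers.GlobalMaxPooling1D()(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Contextual BERT-style representations would instead come from a transformer encoder feeding a similar BiLSTM-CNN stack; the abstract leaves that wiring open, so it is not shown here.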
Background: Neural tube defects (NTDs) are thought to be inherited in a multifactorial fashion, i.e. through genetic-environmental interaction. Maternal nutritional deficiencies have long been reported to cause NTDs, especially folate deficiency during early pregnancy, and increasing attention has been paid to the exact mechanism by which this deficiency state causes these defects in the developing embryo. The most significant line of research is that connecting reduced folate and increased homocysteine levels in maternal serum with the risk of having a baby with an NTD. Objectives: To determine the significance of homocysteine levels in Iraqi mothers who gave birth to babies with NTDs as compared with normal controls. Patients, Materials and Methods:
Speech recognition is a very important field with many applications, such as access control for protected areas, banking, transactions over the telephone network, database access services, voice email, investigations, and house control and management. Speaker recognition systems can be used in two modes: to identify a particular person or to verify a person's claimed identity. Family speaker recognition is a recent area within speaker recognition; speakers from the same family often share similar vocal characteristics, which makes it hard to distinguish between them. Today, the scope of speech recognition is largely limited to speech collected from cooperative users in real-world office environments, without adverse microphone or channel impairments.
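Purely as an illustration of the two operating modes mentioned above (identification versus verification), and not as the paper's method, the sketch below applies cosine similarity to fixed-length speaker embeddings; the embedding dimensionality, the 0.7 threshold, and the toy vectors are assumptions.

```python
# Minimal sketch (not from the paper): identification vs. verification decisions
# over fixed-length speaker embeddings (e.g. i-vectors/x-vectors).
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two speaker embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(test_emb: np.ndarray, enrolled: dict[str, np.ndarray]) -> str:
    """Identification: return the enrolled speaker most similar to the test utterance."""
    return max(enrolled, key=lambda name: cosine(test_emb, enrolled[name]))

def verify(test_emb: np.ndarray, claimed_emb: np.ndarray, threshold: float = 0.7) -> bool:
    """Verification: accept the claimed identity only if similarity exceeds a
    threshold (0.7 is an arbitrary illustrative value)."""
    return cosine(test_emb, claimed_emb) >= threshold

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
enrolled = {name: rng.normal(size=128) for name in ("mother", "father", "child")}
test = enrolled["mother"] + 0.1 * rng.normal(size=128)   # noisy utterance from "mother"
print(identify(test, enrolled))                          # -> "mother"
print(verify(test, enrolled["father"]))                  # -> likely False
```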
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use a dynamic methodology, which provides a flexible approach especially suited to the analysis of discrete survival times, to estimate the effect of covariates over time in the survival analysis of dialysis patients with kidney failure until death occurs. The estimation process is based entirely on the Bayesian approach, using two estimation methods: maximum a posteriori (MAP) estimation involving Iteratively Weighted Kalman Filter Smoothing (IWKFS) in combination with the Expectation Maximization (EM) algorithm, while the other
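The abstract does not reproduce the model equations, so the following is only a generic sketch of the kind of dynamic discrete-time survival model that such a Bayesian methodology typically targets; the notation, the logit link, and the random-walk prior are illustrative assumptions rather than the paper's exact specification.

```latex
% Illustrative dynamic discrete-time hazard model (assumed notation, not from the paper).
% Discrete hazard of the event in interval t for patient i with covariate vector x_{it}:
\[
  \lambda_{it} = P\bigl(T_i = t \mid T_i \ge t,\, x_{it}\bigr)
               = \frac{\exp\bigl(\beta_{0t} + x_{it}^{\top}\beta_t\bigr)}
                      {1 + \exp\bigl(\beta_{0t} + x_{it}^{\top}\beta_t\bigr)},
  \qquad t = 1,\dots,q.
\]
% A random-walk (state-space) prior on the time-varying effects is what makes
% Kalman filter smoothing applicable for MAP estimation:
\[
  \beta_t = \beta_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim N(0, Q),
\]
% with hyperparameters such as Q estimated via the EM algorithm.
```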
The water supply network inside a building is of high importance because of its direct contact with the user, and it must be optimally designed to meet users' water needs. This work reviews previous research and scientific theories dealing with the design of water networks inside buildings, from calculating the amount of consumption and the optimal distribution of the network to ways of rationalizing water use by the consumer. Domestic water is pumped from water treatment plants into the public distribution networks, then into a distribution network inside the building, until it is provided to the user. The design of the water supply network inside the building is
Over the past few decades, surveying fieldwork was usually carried out using classical positioning methods to establish horizontal and vertical geodetic networks. However, these conventional positioning techniques have many drawbacks: they are time-consuming, costly, and require massive effort. The Global Navigation Satellite System (GNSS) was therefore developed to speed up fieldwork, increase accuracy, and overcome the difficulties inherent in almost every surveying task. This research assesses the accuracy of local geodetic networks established with different GNSS techniques, such as Static, Precise Point Positioning, Post-Processing Kinematic, the Session method, a
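Only as an illustration of how such an accuracy assessment can be quantified (the paper's actual procedure and datasets are not given here), the sketch below computes horizontal and vertical RMSE of coordinates observed with one GNSS technique against reference control points; the synthetic coordinates and error magnitudes are placeholders.

```python
# Minimal sketch (not from the paper): RMSE of GNSS-derived coordinates against
# reference control points, assuming a projected coordinate system in metres.
import numpy as np

def rmse(errors: np.ndarray) -> float:
    """Root-mean-square error of a 1-D array of coordinate differences."""
    return float(np.sqrt(np.mean(errors ** 2)))

# Toy data: reference (E, N, h) coordinates and the same points as observed with
# one GNSS technique (values are synthetic placeholders, not survey results).
rng = np.random.default_rng(1)
reference = rng.uniform(0, 1000, size=(10, 3))
observed = reference + rng.normal(scale=[0.01, 0.01, 0.03], size=(10, 3))  # simulated errors

diff = observed - reference
horizontal_rmse = rmse(np.linalg.norm(diff[:, :2], axis=1))  # 2-D position error per point
vertical_rmse = rmse(diff[:, 2])                             # height error per point
print(f"Horizontal RMSE: {horizontal_rmse:.3f} m, Vertical RMSE: {vertical_rmse:.3f} m")
```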
The demand for internet applications has increased rapidly, and providing quality of service (QoS) for such varied applications is a challenging task. One important factor that significantly affects QoS is the transport layer, which provides end-to-end data transmission across a network. Currently, the most common transport protocols used by internet applications are TCP (Transmission Control Protocol) and UDP (User Datagram Protocol). There are also more recent transport protocols, such as DCCP (Datagram Congestion Control Protocol), SCTP (Stream Control Transmission Protocol), and TFRC (TCP-Friendly Rate Control), which are in the standardization process of the Internet Engineering Task Force (IETF).
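As a side illustration of the TCP/UDP distinction discussed above (not code from the paper), the sketch below shows the basic socket calls for the two protocols in Python; the host and port values are placeholders and a listener would be needed on the other end.

```python
# Minimal sketch (not from the paper): the basic API difference between the two
# most common transport protocols named above. Host/port values are placeholders.
import socket

HOST, TCP_PORT, UDP_PORT = "127.0.0.1", 5000, 5001

def send_over_tcp(payload: bytes) -> None:
    """TCP: connection-oriented, reliable, ordered byte stream."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((HOST, TCP_PORT))   # three-way handshake before any data flows
        s.sendall(payload)            # delivery and ordering handled by TCP

def send_over_udp(payload: bytes) -> None:
    """UDP: connectionless datagrams, no delivery or ordering guarantees."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (HOST, UDP_PORT))  # fire-and-forget; loss is the app's problem

if __name__ == "__main__":
    # Shown only to illustrate the calling pattern; a real run needs servers listening.
    send_over_udp(b"sensor reading 42")
    # send_over_tcp(b"file chunk ...")  # would raise ConnectionRefusedError without a server
```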
Multimedia applications impose different QoS requirements (e.g., bounded end-to-end delay and jitter) and need an enhanced transport layer protocol that handles packet loss, minimizes errors, manages network congestion, and transmits efficiently. Across an IP network, the transport layer protocol provides data transmission and affects the QoS delivered to the application at hand. The most common transport layer protocols used by Internet applications are TCP and UDP. There are also advanced transport layer protocols such as DCCP and TFRC. The authors evaluated the performance of UDP, DCCP, SCTP, and TFRC over wired networks for three traffic flows: data transmission, video streaming, and voice over IP. The evaluation criteria were throughput
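The abstract is cut off before listing all evaluation criteria, so the following is only a hedged sketch of how typical QoS metrics (throughput, mean delay, jitter, loss) could be computed from a per-packet trace; the trace format and the toy values are assumptions, not the authors' data or scripts.

```python
# Minimal sketch (not the authors' evaluation code): common QoS metrics from a
# per-packet trace of (send_time_s, recv_time_s or None, size_bytes) tuples.
from statistics import mean

trace = [            # synthetic toy trace; a lost packet has recv_time = None
    (0.00, 0.030, 1200),
    (0.02, 0.055, 1200),
    (0.04, None,  1200),   # lost packet
    (0.06, 0.092, 1200),
]

received = [(s, r, n) for s, r, n in trace if r is not None]
delays = [r - s for s, r, _ in received]
duration = max(r for _, r, _ in received) - min(s for s, _, _ in received)

throughput_bps = 8 * sum(n for _, _, n in received) / duration               # bits per second
mean_delay_ms = 1000 * mean(delays)                                          # one-way delay
jitter_ms = 1000 * mean(abs(d2 - d1) for d1, d2 in zip(delays, delays[1:]))  # delay variation
loss_ratio = 1 - len(received) / len(trace)

print(f"throughput={throughput_bps:.0f} bps, delay={mean_delay_ms:.1f} ms, "
      f"jitter={jitter_ms:.1f} ms, loss={loss_ratio:.0%}")
```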
This project sought to fabricate a flexible gas sensor based on a network of short functionalized multi-walled carbon nanotubes (f-MWCNTs) for nitrogen dioxide gas detection. The network was prepared by the filtration-from-suspension (FFS) method and modified by coating it with a layer of polypyrrole (PPy) conductive polymer prepared by oxidative chemical polymerization to improve the properties of the network. The structural, optical, and morphological properties of the f-MWCNTs and f-MWCNTs/PPy networks were studied using X-ray diffraction (XRD), Fourier-transform infrared (FTIR) spectroscopy, and atomic force microscopy (AFM). XRD proved that the structure of the f-MWCNTs is unaffected by the synthesis procedure. The FTIR spectra verified the existence of
Deconstruction is a theory that appeared after structuralism and seeks, through a set of key principles, to reach the intended, central meaning of a text from different perspectives. In other words, deconstruction is a critical literary theory and a contemporary philosophical approach that work together to arrive at the exact concept of the text, which is achieved through reading and analyzing it. Deconstruction has therefore specified a number of principles for reaching the exact meaning of the text.
Introduction:
The theory of deconstruction is a theory that emerged after structuralism and seeks, through
The research deals with the crisis of the global recession from different facets and calls for the need to think outside conventional theory and to find theoretical arguments that accommodate the evolution of life, globalization, technological change, individuals' standard of living, and the size of the disparity in income distribution, not only at the national level but also at the global level, without regard to the potential resistance from the usual classical thought. The greater the returns to the factors of production, the more consumption increases; the marginal propensity to consume may rise, and rise at greater rates, among low-income segments (the mouths of the poor) wi