Data scarcity is a major challenge when training deep learning (DL) models. DL demands large amounts of data to achieve strong performance, yet many applications have too little data to train DL frameworks. Labeled data usually requires manual annotation by human annotators with extensive domain knowledge, a process that is costly, time-consuming, and error-prone. Every DL framework needs a significant amount of labeled data to learn representations automatically; in general, more data yields a better model, although performance also depends on the application. This issue is the main barrier that leads many applications to dismiss DL. Having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques. Next, the types of DL architectures are introduced. After that, state-of-the-art solutions to the lack of training data are presented, including Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset.
The survey ends with a list of applications that suffer from data scarcity, and for each it proposes several alternatives for generating more data, covering Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review to offer a comprehensive overview of strategies for tackling data scarcity in DL.
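Among the oversampling techniques the survey lists, the classic SMOTE idea behind DeepSMOTE is simple to illustrate: synthesize new minority-class samples by interpolating between a minority sample and one of its nearest minority neighbors. Below is a minimal plain-NumPy sketch of that interpolation step (not the DeepSMOTE pipeline, which performs the interpolation in a learned latent space); the function name and parameters are illustrative, not from the survey.

```python
import numpy as np

def smote_oversample(X, n_new, k=3, seed=None):
    """Generate n_new synthetic minority samples by interpolating each
    chosen sample toward one of its k nearest minority-class neighbors.
    A plain-NumPy sketch of classic SMOTE, not the DeepSMOTE method."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    # Pairwise distances within the minority class.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-matches
    neighbors = np.argsort(d, axis=1)[:, :k]     # k nearest neighbors per sample
    synthetic = np.empty((n_new, X.shape[1]))
    for i in range(n_new):
        j = rng.integers(len(X))                 # pick a minority sample
        nb = X[rng.choice(neighbors[j])]         # pick one of its neighbors
        lam = rng.random()                       # interpolation factor in [0, 1]
        synthetic[i] = X[j] + lam * (nb - X[j])
    return synthetic

# Toy minority class: four corners of the unit square.
minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
new_pts = smote_oversample(minority, n_new=6, k=2, seed=0)
print(new_pts.shape)  # (6, 2)
```

Because each synthetic point lies on a segment between two existing minority samples, the generated data stays inside the convex hull of the minority class, which is both the strength and the known limitation of this family of methods.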
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis is therefore increasingly important, because it infers the time and place of crimes from collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and statistical analysis, are not efficient enough to predict accurately when and where a crime will take place. When machine learning and data mining methods were deployed in crime analysis, prediction accuracy increased dramatically. In this study, various types of criminal analysis and prediction using several machine learning and data mining techniques, based o
Software-Defined Networking (SDN) has proven its superiority in addressing the problems of conventional networks, such as scalability, agility, and security. This advantage of SDN stems from the separation of the control plane from the data plane. Although many papers and studies focus on SDN management, monitoring, control, and QoS improvement, few describe what they use to generate traffic and measure network performance. The literature also lacks comparisons between the tools and the
Special exercises in individual games are an important pillar in learning their basic skills. The aim of the research is to prepare special exercises using tools and to determine their effect on learning the skill of landing with a tucked backward salto to a stand on the horizontal bar. The research hypothesizes significant differences between the pre- and post-tests in learning the skill of landing with a tucked backward salto to a stand on the horizontal bar, in favor of the post-test. The researchers used the experimental method with a single-sample design to suit the research problem, as the researc
Deep learning algorithms have recently achieved considerable success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple image types (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning was used, followed by fine-tuning, with architectures pre-trained on the well-known ImageNet database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consist of five classes: one SAR image class (houses) and four non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv
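The pattern described above, a frozen pre-trained backbone feeding a newly trained classifier head, can be sketched without the actual VGG16 weights. In the hedged NumPy sketch below, a fixed random projection stands in for the frozen convolutional base (an illustrative stand-in, not VGG16 or any ImageNet model), and a logistic-regression head is trained by gradient descent on the extracted features; all data and dimensions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "backbone": a stand-in for the pre-trained feature extractor.
# Its weights are never updated; only the new head is trained.
W_frozen = rng.normal(size=(16, 8))

def extract_features(x):
    # Fixed projection + ReLU, mimicking "features from a frozen base".
    return np.maximum(x @ W_frozen, 0.0)

# Toy two-class data: class 1 has a mean shift, so it is separable.
X = rng.normal(size=(200, 16))
y = (rng.random(200) < 0.5).astype(float)
X[y == 1] += 1.0

F = extract_features(X)            # features from the frozen backbone

# New classifier head: logistic regression trained by gradient descent.
w = np.zeros(F.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    g = p - y                      # gradient of the log-loss
    w -= 0.1 * (F.T @ g) / len(y)
    b -= 0.1 * g.mean()

p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
acc = ((p > 0.5) == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

The design point this sketch isolates is that gradient updates touch only `w` and `b`: the backbone's parameters stay fixed, which is what makes transfer learning viable with small datasets.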
Traditionally, wireless networks and optical fiber networks have been independent of each other. Wireless networks are designed to meet specific service requirements while dealing with weak physical transmission, and to maximize system resources to ensure cost effectiveness and end-user satisfaction. In optical fiber networks, on the other hand, research efforts have instead concentrated on simple, low-cost designs, future-proofness against legacy constraints, and rich services and applications enabled by optical transparency. The ultimate goal of providing access to information whenever, and in whatever form, it is required was given significant consideration; this not only increases the requirements driving the technological convergence of wireless and optical networks but
Deep learning convolutional neural networks have been widely used to recognize and classify voice. Various techniques have been used together with convolutional neural networks to prepare voice data before the training process when developing a classification model. However, not all models produce good classification accuracy, as there are many types of voice and speech. Classification of Arabic alphabet pronunciation is one such voice task, and accurate pronunciation is required in learning Qur'an reading. Thus, processing the pronunciations and training on the processed data require a specific approach. To overcome this issue, a method based on padding and a deep learning convolutional neural network is proposed to
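The padding step mentioned above addresses a practical constraint: recordings of spoken letters vary in length, but a convolutional network needs fixed-size inputs to batch them. A minimal sketch of that preprocessing follows; the target length and function name are hypothetical, since the abstract does not specify the paper's exact padding scheme.

```python
import numpy as np

def pad_to_length(signal, target_len):
    """Zero-pad (or truncate) a 1-D audio signal to target_len samples,
    so variable-length recordings stack into one fixed-shape batch.
    target_len is an illustrative parameter, not taken from the paper."""
    signal = np.asarray(signal, dtype=float)
    if len(signal) >= target_len:
        return signal[:target_len]          # truncate long recordings
    return np.pad(signal, (0, target_len - len(signal)))  # zero-pad short ones

# Three recordings of different lengths, padded to 6 samples each.
batch = [np.ones(3), np.ones(5), np.ones(8)]
padded = np.stack([pad_to_length(s, 6) for s in batch])
print(padded.shape)  # (3, 6)
```

Trailing zero-padding is the simplest choice; alternatives such as centering the signal or padding with repeated frames change what the network sees at the edges and are a design decision in their own right.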
Smart systems are the trend for modern organizations and should meet the quality of service they are expected to deliver. The Internet of Everything (IoE) has helped smart systems adopt microcontrollers to improve performance. Analyzing and controlling data in such systems are critical issues. In this study, a survey of IoE systems was conducted to show how to apply a suitable model that meets such systems' requirements. Several microcontroller boards are analyzed based on their known features. Factors for applying IoE devices are defined, such as connectivity, power consumption, compatibility, and cost. Different methods are explained as an overview of applying IoE systems. Further, different approaches for applying IoE technology
Blogs have emerged as a powerful technological tool for English as a Foreign Language (EFL) classrooms. This literature review provides an overview of the use of blogs as learning tools in EFL classrooms. The study examines the benefits and challenges of using blogs for language learning and the different types of blogs that can be used for this purpose. It offers suggestions for teachers interested in using blogs as learning tools in their EFL classrooms. The findings suggest that blogs are a valuable and effective tool for language learning, particularly in promoting collaboration, communication, and motivation.