In recent years, research on the congestion problem in 4G and 5G networks has grown, especially work based on artificial intelligence (AI). Although 4G with LTE is considered a mature technology, continuous infrastructure improvement has led to the emergence of 5G networks. The large-scale services provided to industries, Internet of Things (IoT) applications, and smart cities, which involve large amounts of exchanged data, large numbers of connected devices per area, and high data rates, have brought their own problems and challenges, especially congestion. In this context, AI models can be considered one of the main techniques for solving network congestion problems. Since AI technologies can extract relevant features from data and handle huge amounts of data, integrating communication networks with AI to solve the congestion problem appears promising and deserves exploration. This paper reviews how AI technologies can be used to solve the congestion problem in 4G and 5G networks. We examine previous studies addressing network congestion, including congestion prediction, congestion control, congestion avoidance, and TCP development for congestion control. Finally, we discuss the future of AI technologies in 4G and 5G networks for solving congestion problems and identify research issues that need further study.
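As background for the TCP congestion-control work the survey covers, the classical additive-increase/multiplicative-decrease (AIMD) behavior that AI-based schemes aim to improve on can be sketched in a few lines. All parameters below (capacity, increase step, decrease factor) are illustrative assumptions, not values from any standard or from the surveyed papers.

```python
def aimd(capacity=20, rounds=100, increase=1.0, decrease=0.5):
    """Simulate an AIMD congestion window against a fixed link capacity."""
    cwnd = 1.0
    history = []
    for _ in range(rounds):
        if cwnd > capacity:      # congestion signal (e.g. packet loss)
            cwnd *= decrease     # multiplicative decrease
        else:
            cwnd += increase     # additive increase
        history.append(cwnd)
    return history

trace = aimd()
print(len(trace), max(trace))    # sawtooth peaks just above capacity
```

The resulting sawtooth trace is the baseline behavior that learning-based congestion predictors try to anticipate rather than react to.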
Banks are among the public services that must be available in a city to ensure easy financial dealings between citizens and state departments, among the state departments themselves, and among citizens, and to ensure easy access to them. It is therefore very important to choose the best location for a bank, one that serves the largest number of people while providing easy access. Due to the difficulty of obtaining accurate information on exact coordinates in the country's specific projection, the researcher resorts to a notional study using some of the files available in the ArcView program.
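The site-selection idea described above is essentially a facility-location problem: among candidate sites, pick the one minimizing total population-weighted distance. A minimal sketch follows; the coordinates and population weights are invented illustration data, not the real city data from the ArcView files mentioned in the abstract.

```python
import math

def best_site(candidates, population_points):
    """Pick the candidate minimizing total population-weighted distance.

    population_points: list of (x, y, population) tuples.
    """
    def total_cost(site):
        sx, sy = site
        return sum(p * math.hypot(sx - x, sy - y)
                   for x, y, p in population_points)
    return min(candidates, key=total_cost)

# Hypothetical population clusters and candidate bank sites
people = [(0, 0, 500), (4, 0, 300), (2, 3, 200)]
sites = [(0, 0), (2, 1), (4, 4)]
print(best_site(sites, people))  # → (0, 0), nearest the largest cluster
```

In a real GIS workflow the distances would come from the road network rather than straight lines, but the objective function is the same.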
Two simple, sensitive, accurate, and precise spectrophotometric methods have been developed for the determination of chlorpromazine HCl in pure form and in pharmaceutical formulations. The first method involves treating the cited drug with a measured excess of permanganate in acid medium; the unreacted oxidant is measured at 525 nm. The second method involves the reaction of the drug with potassium permanganate in the presence of sodium hydroxide to produce a bluish-green manganate that is measurable at 610 nm. All experimental variables affecting the formation of the manganate ions were investigated and the conditions optimized. The working linearity ranges were 5-45 µg/mL and 1-20 µg/mL for the two methods, respectively.
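The calibration arithmetic behind spectrophotometric assays of this kind follows the Beer-Lambert law: fit absorbance versus concentration over the linear range, then invert the line for an unknown sample. The data points below are invented for illustration, not the paper's measurements.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit, returning (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical standards over the 5-45 µg/mL linear range (method 1)
conc = [5, 15, 25, 35, 45]              # µg/mL
absb = [0.11, 0.32, 0.52, 0.73, 0.94]   # absorbance at 525 nm (made up)
m, b = fit_line(conc, absb)
unknown_conc = (0.43 - b) / m           # invert for a sample with A = 0.43
print(round(unknown_conc, 1))
```

The same procedure applies to the second method at 610 nm over its own 1-20 µg/mL range.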
Different solvents (light naphtha, n-heptane, and n-hexane) were used to treat Iraqi atmospheric oil residue by the deasphalting process. Oil residue from the Al-Dura refinery, with a specific gravity of 0.9705, API gravity of 14.9, and 0.5 wt.% sulfur content, was used. Deasphalted oil (DAO) was examined on a laboratory scale using these solvents under different operating conditions (temperature, solvent concentration, solvent-to-oil ratio, and duration). This study investigates the effects of these parameters on asphaltene yield. The results show that, for all solvents, an increase in temperature increases the extracted asphaltene yield. The largest reduction in asphaltene content is obtained with the hexane solvent at an operating temperature of 90 °C.
The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. To decrease storage costs and to make ECG signals suitable for transmission through common communication channels, the volume of ECG data must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals, in which different transforms are used. First, the 1-D ECG data are segmented and aligned into a 2-D data array; then a 2-D mixed transform is applied to compress the data.
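The segment-and-transform pipeline described above can be sketched with a plain 2-D DCT standing in for the paper's mixed transform: reshape the 1-D signal into a 2-D array of aligned segments, transform, keep only the largest coefficients, and invert. The signal, segment width, and keep ratio are all illustrative assumptions.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n."""
    k = np.arange(n)[:, None]
    D = np.cos(np.pi * (2 * np.arange(n)[None, :] + 1) * k / (2 * n))
    D[0] *= 1 / np.sqrt(n)
    D[1:] *= np.sqrt(2 / n)
    return D

def compress_2d(ecg_1d, width, keep_ratio=0.1):
    """Segment a 1-D signal into rows, 2-D-DCT, threshold, reconstruct."""
    n = (len(ecg_1d) // width) * width
    X = ecg_1d[:n].reshape(-1, width)            # 1-D → 2-D array
    Dr, Dc = dct_matrix(X.shape[0]), dct_matrix(width)
    C = Dr @ X @ Dc.T                            # 2-D DCT
    thresh = np.quantile(np.abs(C), 1 - keep_ratio)
    Ck = np.where(np.abs(C) >= thresh, C, 0.0)   # keep largest coefficients
    Xr = Dr.T @ Ck @ Dc                          # inverse 2-D DCT
    return Xr.ravel(), np.count_nonzero(Ck) / C.size

t = np.linspace(0, 4, 512)
ecg = np.sin(2 * np.pi * 2 * t) + 0.2 * np.sin(2 * np.pi * 10 * t)  # stand-in
rec, density = compress_2d(ecg, width=32, keep_ratio=0.1)
print(density)   # fraction of coefficients retained, ~0.1
```

Aligning segments into rows lets the transform exploit beat-to-beat similarity (vertical correlation) in addition to within-beat smoothness, which is the motivation for going from 1-D to 2-D.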
This study conducted a phytotoxicity experiment using kerosene as a model total petroleum hydrocarbon (TPH) pollutant at different concentrations (1% and 6%), aeration rates (0 and 1 L/min), and retention times (7, 14, 21, 28, and 42 days), carried out in a subsurface-flow (SSF) constructed wetland planted with barley. The greatest elimination, 95.7%, was recorded at the 1% kerosene level and an aeration rate of 1 L/min after 42 days of exposure, whereas it was 47% in the control test without plants. Furthermore, the elimination efficiencies of hydrocarbons from the soil ranged between 34.155% and 95.7% for all TPH (kerosene) concentrations at aeration rates of 0 and 1 L/min.
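The elimination percentages quoted above follow the standard removal-efficiency formula E = (C₀ − C) / C₀ × 100. A one-line sketch, with hypothetical concentrations chosen so the result matches the abstract's quoted 95.7%:

```python
def removal_efficiency(c_initial, c_final):
    """Percent removal of a pollutant between initial and final concentration."""
    return (c_initial - c_final) / c_initial * 100

# Hypothetical values: kerosene reduced from 10 g/kg to 0.43 g/kg
print(round(removal_efficiency(10, 0.43), 1))  # → 95.7
```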
Poiseuille's equation was used to obtain an approximate value of bladder pressure for 25 healthy people, compared with the routine indirect methods that use a catheter. This method was found to be cheap, harmless, and easy.
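The relation the study relies on is the Hagen-Poiseuille law for laminar flow in a tube, ΔP = 8μLQ / (πr⁴). A sketch follows; the numeric values are order-of-magnitude physiological assumptions for illustration, not the study's measured data.

```python
import math

def poiseuille_pressure_drop(mu, length, flow, radius):
    """Pressure drop (Pa) for laminar flow in a cylindrical tube."""
    return 8 * mu * length * flow / (math.pi * radius ** 4)

dp = poiseuille_pressure_drop(mu=1e-3,      # viscosity ~ water, Pa·s (assumed)
                              length=0.04,  # urethral length, m (assumed)
                              flow=25e-6,   # flow rate, m³/s = 25 mL/s (assumed)
                              radius=3e-3)  # tube radius, m (assumed)
print(round(dp, 2))                         # pressure drop in pascals
```

Measuring the flow rate externally and solving for ΔP is what makes the approach non-invasive compared with catheterization.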
The ability of pulverized walnut shell to remove oil from aqueous solutions has been studied. The two-phase process consists of using walnut shell as a filtering bed for the accumulation and adsorption of oil onto its surface. Up to 96% oil removal from synthetic wastewater samples was achieved, while test results showed that 75% of the oil can be removed from the actual wastewater discharged from the Al-Duara refinery in the south of Baghdad.
Simulation experiments are a means of problem solving in many fields: the process of designing a model of a real system in order to follow it and identify its behavior through certain models and formulas, written as a repeating software procedure with a number of iterations. The aim of this study is to build a model that deals with behavior exhibiting heteroskedasticity by studying the APGARCH and NAGARCH models using Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000) through the stages of time-series analysis (identification, estimation, diagnostic checking, and prediction). The data were generated using the estimated parameters.
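Generating data from such a process can be sketched for the APGARCH(1,1) case with Gaussian innovations, where the conditional scale follows σₜᵟ = ω + α(|εₜ₋₁| − γεₜ₋₁)ᵟ + βσₜ₋₁ᵟ. The parameter values below are illustrative, not the study's estimates.

```python
import numpy as np

def simulate_apgarch(n, omega=0.1, alpha=0.1, gamma=0.3, beta=0.8,
                     delta=1.5, seed=0):
    """Simulate an APGARCH(1,1) series with standard-normal innovations."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    eps = np.zeros(n)
    # start sigma^delta near its rough unconditional level
    sig_d = np.full(n, omega / (1 - alpha - beta))
    for t in range(1, n):
        sig_d[t] = (omega
                    + alpha * (abs(eps[t - 1]) - gamma * eps[t - 1]) ** delta
                    + beta * sig_d[t - 1])
        eps[t] = sig_d[t] ** (1 / delta) * z[t]
    return eps

series = simulate_apgarch(1000)
print(len(series))
```

The asymmetry parameter γ makes negative shocks raise volatility more than positive ones, which is the feature distinguishing APGARCH (and NAGARCH) from symmetric GARCH. Replacing `standard_normal` with a heavier-tailed draw gives the non-Gaussian variants the study considers.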
Lowpass spatial filters are adopted to match the noise statistics of the degradation, seeking good-quality smoothed images. This study employs smoothing windows of different sizes and shapes. The study shows that a square-frame window shape gives good-quality smoothing while preserving a certain level of high-frequency components, in comparison with standard smoothing filters.
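The square-frame window can be sketched as a hollow k×k kernel: only the border pixels carry weight, so the centre pixel does not dominate its own smoothed value, which is one way high-frequency detail survives relative to a full box filter. The kernel construction and test image below are illustrative assumptions about the study's window, not its exact definition.

```python
import numpy as np

def frame_kernel(k):
    """Normalized k x k kernel with weight only on the square frame (border)."""
    w = np.ones((k, k))
    w[1:-1, 1:-1] = 0          # hollow out the interior
    return w / w.sum()

def smooth(img, kernel):
    """Direct 2-D correlation with edge padding (no external dependencies)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (padded[i:i + kh, j:j + kw] * kernel).sum()
    return out

img = np.zeros((9, 9)); img[4, 4] = 1.0        # single bright pixel
frame = smooth(img, frame_kernel(5))
box = smooth(img, np.full((5, 5), 1 / 25))
print(frame[4, 4], box[4, 4])
```

On the impulse, the frame kernel spreads all of the energy onto the ring around the pixel, while the box kernel mixes the pixel into its own output; comparing the two responses makes the trade-off between smoothing and detail preservation concrete.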
In this paper, a decoder for a binary BCH code is implemented using a PIC microcontroller for code length n = 127 bits with multiple-error-correction capability; results are presented for correcting up to 13 errors. The Berlekamp-Massey decoding algorithm was chosen for its efficiency. The PIC18F45K22 microcontroller was chosen for the implementation and programmed in assembly language to achieve the highest performance. This makes the BCH decoder implementable as a low-cost module that can be used as part of larger systems. The performance evaluation is presented in terms of the total number of instructions and the bit rate.
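The core of such a decoder is the Berlekamp-Massey algorithm, which finds the shortest LFSR (the error-locator polynomial) that generates the syndrome sequence. A minimal GF(2) version is sketched below as an illustration of the algorithm itself; the paper's implementation works over GF(2⁷) for the n = 127 code and runs in PIC assembly, neither of which is reproduced here.

```python
def berlekamp_massey_gf2(bits):
    """Return the linear complexity (shortest LFSR length) of a GF(2) sequence."""
    c = [1] + [0] * len(bits)   # current connection polynomial C(x)
    b = [1] + [0] * len(bits)   # polynomial before the last length change
    L, m = 0, 1                 # LFSR length; shift since last change
    for n, bit in enumerate(bits):
        # discrepancy between the LFSR's prediction and the actual bit
        d = bit
        for i in range(1, L + 1):
            d ^= c[i] & bits[n - i]
        if d == 0:
            m += 1
        elif 2 * L <= n:        # length must grow: C(x) += x^m * B(x)
            t = c[:]
            for i in range(len(b) - m):
                c[i + m] ^= b[i]
            L, b, m = n + 1 - L, t, 1
        else:                   # same length: C(x) += x^m * B(x)
            for i in range(len(b) - m):
                c[i + m] ^= b[i]
            m += 1
    return L

# Output of the LFSR with polynomial x^3 + x + 1, seed 1,0,0
print(berlekamp_massey_gf2([1, 0, 0, 1, 1, 1, 0, 1]))  # → 3
```

In the full BCH decoder, the same recurrence runs over GF(2⁷) with the 2t syndromes as input, and the roots of the resulting locator polynomial (found by Chien search) give the error positions.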