Microservice architecture offers many advantages, especially for business applications, due to its flexibility, expandability, and loosely coupled structure, which eases maintenance. However, several disadvantages stem from the nature of microservices: their independence can hinder meaningful communication and make data synchronization more challenging. This paper addresses these issues by proposing containerized microservices in an asynchronous event-driven architecture. The architecture encloses microservices in containers and implements an event manager that records all events in an event log to reduce errors in the application. Experimental results show a decline in response time compared with two other benchmark architectures, as well as a reduction in error rate.
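The event-manager-with-event-log pattern described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the class and event names are hypothetical, and real deployments would use a durable broker rather than an in-process queue.

```python
import queue
import time

class EventManager:
    """Minimal event manager: records every published event in an
    append-only log and asynchronously dispatches it to subscribers."""

    def __init__(self):
        self.log = []                 # append-only event log for traceability
        self.subscribers = {}         # event type -> list of handler callables
        self._queue = queue.Queue()   # pending events awaiting delivery

    def subscribe(self, event_type, handler):
        self.subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        event = {"type": event_type, "payload": payload, "ts": time.time()}
        self.log.append(event)        # log first, so failures remain traceable
        self._queue.put(event)

    def dispatch_pending(self):
        """Deliver all queued events to their subscribers (one worker pass)."""
        while not self._queue.empty():
            event = self._queue.get()
            for handler in self.subscribers.get(event["type"], []):
                handler(event["payload"])

# Hypothetical example: an "orders" service publishes, "billing" consumes.
bus = EventManager()
charges = []
bus.subscribe("order.created", lambda p: charges.append(p["total"]))
bus.publish("order.created", {"order_id": 1, "total": 99.5})
bus.dispatch_pending()
```

Because every event is appended to the log before delivery, a failed handler can be replayed from the log, which is the error-reduction mechanism the abstract alludes to.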
The need to exchange large amounts of real-time data is constantly increasing in wireless communication. While traditional radio transceivers are not cost-effective and their components must be tightly integrated, software-defined radio (SDR) transceivers have opened up a new class of wireless technologies with high security. This study designs an SDR transceiver built using one type of modulation, 16-QAM, and adds a security subsystem using one type of chaos map, the logistic map, because it is a very simple nonlinear dynamical equation that generates a random key, which is XORed with the originally transmitted data to protect the data during transmission. At th
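The logistic-map keystream-plus-XOR scheme described above can be sketched as below. The seed `x0` and growth rate `r` are illustrative assumptions (the abstract does not give the paper's parameters); any `x0` in (0, 1) and `r` near 4 keep the map in its chaotic regime.

```python
def logistic_keystream(x0, r, nbytes):
    """Generate a pseudo-random byte stream from the logistic map
    x_{n+1} = r * x_n * (1 - x_n), with x0 in (0, 1) and r near 4."""
    x = x0
    key = bytearray()
    for _ in range(nbytes):
        x = r * x * (1.0 - x)
        key.append(int(x * 256) % 256)   # quantize the chaotic state to a byte
    return bytes(key)

def xor_bytes(data, key):
    """XOR the data with the keystream; applying it twice recovers the data."""
    return bytes(d ^ k for d, k in zip(data, key))

# Encryption and decryption are the same operation (XOR is self-inverse).
plaintext = b"hello SDR"
key = logistic_keystream(x0=0.3141, r=3.9999, nbytes=len(plaintext))
ciphertext = xor_bytes(plaintext, key)
recovered = xor_bytes(ciphertext, key)
```

The receiver only needs the same `(x0, r)` pair to regenerate the identical keystream, which is why chaotic maps are attractive as lightweight key generators in SDR links.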
Researchers have taken an increased interest in recent years in determining the optimum sample size needed to obtain sufficient estimation accuracy and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated with the sample size given by each method on high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat
Image segmentation is one of the main and essential goals of digital image processing. It seeks to partition the studied image into multiple, more useful regions, summarizing the regions of interest in satellite images. These are multispectral images supplied by satellites using the principle of remote sensing, which has become one of the important concepts whose applications are relied upon in most necessities of daily life, especially after the rapid developments witnessed
Offline handwritten signature is a type of behavioral biometric based on an image. Its problem is verification accuracy, because once an individual signs, he/she seldom produces the same signature twice; this is referred to as intra-user variability. This research aims to improve the recognition accuracy of offline signatures. The proposed method uses both signature length normalization and the histogram of oriented gradients (HOG) to improve accuracy. For verification, a deep-learning technique using a convolutional neural network (CNN) is exploited to build the reference model for future prediction. Experiments are conducted by utilizing 4,000 genuine as well as 2,000 skilled forged signatu
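The HOG descriptor mentioned above can be sketched as follows. This is a minimal illustrative version (per-pixel gradients, one unsigned-orientation histogram per cell, global L2 normalization), not the paper's exact pipeline; the cell size and bin count are assumed values.

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Minimal histogram-of-oriented-gradients sketch: compute gradient
    magnitude and orientation per pixel, then a magnitude-weighted
    orientation histogram per cell, concatenated and L2-normalized."""
    gy, gx = np.gradient(img.astype(float))          # gradients along rows, cols
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180       # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-12)           # normalize the descriptor

# Toy 16x16 "signature" image with a single vertical edge.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
feats = hog_features(img)
```

The fixed-length feature vector produced this way is what a downstream CNN or classifier consumes, regardless of the original signature's size, which is why HOG pairs naturally with the length normalization the abstract describes.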
The research aims to identify the role of the dimensions of financial inclusion in achieving competitive advantage through an exploratory study of the views of a sample of customers of 20 Algerian commercial banks, and the relationship between its dimensions (access, usage, and quality) and competitive advantage. The research is based on the descriptive analytical approach. A questionnaire was adopted as the main tool for collecting data and information from a sample of 377.
The research showed several results, the most important of which is a strong correlation between the three dimensions of financial inclusion combined and the competitive advantage of the Algerian commercial banks, and explained t
This paper deals with constructing a mixed probability distribution from an exponential distribution with scale parameter (β) and a Gamma distribution with (2, β), with mixing proportions ( . First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution ( , β) are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Square Method (DLSM). The comparison is done using a simulation procedure, and all the results are presented in tables.
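The mixing-proportion expression is lost in the text above; assuming a two-component mixture with weight $p \in (0,1)$ on the exponential component, the density and reliability functions referred to take the form (a sketch of the standard construction, not necessarily the paper's notation):

```latex
f(x) = p \,\frac{1}{\beta} e^{-x/\beta}
     + (1-p)\,\frac{x}{\beta^{2}} e^{-x/\beta}, \qquad x > 0,

R(x) = p \, e^{-x/\beta}
     + (1-p)\left(1 + \frac{x}{\beta}\right) e^{-x/\beta}.
```

Here the second terms are the Gamma(2, β) density and survival function; setting $p = 1$ or $p = 0$ recovers the pure exponential or pure Gamma case, respectively.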
The estimation of linear regression parameters is usually based on the ordinary least squares method, which rests on several basic assumptions; the accuracy of the estimated model parameters therefore depends on the validity of these assumptions, among them homogeneity of variance and normally distributed errors. These assumptions are not achievable when studying a problem that may involve complex data drawn from more than one model, and the use of the model then becomes unrealistic. The most successful technique for this purpose has proved to be the robust estimation method of the minimizing maximum likelihood estimator (MM-estimator). To
The main goal of this paper is to introduce and study a new concept, named d*-supplemented modules, which can be considered a generalization of W-supplemented modules and d-hollow modules. We also introduce the d*-supplement submodule. Many properties of d*-supplemented modules are studied. In particular, we give characterizations of d*-supplemented modules and the relationship between this kind of module and other kinds; for example, every d-hollow (d-local) module is d*-supplemented, and an example shows that the converse is not true.
The research aims to identify the importance of applying resource consumption accounting in the Iraqi industrial environment in general, and the oil industry in particular, and its role in reducing the costs of activities by excluding and isolating idle-capacity costs, as the research problem is that the company faces deficiencies and challenges in applying strategic cost tools. The research was based on the hypothesis that applying resource consumption accounting will provide appropriate information for the company through proper cost allocation and thereby reduce the costs of activities. To prove the hypothesis of the research, the Light Derivatives Authority - Al-Dora Refin
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. The bitmap of an image is chosen for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
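The AMBTC step and the bitmap-replacement VQ step described above can be sketched as follows. This is an illustrative sketch under assumed conventions (mean thresholding, Hamming distance as the per-bitmap replacement error), not the paper's exact algorithm.

```python
import numpy as np

def ambtc_block(block):
    """AMBTC for one image block: threshold at the block mean to get a
    binary bitmap, plus the means of the high and low pixel groups."""
    mean = block.mean()
    bitmap = block >= mean
    hi = block[bitmap].mean() if bitmap.any() else mean
    lo = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, hi, lo

def ambtc_decode(bitmap, hi, lo):
    """Reconstruct the block: high level where the bitmap is 1, else low."""
    return np.where(bitmap, hi, lo)

def nearest_codeword(bitmap, codebook):
    """VQ step (sketch): replace the bitmap with the index of the closest
    binary codeword; distance = number of differing bits, the quantity
    the ABPRE criterion averages over all blocks."""
    flat = bitmap.ravel()
    dists = [(np.count_nonzero(flat != c), i) for i, c in enumerate(codebook)]
    _, best = min(dists)
    return best

# Toy 4x4 block: left half dark, right half bright.
block = np.array([[0, 0, 255, 255]] * 4, dtype=float)
bitmap, hi, lo = ambtc_block(block)
```

After VQ, only a codebook index is stored per block instead of the full bitmap, which is where the bit-rate reduction claimed above comes from.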