Transportability refers to the ease with which people, goods, or services can be transferred. When transportability is high, distance becomes less of a limitation on activity. Transportation networks are frequently represented as a set of locations and a set of links indicating the connections between those locations; this representation is usually called the network topology. Hence, each transportation network has a unique topology that distinguishes its structure, and the most essential components of such a framework are the network architecture and the level of connectivity. This research aims to assess the efficiency of the road network in the Al-Karrada area of Baghdad city. The analysis is based on a quantitative evaluation using graph theory …
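The excerpt stops at the graph-theoretic evaluation, but the connectivity indices conventionally used in such road-network studies (Kansky's alpha, beta, and gamma indices) are straightforward to compute. A minimal sketch, assuming the networkx library and a small hypothetical road graph (intersections as nodes, road segments as edges; not the paper's data):

```python
import networkx as nx

# Hypothetical road network: nodes are intersections, edges are road segments
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4), (4, 5)])

v, e = G.number_of_nodes(), G.number_of_edges()
beta = e / v                       # beta index: edges per node
gamma = e / (3 * (v - 2))          # gamma index: observed vs. max possible edges (planar)
alpha = (e - v + 1) / (2 * v - 5)  # alpha index: observed vs. max independent cycles (planar)

print(f"v={v} e={e} beta={beta:.2f} gamma={gamma:.2f} alpha={alpha:.2f}")
```

Gamma and alpha values close to 1 indicate a highly connected planar network, which is the usual sense in which such studies rate road-network "efficiency".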
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices that improve the quality of human life by collecting data from their environment. However, the resulting data are huge and require large storage and high computational capabilities, and cloud computing can provide the storage for such big data. The data of IoT devices are transferred using two protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes and avoid overloading any individual resource …
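The excerpt does not show the paper's specific balancing algorithm; as an illustration of dynamically distributing workload across nodes, here is a minimal least-loaded dispatcher sketch (node names and task costs are hypothetical):

```python
import heapq

class LeastLoadedBalancer:
    """Dispatch each incoming task to the node with the smallest current load."""
    def __init__(self, nodes):
        # Min-heap of (current_load, node_name) pairs
        self.heap = [(0, n) for n in nodes]
        heapq.heapify(self.heap)

    def dispatch(self, task_cost):
        load, node = heapq.heappop(self.heap)           # least-loaded node
        heapq.heappush(self.heap, (load + task_cost, node))
        return node

lb = LeastLoadedBalancer(["vm-1", "vm-2", "vm-3"])
for cost in [5, 3, 7, 2, 4]:
    print(cost, "->", lb.dispatch(cost))
```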
Future generations of wireless communication systems are expected to evolve toward massive ubiquitous connectivity and ultra-reliable low-latency communications (URLLC) with extremely high data rates. Massive multiple-input multiple-output (m-MIMO) is a crucial transmission technique for meeting the high data-rate demands of upcoming wireless systems. However, obtaining a downlink (DL) training sequence (TS) that is feasible for fast channel estimation, i.e., that meets the low-latency requirements of future wireless systems, is very challenging in m-MIMO with frequency-division duplex (FDD) when users have different channel correlations. Therefore, a low-complexity solution for …
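The text is cut off before the proposed solution, but the underlying difficulty can be stated with the standard least-squares (LS) training model (notation assumed here, not taken from the paper). With an $M$-antenna base station sending a $\tau \times M$ training matrix $\mathbf{X}$, user $u$ receives

$$\mathbf{y}_u = \mathbf{X}\,\mathbf{h}_u + \mathbf{n}_u, \qquad \hat{\mathbf{h}}_u^{\mathrm{LS}} = \left(\mathbf{X}^{H}\mathbf{X}\right)^{-1}\mathbf{X}^{H}\mathbf{y}_u,$$

which requires $\tau \ge M$ for $\mathbf{X}^{H}\mathbf{X}$ to be invertible. Since $M$ is large in m-MIMO and FDD offers no uplink-downlink reciprocity to exploit, the training overhead grows with the antenna count, which is why low-complexity, correlation-aware TS designs are sought.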
In this work, a fiber-optic biomedical sensor was manufactured to detect hemoglobin percentages in blood. SPR-based coreless optical fibers were developed and implemented using single and multiple optical fibers, and were used to calculate the refractive indices and hemoglobin concentrations of blood samples. A gold film 40 nm thick was deposited on the sensing area of the fiber to increase the sensitivity of the sensor. The optical fiber used in this work has a diameter of 125 μm, no core, and consists of a pure silica glass rod with an acrylate coating. The buffer was removed from a 4 cm length of the fiber and the splicing process was carried out. It was found in practice that when the refractive index at the sensing region …
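The abstract reports using the sensor to calculate refractive indices and hemoglobin concentrations; in SPR sensing this is typically done through a calibration curve relating the resonance wavelength to the refractive index. A minimal sketch with hypothetical calibration numbers (not from the paper):

```python
import numpy as np

# Hypothetical calibration data: resonance wavelength (nm) measured for
# reference solutions of known refractive index (values illustrative only).
ri  = np.array([1.333, 1.345, 1.355, 1.365])
lam = np.array([545.0, 562.0, 578.0, 595.0])

slope, intercept = np.polyfit(ri, lam, 1)   # linear SPR calibration fit
print(f"sensitivity ~ {slope:.0f} nm/RIU")  # S = d(lambda_res)/dn

# Invert the calibration to read an unknown sample's refractive index
lam_sample = 570.0
print("estimated n:", (lam_sample - intercept) / slope)
```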
This study was conducted to determine immunoglobulins and complement proteins quantitatively. The results revealed that the concentration of immunoglobulin M (IgM) increased significantly in the patient group compared with the control group, and the concentration of complement protein C4 likewise increased significantly in the patient group. Anti-Candida albicans IgG was detected using the ELISA technique; the results indicated that this antibody was found in 62.8% of the women infected with vulvovaginal candidiasis. The sensitivity and specificity of the test were 63% and 89%, respectively.
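For readers unfamiliar with the reported 63%/89% figures, sensitivity and specificity follow directly from a confusion matrix. A short sketch with hypothetical counts (the paper's raw counts are not given in this excerpt):

```python
# Hypothetical confusion-matrix counts (not from the paper), chosen only to
# illustrate how sensitivity and specificity are computed.
tp, fn = 63, 37   # test-positive / test-negative among truly infected
tn, fp = 89, 11   # test-negative / test-positive among non-infected

sensitivity = tp / (tp + fn)   # true-positive rate
specificity = tn / (tn + fp)   # true-negative rate
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```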
The question of estimation has attracted great interest in engineering, statistics, and various applied and human sciences, since the methods it provides help to characterize many random processes accurately.
In this paper, methods are used to estimate the reliability function, the hazard (risk) function, and the parameters of the distribution; the methods are the moment method and the maximum likelihood method. An experimental study was conducted using simulation to compare the methods and show which of them is most suitable in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with different sample sizes …
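The RL density is not given in this excerpt, so as an illustration of the comparison protocol, here is a sketch using the ordinary Rayleigh distribution, whose moment and maximum likelihood estimators have closed forms (the true scale, replication count, and sample sizes below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def mle_sigma(x):
    # Rayleigh MLE: sigma_hat^2 = sum(x^2) / (2n)
    return np.sqrt(np.mean(x**2) / 2.0)

def moment_sigma(x):
    # Moment estimator: E[X] = sigma * sqrt(pi/2)  =>  sigma_hat = mean(x) * sqrt(2/pi)
    return np.mean(x) * np.sqrt(2.0 / np.pi)

def reliability(t, sigma):
    # Rayleigh reliability (survival) function: R(t) = exp(-t^2 / (2 sigma^2))
    return np.exp(-t**2 / (2.0 * sigma**2))

sigma_true, reps = 1.5, 5000
for n in (10, 20, 30, 50):
    mse_mle = np.mean([(mle_sigma(rng.rayleigh(sigma_true, n)) - sigma_true)**2
                       for _ in range(reps)])
    mse_mom = np.mean([(moment_sigma(rng.rayleigh(sigma_true, n)) - sigma_true)**2
                       for _ in range(reps)])
    print(f"n={n:3d}  MSE(MLE)={mse_mle:.5f}  MSE(Moment)={mse_mom:.5f}")
```

The estimator with the smaller mean square error across sample sizes is judged the better method, which is exactly the criterion the abstract describes.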
This study deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr XII distribution when the other shape parameter is known (taking the values 0.5, 1, and 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results depend on an empirical study in which simulation experiments are applied to compare four estimation methods, as well as to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
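The paper's four estimators are not listed in this excerpt; as a sketch of the jackknife idea for Burr XII, assume the shape parameter c is known and k is estimated by maximum likelihood (which has a closed form), then bias-corrected by leave-one-out jackknifing:

```python
import numpy as np

rng = np.random.default_rng(1)

def mle_k(x, c):
    # MLE of the Burr XII shape k with c known: k_hat = n / sum(log(1 + x^c))
    return len(x) / np.sum(np.log1p(x**c))

def jackknife_k(x, c):
    # Jackknife bias correction built from leave-one-out MLEs
    n = len(x)
    loo = np.array([mle_k(np.delete(x, i), c) for i in range(n)])
    return n * mle_k(x, c) - (n - 1) * loo.mean()

def reliability(t, c, k):
    # Burr XII reliability function: R(t) = (1 + t^c)^(-k)
    return (1.0 + t**c) ** (-k)

# Generate Burr XII data by inverse transform: X = ((1 - U)^(-1/k) - 1)^(1/c)
c, k, n = 0.5, 1.0, 30
u = rng.uniform(size=n)
x = ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)
print("MLE k:", mle_k(x, c), " jackknife k:", jackknife_k(x, c))
print("R(1):", reliability(1.0, c, jackknife_k(x, c)))
```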
Each phenomenon involves several variables. By studying these variables we can find a mathematical formula for the joint distribution, and the copula is a useful tool for measuring the amount of correlation. Here the survival function was used to measure the relationship between age and the creatinine level remaining in a person's blood. The SPSS program was used to extract the influential variables from a group of variables using factor analysis; then the Clayton copula function, which builds joint bivariate distributions from the marginal distributions, was applied: the bivariate distribution was calculated, and then the survival function value was computed for a sample of size 50 drawn from Yarmouk Ho…
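As a sketch of the Clayton construction described above: with marginal CDFs Fx and Fy, the Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0, gives the joint CDF C(Fx, Fy), and the joint survival function follows by inclusion-exclusion. A minimal sketch with assumed exponential marginals and theta = 2 (all numbers hypothetical, not the paper's data):

```python
import numpy as np

def clayton(u, v, theta):
    # Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0
    return (u**-theta + v**-theta - 1.0) ** (-1.0 / theta)

def joint_survival(Fx, Fy, theta):
    # P(X > x, Y > y) = 1 - Fx - Fy + C(Fx, Fy)  (inclusion-exclusion)
    return 1.0 - Fx - Fy + clayton(Fx, Fy, theta)

# Hypothetical observation: age (years) and blood creatinine level (mg/dL)
age, creat = 60.0, 1.2
Fx = 1.0 - np.exp(-age / 70.0)   # assumed exponential marginal CDF of age
Fy = 1.0 - np.exp(-creat / 1.0)  # assumed exponential marginal CDF of creatinine
theta = 2.0                      # dependence parameter; Kendall tau = theta/(theta+2) = 0.5

print("joint CDF:     ", clayton(Fx, Fy, theta))
print("joint survival:", joint_survival(Fx, Fy, theta))
```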