Ferritin is a key mediator of immune dysregulation, particularly under severe hyperferritinemia, through direct immune-suppressive and pro-inflammatory effects. We conclude that there is a significant association between ferritin levels and the severity of COVID-19. In this paper we introduce a semi-parametric method for prediction by combining a neural network with regression models. Two methodologies are therefore adopted in designing the model, a Neural Network (NN) and a regression model. The data were collected from Dar Al-Tamreed Private Hospital for the period 11/7/2021-23/7/2021 and cover 100 persons: of the 50 COVID-19 cases, 12 are female and 38 male, while the 50 non-COVID cases comprise 26 females and 24 males. The input variables of the NN model are the ferritin level and a gender variable. The highest precision was attained by multilayer perceptron (MLP) networks when the explanatory variables were used as inputs with one hidden layer containing 3 neurons (selected from the candidate architectures with more hidden layers) and one output; the fitted NN model was used in the training and validation stages alongside the actual data. A portion of the actual data was used to verify the behaviour of the developed models, and only one observation was predicted incorrectly. This means that the estimated model has significant parameters for forecasting the type of case (COVID or non-COVID).
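As a rough illustration of the network described above (one hidden layer of 3 neurons, ferritin and gender as inputs, COVID status as the output), the sketch below uses scikit-learn's MLPClassifier on randomly generated placeholder data; the variable names, label rule, and generated values are assumptions for illustration, not the study's data.

```python
# Minimal MLP sketch: one hidden layer of 3 neurons, inputs = ferritin + gender,
# output = COVID status. Data below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 100
ferritin = rng.normal(400, 250, n).clip(10, 2000)              # hypothetical ng/mL values
gender = rng.integers(0, 2, n)                                  # 0 = female, 1 = male
covid = (ferritin + rng.normal(0, 150, n) > 450).astype(int)    # assumed synthetic label rule

X = np.column_stack([ferritin, gender])
X_train, X_test, y_train, y_test = train_test_split(X, covid, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)
mlp = MLPClassifier(hidden_layer_sizes=(3,), max_iter=2000, random_state=0)
mlp.fit(scaler.transform(X_train), y_train)
print("held-out accuracy:", mlp.score(scaler.transform(X_test), y_test))
```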
Most dye pollutants in water are leftovers from industries, including textiles, wool, and others. There are many ways to remove dyes, such as sorption, oxidation, coagulation, filtration, and biodegradation; chlorination, ozonation, chemical precipitation, adsorption, electrochemical processes, membrane approaches, and biological treatment are among the most widely used technologies for removing dyes from wastewater. Dyes are divided into two types: natural dyes and synthetic dyes.
The bootstrap is an important re-sampling technique that has received researchers' attention recently. The presence of outliers in the original data set may cause serious problems for the classical bootstrap when the percentage of outliers in the bootstrap samples is higher than in the original data. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and the Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered. The accuracy criterion is based on the RMSE value, since the method that provides the smaller RMSE value is considered the more accurate.
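To make the comparison criteria concrete, the sketch below shows a classical (non-robust) bootstrap of a regression slope and how the bias, MSE, and RMSE of the bootstrap replicates can be computed; it does not implement DRBLTS or WBP, and the simulated data, sample size, and number of replicates are assumptions for illustration only.

```python
# Classical case-resampling bootstrap of a simple regression slope,
# with bias, MSE and RMSE of the bootstrap replicates. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1, n)         # assumed true slope = 1.5

def ols_slope(x, y):
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

slope_hat = ols_slope(x, y)
B = 2000
reps = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)                 # resample cases with replacement
    reps[b] = ols_slope(x[idx], y[idx])

bias = reps.mean() - slope_hat
mse = np.mean((reps - slope_hat) ** 2)
rmse = np.sqrt(mse)
print(f"bias={bias:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```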
This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good estimates of the parameters, so this problem must be dealt with directly. Two approaches were used to handle the high-dimensional data: the non-classical Sliced Inverse Regression (SIR) method together with the proposed Weighted Standard SIR (WSIR) method, and Principal Components Analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear dependence among the variables.
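For the classical baseline mentioned above, the snippet below sketches dimension reduction with PCA in scikit-learn; SIR and the proposed WSIR are not available there, so this only illustrates the linear-combination idea the abstract refers to, on randomly generated placeholder data.

```python
# PCA on a moderately high-dimensional design matrix: project the p original
# explanatory variables onto a few linear combinations (principal components).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n, p = 100, 40                       # hypothetical "many variables" setting
X = rng.normal(size=(n, p))

pca = PCA(n_components=5)            # keep 5 linear combinations of the 40 variables
Z = pca.fit_transform(X)
print(Z.shape)                                   # (100, 5)
print(pca.explained_variance_ratio_.round(3))    # share of variance per component
```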
This research is concerned with the re-analysis of optical data (the imaginary part of the dielectric function as a function of photon energy E) of a-Si:H films prepared by Jackson et al. and Ferlauto et al. Using nonlinear regression fitting, we estimated the optical energy gap and the deviation from the Tauc model by treating the parameter p, which describes the photon-energy dependence of the momentum matrix element, as a free parameter, assuming the density-of-states distribution to be a square-root function. It is observed for the films prepared by Jackson et al. that the value of the parameter p over the studied photon energy range is close to the value assumed by the Cody model, and the optical gap energy is also close to the value
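A hedged sketch of this kind of re-analysis is given below using scipy's curve_fit. The functional form eps2(E) = C*(E - Eg)^2 * E^(2(p-1)), the synthetic spectrum, and the starting values are assumptions made for illustration only, not the exact model or digitized data of the cited works.

```python
# Nonlinear least-squares fit of an assumed generalized Tauc/Cody-type form
#   eps2(E) = C * (E - Eg)**2 * E**(2*(p - 1))
# with the gap Eg and the exponent p treated as free parameters.
import numpy as np
from scipy.optimize import curve_fit

def eps2_model(E, C, Eg, p):
    x = np.clip(E - Eg, 0.0, None)          # eps2 = 0 below the gap
    return C * x**2 * E**(2.0 * (p - 1.0))

# Synthetic stand-in spectrum (NOT the Jackson et al. / Ferlauto et al. data).
E = np.linspace(1.6, 3.4, 60)
rng = np.random.default_rng(3)
eps2 = eps2_model(E, 30.0, 1.7, 1.0) + rng.normal(0, 0.2, E.size)

popt, pcov = curve_fit(eps2_model, E, eps2, p0=[10.0, 1.5, 0.5])
C_fit, Eg_fit, p_fit = popt
print(f"Eg = {Eg_fit:.3f} eV, p = {p_fit:.3f}")
```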
The most important issue that knowledge organizations in the 21st century try to face and solve successfully is determining the ways and processes through which they can measure and assess intellectual capital (IC). In spite of the importance of human capital in the knowledge organization, accounting as an information system does not give a great deal of consideration to human capital and does not treat investment in it as an asset; instead it shows it on the basis of salaries and payrolls, which appear in the financial statements as a revenue expenditure. As a result, the financial statements are not a true expression of the actual status, and some of the decisions taken under the present circumstances
In this paper, a third-order non-polynomial spline function is used to solve Volterra integral equations of the second kind. Numerical examples are presented to illustrate the application of this method and to compare the computed results with other known methods.
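The non-polynomial spline construction itself is not reproduced here, but as a point of reference the sketch below solves a second-kind Volterra equation u(x) = f(x) + integral from 0 to x of K(x,t) u(t) dt with the standard composite trapezoidal rule; the kernel, right-hand side, and exact solution are assumed test functions, and the method is a common baseline rather than the paper's spline scheme.

```python
# Second-kind Volterra equation  u(x) = f(x) + \int_0^x K(x,t) u(t) dt,
# solved with the composite trapezoidal rule (standard baseline method).
import numpy as np

def solve_volterra2(f, K, a, b, n):
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    u = np.empty(n + 1)
    u[0] = f(x[0])
    for i in range(1, n + 1):
        s = 0.5 * K(x[i], x[0]) * u[0]
        s += sum(K(x[i], x[j]) * u[j] for j in range(1, i))
        u[i] = (f(x[i]) + h * s) / (1.0 - 0.5 * h * K(x[i], x[i]))
    return x, u

# Assumed test problem: K(x,t) = 1, f(x) = 1, exact solution u(x) = exp(x).
x, u = solve_volterra2(lambda x: 1.0, lambda x, t: 1.0, 0.0, 1.0, 100)
print("max abs error:", np.max(np.abs(u - np.exp(x))))
```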
Segmentation is the process of partitioning digital images into different parts depending on texture, color, or intensity, and can be used in different fields to segment and isolate an area of interest. In this work, images of the Moon were obtained through observations at the Department of Astronomy and Space, College of Science, University of Baghdad (using telescopes and a CCD camera). Different segmentation methods were used to segment lunar craters. Craters are formed when celestial objects such as asteroids and meteorites crash into the surface of the Moon. Thousands of craters appear on the Moon's surface, ranging in size from a metre to many kilometres; they provide insights into the age and geology
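One simple way to isolate crater-like regions once an intensity cut-off is chosen is sketched below: threshold the frame and label the connected dark regions. The synthetic image, the crater positions, and the threshold are placeholders standing in for a real CCD frame, not the data used in this work.

```python
# Sketch: threshold a grayscale lunar frame and label connected dark regions
# (crater shadows), then report their pixel areas.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(4)
frame = rng.normal(0.7, 0.05, (200, 200))            # bright lunar surface (placeholder)
yy, xx = np.mgrid[0:200, 0:200]
for cy, cx, r in [(50, 60, 12), (140, 150, 20), (100, 40, 8)]:   # assumed crater shadows
    frame[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] = 0.2

dark = frame < 0.35                                   # assumed intensity cut-off
labels, n_regions = ndimage.label(dark)
sizes = ndimage.sum(dark, labels, index=range(1, n_regions + 1))
print(f"{n_regions} candidate craters, areas (px): {sizes.astype(int)}")
```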
Two unsupervised classifiers for optimum multithresholding are presented: fast Otsu and k-means. These non-parametric methods provide an efficient procedure for separating the regions (classes) by selecting optimum levels, either on the gray levels of the image histogram (as in the Otsu classifier) or on the gray levels of the image intensities (as in the k-means classifier); these levels represent the threshold values of the classes. To compare the experimental results of the two classifiers, the computation time is recorded, along with the number of iterations the k-means classifier needs to converge to the optimum class centers. The variation in the recorded computation time of the k-means classifier is discussed.
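A compact way to reproduce the two thresholding routes described above is sketched below: multi-level Otsu on the histogram via scikit-image, and k-means on the pixel intensities via scikit-learn, with thresholds taken as midpoints between class centres. The synthetic three-class test image and the number of classes are assumptions, not this paper's data or its fast-Otsu implementation.

```python
# Multi-level Otsu (histogram based) vs. k-means (intensity based) thresholds.
import numpy as np
from skimage.filters import threshold_multiotsu
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# Synthetic 3-class grayscale image instead of a real photograph.
img = np.concatenate([rng.normal(60, 10, 4000),
                      rng.normal(130, 10, 4000),
                      rng.normal(200, 10, 4000)]).clip(0, 255).astype(np.uint8)
img = img.reshape(120, 100)

otsu_thresholds = threshold_multiotsu(img, classes=3)          # two threshold levels

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(img.reshape(-1, 1))
centers = np.sort(km.cluster_centers_.ravel())
kmeans_thresholds = (centers[:-1] + centers[1:]) / 2           # midpoints between class centres

print("Otsu:", otsu_thresholds, " k-means:", kmeans_thresholds.round(1),
      " k-means iterations:", km.n_iter_)
```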
Optimizing system performance in dynamic and heterogeneous environments and the efficient management of computational tasks are crucial. This paper therefore examines task scheduling and resource allocation algorithms in some depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behaviour of these algorithms across different workloads was carried out. Results from the experiment
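As a small illustration of one of the compared algorithms, the sketch below uses simulated annealing to assign tasks to nodes so as to minimise the finish time (makespan); the task durations, node speeds, neighbourhood move, and cooling schedule are all assumed values for illustration, not the paper's experimental setup.

```python
# Simulated annealing for task-to-node assignment, minimising makespan (finish time).
import math
import random

random.seed(0)
n_tasks, n_nodes = 40, 5
durations = [random.uniform(1, 10) for _ in range(n_tasks)]    # assumed task costs
speeds = [random.uniform(0.5, 1.5) for _ in range(n_nodes)]    # assumed node speeds

def makespan(assign):
    load = [0.0] * n_nodes
    for task, node in enumerate(assign):
        load[node] += durations[task] / speeds[node]
    return max(load)

assign = [random.randrange(n_nodes) for _ in range(n_tasks)]
best_cost = makespan(assign)
T = 10.0
while T > 1e-3:
    cand = assign[:]
    cand[random.randrange(n_tasks)] = random.randrange(n_nodes)    # move one task
    delta = makespan(cand) - makespan(assign)
    if delta < 0 or random.random() < math.exp(-delta / T):
        assign = cand
        best_cost = min(best_cost, makespan(assign))
    T *= 0.995                                                     # geometric cooling
print(f"best finish time: {best_cost:.2f}")
```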