Optimization is essentially the art, science, and mathematics of choosing the best among a given set of finite or infinite alternatives. Though optimization is now an interdisciplinary subject cutting across the boundaries of mathematics, economics, engineering, the natural sciences, and many other fields of human endeavour, its roots lie in antiquity. In modern language, one such ancient problem reads as follows: among all closed curves of a given length, find the one that encloses the maximum area. This is the isoperimetric problem, now a standard topic in any course on the Calculus of Variations. Most problems of antiquity, however, came from geometry, and since there were no general methods to solve such problems, each was attacked by a very different approach.
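For reference, the problem stated above has a compact modern formulation as the isoperimetric inequality; a minimal LaTeX statement (the notation L for length and A for enclosed area is ours):

```latex
% Isoperimetric inequality: among all simple closed plane curves of
% length L, the circle encloses the maximum area A.
\[
  4\pi A \le L^{2},
\]
% with equality if and only if the curve is a circle, so the maximal
% area enclosable by a curve of length L is A = L^{2}/(4\pi).
```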
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical groundwork for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amount of energy
In this study, the Bessel function of the first kind was used to solve Kepler's equation for a satellite in an elliptical orbit. This classical method gives a direct series solution for the eccentric anomaly. The equation was solved over one period (M = 0-360°) with eccentricities e = 0-1 and numbers of terms N = 1-10, and the error in the first-kind Bessel-function representation was also calculated. The results indicated that for eccentricities of 0.1-0.4 and N = 1-10, the eccentric anomaly values agreed well with the exact solution. Moreover, the obtained eccentric anomaly values were unaffected by increasing the number of terms (N = 6-10) for eccentricities of 0.8 and 0.9. The Bessel
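A minimal sketch of the classical Bessel-series solution the abstract describes, checked against a Newton-Raphson reference; the function names and the sample mean anomaly are our choices, not the paper's:

```python
# Bessel-series solution of Kepler's equation E - e*sin(E) = M,
# compared against a Newton-Raphson reference solution.
import numpy as np
from scipy.special import jv  # Bessel function of the first kind, J_n

def kepler_bessel(M, e, N=10):
    """Eccentric anomaly via E = M + sum_{n=1..N} (2/n) J_n(n e) sin(n M)."""
    E = M
    for n in range(1, N + 1):
        E += (2.0 / n) * jv(n, n * e) * np.sin(n * M)
    return E

def kepler_newton(M, e, tol=1e-14):
    """Reference solution by Newton-Raphson iteration."""
    E = M if e < 0.8 else np.pi  # common starting choice for high e
    for _ in range(50):
        dE = (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

if __name__ == "__main__":
    M = np.radians(75.0)  # one sample mean anomaly in [0, 360) degrees
    for e in (0.1, 0.4, 0.9):
        approx = kepler_bessel(M, e, N=10)
        exact = kepler_newton(M, e)
        print(f"e={e}: Bessel E={approx:.10f}, Newton E={exact:.10f}, "
              f"error={abs(approx - exact):.2e}")
```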
In the present study, a new synthesis method is introduced for the decoration of platinum (Pt) on functionalized graphene nanoplatelets (GNP), and the preparation method for the corresponding nanofluids is also described. A uniform GNP–Pt nanocomposite was produced by a simple chemical procedure that included acid treatment to functionalize the GNP. Surface characterization was performed by techniques such as XRD, FESEM, and TEM. The effective thermal conductivity, density, viscosity, specific heat capacity, and stability of the functionalized GNP–Pt water-based nanofluids were investigated on different instruments. The GNP–Pt hybrid nanofluids were prepared by dispersing the nanocomposite in the base fluid without adding any surfactant
This research analyzed murder-crime data in Iraq in its temporal and spatial dimensions, then focused on building a new model, with an accompanying algorithm, that combines the characteristics of time series and spatial series so that it can predict more accurately than other models. We call it the Combined Regression (CR) model: it merges a time-series regression model with a spatial regression model into one model that can analyze data in both its temporal and spatial dimensions. Several models were used for comparison with the combined model, namely Multiple Linear Regression (MLR), Decision Tree Regression (DTR), and Random Forest Regression (RFR).
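The paper's exact CR formulation is not reproduced here; the sketch below illustrates one plausible reading, in which temporal lags of each region's series and a spatial lag built from a weight matrix enter a single linear regression. The weight matrix W, lag depth p, and synthetic data are our assumptions:

```python
# One plausible combined temporal/spatial regression: own past values
# (temporal lags) plus a weighted average of neighbouring regions at the
# previous step (spatial lag) as features of one linear model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, R = 120, 5                                    # time steps, regions
Y = rng.poisson(10, size=(T, R)).astype(float)   # synthetic crime counts

W = np.ones((R, R)) - np.eye(R)                  # hypothetical spatial weights
W /= W.sum(axis=1, keepdims=True)                # row-normalise

p = 2                                            # temporal lag depth (assumed)
X_rows, y_rows = [], []
for t in range(p, T):
    for r in range(R):
        temporal = Y[t - p:t, r]                 # region r's own past values
        spatial = W[r] @ Y[t - 1]                # neighbours' previous values
        X_rows.append(np.r_[temporal, spatial])
        y_rows.append(Y[t, r])

model = LinearRegression().fit(np.array(X_rows), np.array(y_rows))
print("R^2 on training data:", model.score(np.array(X_rows), np.array(y_rows)))
```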
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails in the presence of outliers in the studied phenomenon: the MLE loses its advantages because of the bad influence exerted by the outliers. To address this problem, new statistical methods have been developed that are not affected by outliers; such methods possess robustness, or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimates and the strength of the estimation, using the maximum weighted trimmed likelihood (MWTL). In order to perform t
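A minimal sketch of the maximum trimmed likelihood idea for a normal sample: per-observation negative log-likelihoods are sorted and only the h smallest enter the objective, so outliers drop out of the fit. The trimming fraction, optimizer, and data are our assumptions, not the paper's:

```python
# Maximum trimmed likelihood (MTL) for a normal mean and scale.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_trimmed_loglik(params, x, h):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                  # keeps sigma positive
    nll = -norm.logpdf(x, loc=mu, scale=sigma)
    return np.sort(nll)[:h].sum()              # keep the h best-fitting points

rng = np.random.default_rng(1)
x = np.r_[rng.normal(5.0, 1.0, 95), rng.normal(30.0, 1.0, 5)]  # 5% outliers

h = int(0.9 * len(x))                          # trim 10% (assumed fraction)
res = minimize(neg_trimmed_loglik, x0=[np.median(x), 0.0], args=(x, h),
               method="Nelder-Mead")
print("MTL estimates: mu=%.3f, sigma=%.3f" % (res.x[0], np.exp(res.x[1])))
print("Plain MLE mean (outlier-contaminated): %.3f" % x.mean())
```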
The objective of this review is to inspect the involvement of interleukin-6 (IL-6) in rheumatoid arthritis (RA) and to highlight the role of IL-6 and its variants in the pathogenesis of RA and in the response to anti-IL-6 agents. Several genetic and environmental risk factors and infectious agents contribute to the development of RA. Interleukin-6 is engaged in self-targeted immunity by modifying the equilibrium between regulatory T (T-reg) and T helper 17 (Th-17) cells. The evidence reported that IL-6 parti
In this paper, we propose two new predictor-corrector methods for solving Kepler's equation in the hyperbolic case using a quadrature formula, which plays an important and significant role in the evaluation of the integrals. The two procedures solve the hyperbolic orbit equation in two or three iterations, in a very efficient manner and to an accuracy that proves to be always better than 10⁻¹⁵. The solution is examined over ranges of the eccentricity e and the hyperbolic mean anomaly M with a fixed grid size, using two first guesses for the hyperbolic eccentric anomaly expressed in terms of e and M, where e is the eccentricity and M is the hyperbolic mean anomaly.
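The paper's quadrature-based predictor-corrector is not reproduced here; as a baseline sketch under our own assumptions, the following solves the hyperbolic Kepler equation e·sinh(H) − H = M with a predictor step (the standard starter guess H₀ = ln(2M/e + 1.8)) followed by Newton corrector iterations, stopping near the 10⁻¹⁵ accuracy the abstract cites:

```python
# Hyperbolic Kepler equation e*sinh(H) - H = M solved by a predictor
# (standard logarithmic starter guess) and Newton corrector steps.
import math

def hyperbolic_kepler(M, e, tol=1e-15, max_iter=20):
    """Hyperbolic eccentric anomaly H for mean anomaly M, eccentricity e > 1."""
    H = math.copysign(math.log(2.0 * abs(M) / e + 1.8), M)  # predictor
    for _ in range(max_iter):
        f = e * math.sinh(H) - H - M       # residual of Kepler's equation
        fp = e * math.cosh(H) - 1.0        # derivative with respect to H
        dH = f / fp                        # corrector (Newton step)
        H -= dH
        if abs(dH) < tol:
            break
    return H

if __name__ == "__main__":
    e, M = 1.5, 3.0                        # sample values, our choice
    H = hyperbolic_kepler(M, e)
    print(f"H = {H:.15f}, residual = {e*math.sinh(H) - H - M:.2e}")
```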
Enhanced-quality image fusion was proposed using new algorithms for auto-focus image fusion. The first algorithm combines the two images based on the standard deviation. The second algorithm concentrates on the contrast at edge points, with the correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each by 10 pixels within the same region; it examines the statistical properties of each block and decides the next step automatically. The resulting combined image is better in contrast
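A minimal sketch of the first algorithm's standard-deviation fusion rule as we read it: the two source images are split into fixed-size blocks and, per block, the sharper source (higher local standard deviation) is copied into the fused image. The block size and the synthetic test images are our assumptions:

```python
# Block-wise auto-focus fusion: for each block, keep the source image
# whose block has the larger standard deviation (a sharpness proxy).
import numpy as np

def fuse_by_std(img_a, img_b, block=16):
    """Fuse two equally sized grayscale images by per-block std comparison."""
    assert img_a.shape == img_b.shape
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            fused[y:y + block, x:x + block] = a if a.std() >= b.std() else b
    return fused

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = rng.random((128, 128))                      # "in focus" texture
    blur = (base + np.roll(base, 1, 0) + np.roll(base, 1, 1)) / 3.0
    left_sharp = np.where(np.arange(128)[None, :] < 64, base, blur)
    right_sharp = np.where(np.arange(128)[None, :] < 64, blur, base)
    fused = fuse_by_std(left_sharp, right_sharp)
    print("fused std:", fused.std(),
          "vs inputs:", left_sharp.std(), right_sharp.std())
```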