Within the framework of big data, energy issues are highly significant. Despite this significance, theoretical studies that focus on energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms consumes a very large amount of energy, especially during the training phase. The transmission of big data between service providers, users and data centres emits carbon dioxide as a result of high power consumption. This chapter proposes a theoretical framework for big data analytics using computational intelligence algorithms that has the potential to reduce energy consumption and enhance performance. We suggest that researchers focus more attention on the issue of energy within big data analytics in relation to computational intelligence algorithms before it becomes a widespread and urgent problem.
A theoretical analysis of mixing in the secondary combustion chamber of a ramjet is presented. Theoretical investigations were initiated to gain insight into the flow field of the mixing zone of the ramjet combustor, and a computer program to calculate axisymmetric, reacting and inert flows was developed. The mathematical model of the mixing zone of the ramjet comprises differential equations for continuity, momentum, stagnation enthalpy, concentration, turbulence energy and its dissipation rate. The simultaneous solution of these equations by means of a finite-difference algorithm yields the values of the variables at all internal grid nodes.
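The abstract does not reproduce the governing equations, but models of this kind are commonly cast in a single generic transport form. As an illustrative sketch (the symbols below are standard notation, not taken from the original), for steady axisymmetric flow each conserved variable $\phi$ obeys:

$$
\frac{\partial}{\partial x}\left(\rho u \phi\right) + \frac{1}{r}\frac{\partial}{\partial r}\left(r \rho v \phi\right) = \frac{\partial}{\partial x}\left(\Gamma_{\phi} \frac{\partial \phi}{\partial x}\right) + \frac{1}{r}\frac{\partial}{\partial r}\left(r \Gamma_{\phi} \frac{\partial \phi}{\partial r}\right) + S_{\phi}
$$

where $\phi$ stands in turn for 1 (continuity), the velocity components (momentum), stagnation enthalpy, species concentration, turbulence energy $k$ and its dissipation rate $\varepsilon$, with $\Gamma_{\phi}$ an effective exchange coefficient and $S_{\phi}$ the source term for each variable.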
The results showed that increasing air mass flow (0.32 to 0.64 kg/s) increases the development o
This work deals with the Kumaraswamy distribution. Kumaraswamy (1976, 1978) worked with well-known probability distribution functions such as the normal, beta and log-normal, but in 1980 Kumaraswamy developed a more general probability density function for doubly bounded random processes, which is known as Kumaraswamy's distribution. Classical maximum likelihood and Bayes estimators are used to estimate the unknown shape parameter (b). Reliability functions are obtained under symmetric loss functions using three types of informative priors: two single priors and one double prior. In addition, a comparison is made of the performance of these estimators with respect to the numerical solutions, which are found using an expansion method. The
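The Kumaraswamy density on (0, 1) and the closed-form maximum likelihood estimator of the shape parameter b (with the other shape parameter a taken as known) can be sketched as follows; this is an illustrative sketch, not the paper's own code, and the function names are ours:

```python
import math
import random

def kumaraswamy_pdf(x, a, b):
    """Kumaraswamy pdf on (0, 1): f(x) = a*b*x^(a-1)*(1 - x^a)^(b-1)."""
    return a * b * x ** (a - 1) * (1 - x ** a) ** (b - 1)

def kumaraswamy_sample(n, a, b, rng=random):
    """Inverse-CDF sampling: F^{-1}(u) = (1 - (1 - u)^(1/b))^(1/a)."""
    return [(1 - (1 - rng.random()) ** (1 / b)) ** (1 / a) for _ in range(n)]

def mle_b(xs, a):
    """Closed-form MLE of b for known a: b_hat = -n / sum(ln(1 - x^a))."""
    return -len(xs) / sum(math.log(1 - x ** a) for x in xs)

random.seed(0)
data = kumaraswamy_sample(10000, a=2.0, b=3.0)
print(round(mle_b(data, a=2.0), 2))  # close to the true b = 3.0
```

Setting the derivative of the log-likelihood with respect to b to zero gives the closed form above, which is why only b (not a) admits a direct MLE here.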
Dengke Naniura is a traditional food from Sumatera Utara, Indonesia, that is produced through a fermentation process, and this food is believed to contain high levels of probiotics. The objective of the current research is to determine the probiotic potential of LAB obtained from Dengke Naniura. Dengke Naniura was traditionally prepared from Cyprinus carpio. Four LAB isolates were successfully obtained from Dengke Naniura: D7DA3, D7B3, D7DBF and D7DN3. All four isolates were identified as Lactobacillus sp. This result was confirmed by the isolates being non-spore-forming, non-motile and Gram-positive, and was further supported by biochemical tests, for example the Voges-Proskauer test, the catalase test and the Methyl
In data transmission, a change in a single bit of the received data may lead to misunderstanding or a disaster. Each bit in the sent information has high priority, especially with information such as the address of the receiver. The ability to detect every single-bit change is a key issue in the field of data transmission.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
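The behaviour described above, single parity catching an odd number of flipped bits but missing an even number, can be demonstrated with a minimal sketch (illustrative code, not from the original):

```python
def parity_bit(bits):
    """Even-parity bit: 1 if the data word has an odd number of 1s."""
    return sum(bits) % 2

def check(word_with_parity):
    """Word is valid iff the total number of 1s (data + parity) is even."""
    return sum(word_with_parity) % 2 == 0

data = [1, 0, 1, 1, 0, 0, 1]
word = data + [parity_bit(data)]
assert check(word)                        # clean word passes

one_flip = word.copy()
one_flip[2] ^= 1
assert not check(one_flip)                # single (odd) bit error detected

two_flips = word.copy()
two_flips[0] ^= 1
two_flips[1] ^= 1
assert check(two_flips)                   # two (even) bit errors slip through
```

The last assertion is exactly the weakness the abstract points out: an even number of flips restores the overall parity, so the error goes unnoticed.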
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
The study aimed at identifying the strategic gaps in the actual reality of the management of the public organizations investigated, in order to determine the strategy used based on the study model. The study relied on the variable of the general organization strategy in its dimensions (the general organization strategy, the organization's political strategy and the organization's defense strategy). The sample of the study comprised the General Directorate of Traffic, the Civil Status Directorate and the Civil Defense Directorate, formations affiliated to the Ministry of the Interior, chosen for the importance of the activity carried out by these public organizations. In order to translate the answers into a quantitative expression in the analysi
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler, faster and use less space. This data structure conceptually uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list: because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The implementation of the search, insert and delete operations takes a time of up to . The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time.
Keywords: skip list, parallel linked list, randomized algorithm, rank.
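As an illustration of the parallel-sorted-linked-list idea, here is a minimal skip list with randomized node heights supporting search and insert (a sketch under our own naming, not the paper's implementation; MAX_LEVEL and the promotion probability 0.5 are assumptions):

```python
import random

MAX_LEVEL = 8  # assumed cap on tower height

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one next-pointer per level

class SkipList:
    def __init__(self):
        self.head = Node(None, MAX_LEVEL)    # sentinel head spans all levels
        self.level = 0

    def _random_level(self):
        # Promote with probability 0.5, giving expected O(log n) height.
        lvl = 0
        while random.random() < 0.5 and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):  # scan right, then drop a level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):  # record rightmost node per level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):             # splice the new tower in
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [3, 7, 1, 9, 5]:
    sl.insert(k)
print(sl.search(7), sl.search(4))  # True False
```

Each search starts at the top level and drops down whenever the next key would overshoot, which is what gives the expected logarithmic time regardless of which levels the coin flips assigned.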
The research aims to examine the effect of integration between the resource consumption accounting (RCA) system and enterprise resource planning (ERP) on both cost reduction and quality improvement. The study questionnaire was distributed to two different groups of respondents as the unit of analysis. The research reached various conclusions, the most important of which is that the integration relationship can help solve particular difficulties in managing the economic unit's data. Moreover, the integration provides a clear picture of the causal relationships between resources, resource quantities and associated costs.
Biodiesel production from microalgae depends on biomass and lipid production. Both biomass and lipid accumulation are controlled by several factors. The effects of various culture media (BG11, BBM, and urea), nutrient stress [nitrogen (N), phosphorus (P), magnesium (Mg) and carbonate (CO3)] and gamma (γ) radiation on the growth and lipid accumulation of Dictyochloropsis splendida were investigated. The highest biomass and lipid yields of D. splendida were achieved on BG11 medium. Cultivation of D. splendida in a medium containing 3000 mg L−1 N, 160 mg L−1 P, 113 mg L−1 Mg, or 20 mg L−1 CO3 led to an enhanced growth rate. While u