Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigating this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on applying sigmoid fish swarm optimization (SiFSO) for early compromised-device detection and subsequently alerting other network nodes. Additionally, our data center implements an innovative ant skyscape architecture (ASA) cooling mechanism, departing from traditional, unsustainable cooling strategies that harm the environment. To validate the effectiveness of these approaches, extensive simulations were conducted. The evaluations primarily revolved around the fish colony's ability to detect compromised devices, focusing on source tracing, realistic modelling, and a 98% detection accuracy rate under the ASA cooling solution, with a 0.16 ºC temperature change within 1,300 seconds. Compromised devices pose a substantial risk to green data centers, as attackers could manipulate and disrupt network equipment. Therefore, incorporating cyber enhancements into the green data center concept is imperative to foster more adaptable and efficient smart networks.
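As a rough illustration of the colony idea, the following is a minimal sketch of a sigmoid-gated artificial fish swarm optimizer in Python. The fitness function, the parameters, and the way the sigmoid scales each move are illustrative assumptions, not the paper's exact SiFSO formulation.

```python
# Minimal sketch of a sigmoid-gated fish swarm optimizer (hypothetical
# stand-in for the paper's SiFSO). It maximizes a toy "anomaly score"
# over candidate device profiles in a normalized feature space.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def anomaly_score(pos):
    # Hypothetical fitness: peaks near the (unknown) compromised profile.
    target = np.array([0.8, 0.2])
    return -np.sum((pos - target) ** 2)

def sifso(n_fish=30, dim=2, visual=0.5, step=0.3, iters=200):
    fish = rng.random((n_fish, dim))                 # random initial positions
    best = max(fish, key=anomaly_score).copy()
    for _ in range(iters):
        for i in range(n_fish):
            # "Prey": probe a random point within the visual range.
            probe = fish[i] + visual * rng.uniform(-1, 1, dim)
            if anomaly_score(probe) > anomaly_score(fish[i]):
                direction = probe - fish[i]
            else:
                # "Follow": drift toward the best fish found so far.
                direction = best - fish[i]
            # Sigmoid gate scales the step by how promising the move looks.
            gain = sigmoid(anomaly_score(fish[i] + direction)
                           - anomaly_score(fish[i]))
            fish[i] = np.clip(fish[i] + step * gain * direction, 0.0, 1.0)
            if anomaly_score(fish[i]) > anomaly_score(best):
                best = fish[i].copy()
    return best

print("most suspicious device profile:", sifso())
```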
Permeability data is of major importance and must be handled carefully in all reservoir simulation studies. Its importance increases in mature oil and gas fields because of its sensitivity to the requirements of specific improved recovery methods. However, the industry holds a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss, in loose and poorly consolidated formations, or in cas…
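The paper's own correlation is not reproduced here, but the classical Klinkenberg extrapolation illustrates the underlying idea of converting gas measurements into a liquid-equivalent permeability. The pressures and permeabilities below are hypothetical lab values, not data from the study.

```python
# Klinkenberg extrapolation: air permeability measured at several mean
# pressures is linear in 1/p_mean, and the intercept at infinite pressure
# is the liquid-equivalent (Klinkenberg) permeability.
import numpy as np

# Hypothetical lab data: mean pressure (atm) and measured air permeability (mD).
p_mean = np.array([1.0, 2.0, 4.0, 8.0])
k_air = np.array([12.4, 10.9, 10.1, 9.7])

# k_air = k_L * (1 + b / p_mean)  =>  slope = k_L * b, intercept = k_L.
slope, k_liquid = np.polyfit(1.0 / p_mean, k_air, 1)
b = slope / k_liquid  # gas slippage factor (atm)

print(f"liquid permeability: {k_liquid:.2f} mD, slippage b = {b:.2f} atm")
```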
The use of parametric models and their associated estimation methods requires that several primary conditions be met for those models to represent the population under study adequately. This has prompted researchers to search for more flexible alternatives, namely nonparametric models. Many researchers are interested in the survival (permanence) function and its estimation methods, one of which is nonparametric.
For the purpose of statistical inference about the parameters of the statistical distribution of lifetimes with censored data, the experimental section of this thesis compares nonparametric methods for estimating the survival (permanence) function, the existence…
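One widely used nonparametric estimator of the survival function under right censoring is the Kaplan–Meier product-limit estimator, sketched below with hypothetical lifetime data; it is offered as background on the class of methods discussed, not as the thesis's exact procedure.

```python
# Kaplan-Meier product-limit estimate of the survival function under
# right censoring, on hypothetical data.
import numpy as np

times    = np.array([3, 5, 5, 8, 10, 12, 15])   # observed times
observed = np.array([1, 1, 0, 1, 0, 1, 1])      # 1 = event, 0 = censored

surv = 1.0
for t in np.unique(times[observed == 1]):       # distinct event times
    at_risk = np.sum(times >= t)                # subjects still under study
    deaths = np.sum((times == t) & (observed == 1))
    surv *= 1.0 - deaths / at_risk              # product-limit update
    print(f"S({t}) = {surv:.3f}")
```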
A non-stationary series is a persistent problem for statistical analysis: as some theoretical work explains, the properties of regression analysis are lost when non-stationary series are used, yielding the slope of a spurious relationship between the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, adding dummy seasonal variables to remove the seasonal effect, converting the data to exponential or logarithmic form, or applying the difference operator d times, in which case the series is said to be integrated of order d. The theoretical side of the research is divided into parts; the first part covers the research methodology…
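A small synthetic example of the differencing step described above: a series with a linear trend is non-stationary, but a single difference removes the trend, so the original series is integrated of order 1.

```python
# A trending (non-stationary) series becomes stationary after one
# difference (d = 1); the data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(200)
y = 0.5 * t + rng.normal(0, 1, 200)   # linear trend + noise

dy = np.diff(y)                        # first difference, d = 1
print("half-means, raw series:      ",
      y[:100].mean().round(2), y[100:].mean().round(2))   # drift apart
print("half-means, differenced:     ",
      dy[:100].mean().round(2), dy[100:].mean().round(2)) # roughly equal
```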
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet is a problem that many parties seek to solve: why is it available there in such a huge, random quantity? Many forecasts suggested that in 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called field of data mining emerged as a…
In this paper we adopted local edge detection methods: the classical Prewitt operators and a modification of them are used to perform edge detection and are compared with the Sobel operators. The study shows that using the Prewitt operators…
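For concreteness, the classical Prewitt operators referred to above amount to convolving the image with a pair of 3×3 gradient kernels; the sketch below uses a synthetic image and SciPy for the convolution.

```python
# Prewitt edge detection via 2-D convolution on a synthetic image.
import numpy as np
from scipy.ndimage import convolve

prewitt_x = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]], dtype=float)
prewitt_y = prewitt_x.T

image = np.zeros((8, 8))
image[2:6, 2:6] = 1.0                  # bright square on dark background

gx = convolve(image, prewitt_x)        # horizontal gradient
gy = convolve(image, prewitt_y)        # vertical gradient
edges = np.hypot(gx, gy)               # gradient magnitude

print((edges > 1.0).astype(int))       # thresholded edge map
```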
Most Internet-tomography problems, such as shared congestion detection, depend on network measurements. Usually, such measurements are carried out at multiple locations inside the network and rely on local clocks. These clocks usually skew over time, making the measurements unsynchronized and thereby degrading the performance of most techniques. Recently, shared congestion detection has become an important issue in many networked computer applications such as multimedia streaming and
peer-to-peer file sharing. One of the most powerful techniques employed in the literature is based on the Discrete Wavelet Transform (DWT) with a cross-correlation operation to determine the state of congestion. The wavelet transform is used as a de-noisin…
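Although the abstract is truncated, the pipeline it describes, DWT de-noising followed by cross-correlation, can be sketched as follows; the wavelet choice, thresholding rule, and decision threshold of 0.5 are illustrative assumptions rather than values from the paper.

```python
# DWT de-noising of two one-way-delay sequences, then normalized
# cross-correlation as a shared-congestion indicator.
import numpy as np
import pywt

rng = np.random.default_rng(2)
congestion = np.cumsum(rng.normal(0, 1, 256))     # shared queueing trend
delays_a = congestion + rng.normal(0, 0.5, 256)   # path A measurements
delays_b = congestion + rng.normal(0, 0.5, 256)   # path B measurements

def dwt_denoise(x, wavelet="db4", level=3):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Soft-threshold the detail coefficients (universal threshold).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

a, b = dwt_denoise(delays_a), dwt_denoise(delays_b)
r = np.corrcoef(a, b)[0, 1]
print("shared congestion" if r > 0.5 else "independent", f"(r = {r:.2f})")
```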
With the high usage of computers and networks at the current time, the number of security threats has increased. The study of intrusion detection systems (IDS) has received much attention throughout the computer science field. The main objective of this study is to examine the existing literature on various approaches to intrusion detection. This paper presents an overview of different intrusion detection systems and a detailed analysis of multiple techniques for these systems, including their advantages and disadvantages. These techniques include artificial neural networks, bio-inspired computing, evolutionary techniques, machine learning, and pattern recognition.
Objective: We hypothesized that attacking cancer cells by combining various modes of action can hinder them from evolving resistance to treatment. Incorporating photodynamic therapy (PDT) into oncolytic virotherapy might be a promising dual approach to cancer treatment. Methods: In the present work, the NDV AMHA1 strain as virotherapy was examined in combination with aminolaevulinic acid (ALA) activated by a low-power He-Ne laser as PDT, against breast cancer cells derived from Iraqi cancer patients (the AMJ13 line). This combination was evaluated using Chou–Talalay analysis. Results: The results showed an increased killing rate when using both 0.01 and 0.1 multiplicity of infection (MOI) of the virus combined with a dose of 617…
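For reference, Chou–Talalay analysis scores a two-agent combination with the combination index

$$\mathrm{CI} = \frac{(D)_1}{(D_x)_1} + \frac{(D)_2}{(D_x)_2},$$

where $(D_x)_i$ is the dose of agent $i$ alone producing a given effect level $x$, and $(D)_i$ is its dose in the combination producing the same effect; $\mathrm{CI} < 1$ indicates synergism, $\mathrm{CI} = 1$ additivity, and $\mathrm{CI} > 1$ antagonism.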
Apium graveolens has been utilized for a multitude of purposes due to its diverse pharmacological characteristics. However, little is known about how the fatty acids (saturated and unsaturated), terpenes, and steroids found in Iraqi Apium graveolens affect human cancer cells. The purpose of this study was to examine the effects of an Iraqi Apium graveolens petroleum ether extract on the human prostate cancer cell line (PC3). Subsidiary extraction and phytochemical analysis by GC/MS were performed. The dry and fresh aerial parts (leaves and stems) of Apium graveolens were extracted using a Soxhlet apparatus with 70% ethanol, then fractionated with petroleum ether. A gas chromatography system was then used to identify the bioactive…