In many oil-recovery systems, relative permeability (kr) is an essential flow factor that governs fluid distribution and output from petroleum reservoirs. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impractical to make kr measurements in all of them. The unsteady-state approach was used to calculate the relative permeability of five carbonate core plugs from the Mishrif formation of WQ1. Relative permeability was calculated using the Johnson, Bossler and Naumann (JBN) correlation, one of the unsteady-state methods, and the results indicate that the core plugs are water-wet. A normalization approach was used to remove the effect of irreducible water and residual saturations, which vary from one environment to another. Based on their own irreducible water and trapped saturations, the relative permeabilities can subsequently be de-normalized and assigned to distinct sections (rock types) of the reservoir. The goal of this research is to normalize the relative permeability that was determined through water flooding.
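The normalization and de-normalization steps described above can be sketched as follows. This is a minimal illustration of the standard end-point scaling scheme, not the paper's exact procedure; the variable names and sample curve values are hypothetical.

```python
def normalize_kr(sw, krw, kro, swi, sor):
    """Normalize water/oil relative permeability curves to remove the
    effect of end-point saturations (swi = irreducible water saturation,
    sor = residual oil saturation). Curves are assumed ordered from
    sw = swi (first point) to sw = 1 - sor (last point)."""
    sw_star = [(s - swi) / (1.0 - swi - sor) for s in sw]  # normalized saturation
    krw_star = [k / krw[-1] for k in krw]  # scale by krw end point at residual oil
    kro_star = [k / kro[0] for k in kro]   # scale by kro end point at irreducible water
    return sw_star, krw_star, kro_star

def denormalize_sw(sw_star, swi, sor):
    """Map normalized saturations back to a given rock type's own end points."""
    return [swi + s * (1.0 - swi - sor) for s in sw_star]
```

With illustrative end points swi = 0.25 and sor = 0.30, the normalized saturation runs from 0 to 1, and de-normalizing with a different rock type's end points assigns the averaged curve back to that section of the reservoir.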
Malware represents one of the most dangerous threats to computer security, and dynamic analysis has difficulty detecting unknown malware. This paper develops an integrated multi-layer detection approach to provide more accurate malware detection. A user interface integrated with VirusTotal was designed as the first layer, acting as a warning system for malware infection; a malware database of malware samples served as the second layer, Cuckoo as the third layer, BullGuard as the fourth layer, and IDA Pro as the fifth layer. The results showed that using all five layers outperformed any single detector used alone. For example, the efficiency of the proposed approach is 100% compared with 18% and 63% for VirusTotal and Bel…
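The layered architecture described in this abstract can be sketched as a chain of detectors whose verdicts are combined, so a sample missed by one layer can still be caught by another. This is an illustrative pattern only; the layer functions and toy signatures below are hypothetical stand-ins, not the paper's detectors.

```python
from typing import Callable, List

def layer_signature_db(sample: bytes) -> bool:
    """Toy stand-in for a signature-database layer."""
    known_signatures = [b"\x4d\x5a\x90\x00"]  # illustrative byte signature
    return any(sig in sample for sig in known_signatures)

def layer_heuristic(sample: bytes) -> bool:
    """Toy stand-in for a behavioral/heuristic layer."""
    return b"malicious" in sample

def multilayer_detect(sample: bytes,
                      layers: List[Callable[[bytes], bool]]) -> bool:
    """Flag the sample if ANY layer flags it; stacking layers raises
    recall over any single detector, which is the idea the abstract reports."""
    return any(layer(sample) for layer in layers)
```

In the real system each layer would wrap an external tool (e.g. a sandbox run or a scanner query) rather than a byte check.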
The global food supply heavily depends on fertilizers to meet production goals. The adverse environmental impacts of traditional fertilization practices have necessitated the exploration of new alternatives in the form of smart fertilizer technologies (SFTs). This review categorizes SFTs into slow- and controlled-release fertilizers (SCRFs), nanofertilizers, and biological fertilizers, and describes their operational principles. It examines the environmental implications of conventional fertilizers and outlines the attributes of SFTs that effectively address these concerns. The findings demonstrate a pronounced environmental advantage of SFTs, including enhanced crop yields, minimized nutrient loss, improved nut…
Scheduling course timetables for large university departments is a very hard problem that many previous works have addressed with only partially optimal results. This work applies an evolutionary algorithm, using genetic operators, to solve the timetabling problem and produce a randomized, fully optimized timetable, with the ability to generate multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the constraint space to obtain an optimal and flexible schedule with no redundancy through changes to a viable course timetable. The main contribution of this work is increasing the flexibility of generating opti…
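The genetic approach outlined above can be illustrated with a minimal sketch: a timetable is a list of slot assignments, fitness counts hard-constraint violations, and selection, crossover, and mutation drive the population toward a clash-free schedule. The course list, lecturer map, and GA parameters below are hypothetical, not taken from the paper.

```python
import random

random.seed(1)

# Illustrative instance: assign 6 courses to 4 timeslots so that courses
# taught by the same lecturer never share a slot.
COURSES = ["C1", "C2", "C3", "C4", "C5", "C6"]
LECTURER = {"C1": "A", "C2": "A", "C3": "B", "C4": "B", "C5": "C", "C6": "C"}
SLOTS = 4

def violations(tt):
    """Count hard-constraint violations: same lecturer in the same slot."""
    v = 0
    for i in range(len(COURSES)):
        for j in range(i + 1, len(COURSES)):
            if tt[i] == tt[j] and LECTURER[COURSES[i]] == LECTURER[COURSES[j]]:
                v += 1
    return v

def evolve(pop_size=30, gens=100, pm=0.2):
    """Simple generational GA: elitist selection, one-point crossover,
    per-child point mutation with probability pm."""
    pop = [[random.randrange(SLOTS) for _ in COURSES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=violations)
        if violations(pop[0]) == 0:          # clash-free timetable found
            break
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, len(COURSES))
            child = p1[:cut] + p2[cut:]       # one-point crossover
            if random.random() < pm:          # point mutation
                child[random.randrange(len(COURSES))] = random.randrange(SLOTS)
            children.append(child)
        pop = elite + children
    return min(pop, key=violations)
```

Generating several independent runs yields the multiple candidate timetables per stage that the abstract mentions.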
The Internet of Things (IoT) is an information network that connects gadgets and sensors to enable new autonomous tasks, and the Industrial Internet of Things (IIoT) refers to the integration of IoT with industrial applications. Some vital infrastructures, such as water delivery networks, use IIoT. With the rapid expansion of the IIoT, its scattered topology and the resource limits of edge computing pose new difficulties for traditional data storage, transport, and security protection. In this paper, a recovery mechanism for edge network failures is proposed that considers repair cost and computational demands. The NP-hard problem was divided into interdependent major and minor problems that could be solved in polynomial t…
Sansevieria trifasciata was studied as a potential biosorbent for chromium, copper, and nickel removal from electroplating and tannery effluents in a batch process. Parameters influencing the biosorption process, such as pH, contact time, and amount of biosorbent, were optimized using 80 mm sized particles of the biosorbent. As much as 91.3% Ni and 92.7% Cu were removed at pH 6 and 4.5 respectively, while optimum Cr removal of 91.34% from electroplating and 94.6% from tannery effluents was found at pH 6.0 and 4.0 respectively. The pseudo-second-order model best fit the kinetic data for all the metals, as evidenced by its higher R2 values. FTIR characterization of the biosorbent revealed the presence of carboxyl a…
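The pseudo-second-order fit mentioned above uses the standard linearized form t/qt = 1/(k2·qe²) + t/qe, where qe is the equilibrium uptake and k2 the rate constant; regressing t/qt against t gives qe from the slope and k2 from the intercept. A minimal least-squares sketch (with synthetic, illustrative data, not the paper's measurements):

```python
def fit_pseudo_second_order(t, qt):
    """Fit the linearized pseudo-second-order kinetic model
        t/qt = 1/(k2 * qe**2) + t/qe
    by ordinary least squares on (t, t/qt). Returns (qe, k2)."""
    y = [ti / qi for ti, qi in zip(t, qt)]
    n = len(t)
    mt = sum(t) / n
    my = sum(y) / n
    slope = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
             / sum((ti - mt) ** 2 for ti in t))
    intercept = my - slope * mt
    qe = 1.0 / slope                 # slope = 1/qe
    k2 = slope ** 2 / intercept      # intercept = 1/(k2 * qe**2)
    return qe, k2
```

A high R² for this linear fit, compared with the pseudo-first-order alternative, is the usual evidence the abstract cites for model selection.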
Multilocus haplotype analysis of candidate variants with genome-wide association study (GWAS) data may provide evidence of association with disease even when the individual loci themselves do not. Unfortunately, when a large number of candidate variants are investigated, identifying risk haplotypes can be very difficult. To meet this challenge, a number of approaches have been put forward in recent years. However, most of them are not directly linked to the disease penetrances of haplotypes and thus may not be efficient. To fill this gap, we propose a mixture model-based approach for detecting risk haplotypes. Under the mixture model, haplotypes are clustered directly according to their estimated d…
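The core idea of clustering haplotypes by their estimated penetrances can be illustrated with a deliberately simplified stand-in: a two-group 1-D k-means split separating low-risk from high-risk haplotypes. The paper's actual method is a mixture model, not k-means, and the penetrance values below are invented for illustration.

```python
def cluster_penetrances(p, iters=50):
    """Split haplotypes into low-risk (0) and high-risk (1) groups by
    two-means clustering on their estimated penetrances. A simplified
    stand-in for mixture-model clustering."""
    lo, hi = min(p), max(p)                      # initialize the two centers
    groups = [0] * len(p)
    for _ in range(iters):
        groups = [0 if abs(x - lo) <= abs(x - hi) else 1 for x in p]
        lo_members = [x for x, g in zip(p, groups) if g == 0]
        hi_members = [x for x, g in zip(p, groups) if g == 1]
        if lo_members:
            lo = sum(lo_members) / len(lo_members)
        if hi_members:
            hi = sum(hi_members) / len(hi_members)
    return groups
```

Haplotypes landing in the high-penetrance group are the candidate risk haplotypes.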
Software-Defined Networking (SDN) has transformed network management by detaching the control plane from the data-forwarding plane, resulting in unparalleled flexibility and efficiency in network administration. However, the heterogeneity of traffic in SDN makes it difficult to meet Quality of Service (QoS) demands and manage network resources efficiently. SDN traffic flows are often divided into elephant flows (EFs) and mice flows (MFs). EFs, distinguished by their large volumes and long durations, account for a small share of total flows but consume a disproportionate amount of network resources, causing congestion and delays for the smaller MFs. MFs, on the other hand, are short-lived and latency-sensitive, but they accou…
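The EF/MF split described above is commonly made with a first-cut byte-count threshold on observed flows. The cutoff value below is purely illustrative, not the paper's; production classifiers typically combine volume, duration, and rate.

```python
EF_BYTES_THRESHOLD = 100 * 1024  # 100 KiB - an illustrative cutoff, not from the paper

def classify_flow(bytes_transferred: int) -> str:
    """Label a flow as an elephant flow (EF) or mouse flow (MF)
    using a simple byte-count threshold."""
    return "EF" if bytes_transferred >= EF_BYTES_THRESHOLD else "MF"
```

An SDN controller would apply such a classifier to per-flow counters (e.g. from switch statistics) and route EFs and MFs differently to protect the latency-sensitive MFs.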