The Box-Wilson experimental design method was employed to optimize the removal efficiency of lead ions by the bulk liquid membrane (BLM) method. The optimization procedure was based on four independent relevant parameters: pH of the feed phase (4-6), pH of the stripping phase (9-11), carrier (TBP) concentration (5-10% v/v), and initial metal concentration (60-120 ppm). A maximum recovery efficiency of lead ions of 83.852% was achieved following thirty-one different experimental runs, as specified by a 2^4 Central Composite Design (CCD). The optimal values of the aforementioned four parameters, corresponding to the maximum recovery efficiency, were 5, 10, 7.5% (v/v), and 90 mg/l, respectively. The experimental data obtained were utilized to develop a semi-empirical model, based on a second-degree polynomial, to predict the recovery efficiency. The model was tested by ANOVA (Design-Expert® software) and found acceptable, with an R-squared of 0.9673. Response surface and contour plots were created using the developed model, which revealed the presence of high-recovery plateaus whose specifications will be useful in controlling future pilot- or industrial-scale units to ensure economic feasibility.
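As a sketch of the kind of model referred to above (the general second-order polynomial conventionally fitted to Central Composite Design data; the coefficient symbols are generic placeholders, not the authors' fitted values):

```latex
% General second-order response-surface model for k = 4 coded factors.
% Y   : predicted recovery efficiency (%)
% x_i : coded levels of feed pH, stripping pH, TBP concentration, initial metal concentration
% b_0, b_i, b_ii, b_ij : intercept, linear, quadratic, and interaction coefficients
\begin{equation}
Y \;=\; b_0 \;+\; \sum_{i=1}^{4} b_i x_i \;+\; \sum_{i=1}^{4} b_{ii} x_i^{2}
      \;+\; \sum_{i=1}^{3}\sum_{j=i+1}^{4} b_{ij} x_i x_j
\end{equation}
```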
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve a single-objective optimization problem. It first runs GSA, followed by BAT as the second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of becoming trapped in local optima: since the lack of a local search mechanism limits the intensification of the search while diversity remains high, the algorithm easily falls into a local optimum. The improvement is applied to the velocity (speed) equation of the original BAT, so that the speed of access to the best solution is increased. All solutions in the population are updated before the proposed algorithm terminates. The diversification f
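A minimal sketch of how such a two-stage hybrid might be wired together. The truncated abstract does not specify the exact role of the parameter p in [0, 1]; here it is treated, purely for illustration, as the fraction of iterations spent in the GSA phase before BAT takes over, and the objective function, bounds, and update formulas are simplified textbook versions rather than the article's method:

```python
import numpy as np

def sphere(x):
    """Toy single-objective function to minimise (an assumption, not from the article)."""
    return float(np.sum(x ** 2))

def hybrid_gsa_bat(obj, dim=10, pop=30, iters=200, p=0.5, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (pop, dim))     # candidate solutions
    V = np.zeros((pop, dim))                   # velocities
    fit = np.apply_along_axis(obj, 1, X)
    best = X[fit.argmin()].copy()
    switch = int(p * iters)                    # p = share of iterations given to GSA

    for t in range(iters):
        if t < switch:
            # GSA phase: masses from fitness, attraction toward better agents (diversification).
            worst, best_f = fit.max(), fit.min()
            m = (worst - fit + 1e-12) / (worst - best_f + 1e-12)
            M = m / m.sum()
            G = 100 * np.exp(-20 * t / iters)  # decaying gravitational constant
            acc = np.zeros_like(X)
            for i in range(pop):
                diff = X - X[i]
                dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
                acc[i] = np.sum(rng.random((pop, 1)) * G * M[:, None] * diff / dist, axis=0)
            V = rng.random((pop, dim)) * V + acc
        else:
            # BAT phase: frequency-tuned pull toward the current best solution (intensification).
            freq = rng.uniform(0.0, 2.0, (pop, 1))
            V = V + (X - best) * freq
        X = X + V
        fit = np.apply_along_axis(obj, 1, X)
        if fit.min() < obj(best):
            best = X[fit.argmin()].copy()
    return best, obj(best)

if __name__ == "__main__":
    _, value = hybrid_gsa_bat(sphere)
    print("best objective value found:", value)
```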
The unpredictable and huge data generated nowadays by smart computing devices (sensors, actuators, Wi-Fi routers) are difficult to handle and process in a real-time environment on a centralized cloud platform because of the cloud's limitations, issues, and challenges. To overcome these, Cisco introduced the fog computing paradigm as an alternative to cloud-based computing. This recent IT trend is taking the computing experience to the next level, and it is an advantageous extension of the centralized cloud computing technology. In this article, we have tried to highlight the various issues that cloud computing is currently facing. Here
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that are very difficult for vehicles to afford. Such tasks are often offloaded to more powerful entities, like cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud, and it supplies several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles have imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm named Vehicular Fog Computing (VFC), which provides low-latency services to mo
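A minimal, purely illustrative sketch of the kind of latency-driven offloading decision that VFC implies: pick the nearest tier (vehicle, fog, cloud) that can still meet a task's deadline. All class names, latency figures, and thresholds below are assumptions for illustration, not taken from the article:

```python
from dataclasses import dataclass

@dataclass
class Task:
    cpu_mcycles: float     # required computation (megacycles)
    deadline_ms: float     # maximum tolerable delay

@dataclass
class Node:
    name: str
    rtt_ms: float          # round-trip network delay to this tier
    mcycles_per_ms: float  # processing speed of this tier

def offload_target(task: Task, vehicle: Node, fog: Node, cloud: Node) -> str:
    """Choose the closest tier that can finish the task before its deadline."""
    for node in (vehicle, fog, cloud):          # prefer closer, lower-latency tiers
        total_ms = node.rtt_ms + task.cpu_mcycles / node.mcycles_per_ms
        if total_ms <= task.deadline_ms:
            return node.name
    return cloud.name                           # fall back to the cloud if no tier qualifies

if __name__ == "__main__":
    t = Task(cpu_mcycles=500, deadline_ms=50)
    v = Node("vehicle", rtt_ms=0, mcycles_per_ms=5)
    f = Node("fog", rtt_ms=5, mcycles_per_ms=50)
    c = Node("cloud", rtt_ms=80, mcycles_per_ms=500)
    print(offload_target(t, v, f, c))           # -> "fog" with these placeholder numbers
```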
Identification of complex communities in biological networks is a critical and ongoing challenge, since many network-related problems correspond to the subgraph isomorphism problem, which is known in the literature to be NP-hard. Several optimization algorithms have been dedicated and applied to solve this problem. The main challenge regarding the application of optimization algorithms, specifically when handling large-scale complex networks, is their relatively long execution time. Thus, this paper proposes a parallel extension of the PSO algorithm to detect communities in complex biological networks. The main contribution of this study is threefold. Firstly, a modified PSO algorithm with a local search operator is proposed
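A minimal sketch of the general idea of parallelising PSO for community detection: particles are label vectors (node -> community), the fitness is Newman modularity, and fitness evaluation of the whole swarm is spread across worker processes. The discrete update rule and parameters below are simplified placeholders, and the paper's local search operator is not reproduced here:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from functools import partial

def modularity(labels, A):
    """Newman modularity of a partition encoded as a label vector."""
    m2 = A.sum()                                 # 2m for an undirected adjacency matrix
    k = A.sum(axis=1)
    same = labels[:, None] == labels[None, :]
    return ((A - np.outer(k, k) / m2) * same).sum() / m2

def move(particle, pbest, gbest, rng, w=0.4, c1=0.3, c2=0.3):
    """Discrete 'velocity' step: each node keeps its label, or copies it from
    the personal best or the global best, with the given probabilities."""
    r = rng.random(particle.size)
    new = particle.copy()
    new[r < c1 + c2] = pbest[r < c1 + c2]
    new[r < c2] = gbest[r < c2]
    mutate = rng.random(particle.size) < (1 - w) * 0.05   # tiny random relabelling
    new[mutate] = rng.integers(0, particle.size, mutate.sum())
    return new

def parallel_pso(A, pop=20, iters=50, workers=4, seed=1):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    swarm = rng.integers(0, n, (pop, n))
    fit_fn = partial(modularity, A=A)
    with ProcessPoolExecutor(max_workers=workers) as ex:
        fits = np.array(list(ex.map(fit_fn, swarm)))       # parallel fitness evaluation
        pbest, pfits = swarm.copy(), fits.copy()
        gbest = swarm[fits.argmax()].copy()
        for _ in range(iters):
            swarm = np.array([move(p, pb, gbest, rng)
                              for p, pb in zip(swarm, pbest)])
            fits = np.array(list(ex.map(fit_fn, swarm)))   # parallel fitness evaluation
            better = fits > pfits
            pbest[better], pfits[better] = swarm[better], fits[better]
            gbest = pbest[pfits.argmax()].copy()
    return gbest, pfits.max()

if __name__ == "__main__":
    # Two 5-node cliques joined by a single edge (a toy stand-in for a biological network).
    A = np.zeros((10, 10))
    A[:5, :5] = 1; A[5:, 5:] = 1; A[4, 5] = A[5, 4] = 1
    np.fill_diagonal(A, 0)
    labels, q = parallel_pso(A)
    print("modularity:", round(q, 3), "partition:", labels)
```

Parallelising only the fitness evaluation is often the cheapest win, since the modularity computation tends to dominate the per-iteration cost on large networks.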
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including monetary, equipment, and manpower resources, which may become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study considers dealing with financial shortages by proposing a multi-project scheduling optimization model for profit maximization while minimizing the total project duration. A genetic algorithm optimization and finance-based scheduling are used to produce feasible schedules that balance the financing of activities at any time w
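A minimal sketch of the kind of fitness evaluation such a model implies: a chromosome of activity start days is scored on profit, penalised for total duration, and heavily penalised if the running cash balance exceeds a credit limit (the finance-based feasibility check). The activity data, weights, and credit limit below are invented for illustration only:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Activity:
    project: int
    duration: int       # working days
    daily_cost: float   # cash outflow per day while the activity runs
    payment: float      # owner payment received when the activity finishes

def fitness(start_days: List[int], acts: List[Activity],
            credit_limit: float = 100_000.0,
            duration_weight: float = 500.0,
            penalty: float = 1e6) -> float:
    horizon = max(s + a.duration for s, a in zip(start_days, acts))
    balance, worst_debt, total_profit = 0.0, 0.0, 0.0
    for day in range(horizon + 1):
        for s, a in zip(start_days, acts):
            if s <= day < s + a.duration:          # activity in progress: pay daily costs
                balance -= a.daily_cost
            if day == s + a.duration:              # activity finished: receive payment
                balance += a.payment
                total_profit += a.payment - a.duration * a.daily_cost
        worst_debt = min(worst_debt, balance)
    score = total_profit - duration_weight * horizon
    if -worst_debt > credit_limit:                 # finance-based feasibility check
        score -= penalty                           # heavily penalise infeasible schedules
    return score

if __name__ == "__main__":
    acts = [Activity(0, 10, 2_000, 30_000), Activity(1, 8, 3_000, 40_000)]
    print(fitness([0, 0], acts))   # both activities start immediately
    print(fitness([0, 10], acts))  # staggered starts ease the cash-flow burden
```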
This paper aims to determine the best parameter estimation methods for the parameters of the Gumbel type-I distribution under the type-II censoring scheme. For this purpose, classical and Bayesian parameter estimation procedures are considered. Maximum likelihood estimators are used for the classical parameter estimation procedure, and the asymptotic distributions of these estimators are also derived. Since it is not possible to obtain explicit solutions for the Bayesian estimators, Markov Chain Monte Carlo and Lindley techniques are taken into account to estimate the unknown parameters. In Bayesian analysis, it is very important to determine an appropriate combination of a prior distribution and a loss function. Therefore, two different
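For reference, the textbook form of the likelihood under type-II censoring, written here for a generic location-scale parameterisation of the Gumbel (type-I extreme value) distribution; this is the standard construction the maximum likelihood estimators start from, not a formula copied from the paper:

```latex
% Type-II censoring: only the first r of n ordered observations
% x_(1) <= ... <= x_(r) are observed; the remaining n - r items are censored at x_(r).
\begin{equation}
L(\mu,\sigma) \;=\; \frac{n!}{(n-r)!}
  \Bigl[\prod_{i=1}^{r} f\bigl(x_{(i)};\mu,\sigma\bigr)\Bigr]
  \bigl[1 - F\bigl(x_{(r)};\mu,\sigma\bigr)\bigr]^{\,n-r},
\end{equation}
% with (for the maximum form of the Gumbel distribution, location \mu and scale \sigma)
\begin{equation}
f(x;\mu,\sigma) \;=\; \frac{1}{\sigma}\,
  e^{-\frac{x-\mu}{\sigma}} \exp\!\Bigl(-e^{-\frac{x-\mu}{\sigma}}\Bigr),
\qquad
F(x;\mu,\sigma) \;=\; \exp\!\Bigl(-e^{-\frac{x-\mu}{\sigma}}\Bigr).
\end{equation}
```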