In this study, iron was coupled with copper through a biosynthetic route to form a bimetallic material, which was then used as a catalyst in Fenton-like processes for removing Direct Blue 15 dye (DB15) from aqueous solution. The resultant nanoparticles were characterized by SEM, BET, EDAX, FT-IR, XRD, and zeta potential measurements. The green-synthesized iron/copper nanoparticles (G-Fe/Cu NPs) were found to be rounded, spherical particles with sizes ranging from 32 to 59 nm and a surface area of 4.452 m2/g. The effect of different experimental factors was studied in both batch and continuous experiments. These factors were H2O2 concentration, G-Fe/Cu NPs amount, pH, initial DB15
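As background, Fenton-like systems of this kind are generally modelled on the classical iron-catalysed decomposition of hydrogen peroxide into hydroxyl radicals; the copper reaction is shown below only by analogy, as an assumption about the radical-generation chemistry rather than a scheme stated in the abstract.

```latex
% Classical Fenton chemistry assumed to underlie the Fenton-like process;
% the Cu(I)/Cu(II) step is included by analogy and is illustrative only.
\begin{align*}
  \mathrm{Fe^{2+} + H_2O_2} &\longrightarrow \mathrm{Fe^{3+} + OH^{-} + {}^{\bullet}OH}\\
  \mathrm{Cu^{+}  + H_2O_2} &\longrightarrow \mathrm{Cu^{2+} + OH^{-} + {}^{\bullet}OH}
\end{align*}
```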
True random number generators are essential components for securing the confidentiality of communications. In this paper, a new method is proposed to generate random sequences of numbers based on the difference of the arrival times of photons detected in a coincidence window between two single-photon counting modules
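The abstract only outlines the physical source, so the sketch below is a minimal illustration of how bits might be extracted from coincidence timing differences, with simulated Poissonian timestamps standing in for the two single-photon counting modules. The extraction rule (one bit from the sign of the time difference inside the window) and all numerical parameters are assumptions for illustration, not the paper's method.

```python
import numpy as np

def bits_from_timing_differences(t_a, t_b, window_ns=5.0):
    """Illustrative extraction of random bits from coincidence timing.

    t_a, t_b : arrays of photon arrival times (ns) from two detectors.
    A coincidence is counted when |t_a - t_b| falls inside the window,
    and one bit is emitted from the sign of the difference.
    (Hypothetical rule; the paper's actual extraction scheme is not
    detailed in the abstract.)
    """
    dt = t_a - t_b
    coincident = np.abs(dt) < window_ns
    return (dt[coincident] > 0).astype(np.uint8)

# Simulated Poissonian photon streams standing in for the two
# single-photon counting modules (hardware timestamps in a real setup).
rng = np.random.default_rng(0)
t_a = np.cumsum(rng.exponential(scale=100.0, size=100_000))   # ns
t_b = t_a + rng.normal(0.0, 2.0, size=t_a.size)               # correlated partner + jitter
bits = bits_from_timing_differences(t_a, t_b)
print(len(bits), bits[:16])
```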
Reverse Osmosis (RO) has already proved its worth as an efficient treatment method in chemical and environmental engineering applications. Various successful RO attempts at rejecting organic and highly toxic pollutants from wastewater can be found in the literature of the last decade. Dimethylphenol is classified as a highly toxic organic compound found ubiquitously in wastewater. It poses a real threat to humans and the environment even at low concentrations. In this paper, a model-based framework was developed for the simulation and optimisation of the RO process for the removal of dimethylphenol from wastewater. We incorporated our earlier developed and validated process model into the Species Conserving Genetic Algorithm (SCG
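For orientation, the sketch below shows a generic genetic-algorithm loop optimising a placeholder objective that stands in for the validated RO process model. It is not the Species Conserving Genetic Algorithm itself (which additionally preserves species seeds to retain multiple optima), and the decision variables, bounds, and cost function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def ro_cost(x):
    """Placeholder objective standing in for the validated RO process model
    (hypothetical variables x = [feed pressure, feed flow]); the real model
    is not reproduced here."""
    pressure, flow = x
    return (pressure - 12.0) ** 2 + 0.5 * (flow - 3.0) ** 2

def genetic_algorithm(cost, bounds, pop_size=40, generations=100, mutation=0.1):
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        # Tournament selection: keep the better of two randomly paired parents.
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((fitness[i] < fitness[j])[:, None], pop[i], pop[j])
        # Uniform crossover followed by Gaussian mutation, clipped to bounds.
        mask = rng.random(pop.shape) < 0.5
        children = np.where(mask, parents, parents[::-1])
        children += rng.normal(0.0, mutation, size=pop.shape) * (hi - lo)
        pop = np.clip(children, lo, hi)
    best = min(pop, key=cost)
    return best, cost(best)

bounds = np.array([[5.0, 20.0], [1.0, 6.0]])  # hypothetical operating ranges
print(genetic_algorithm(ro_cost, bounds))
```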
The aim of this research is to show the level of banking leadership's attitudes toward the importance of organizational confidence and its reflection in the dimensions of strategic position. The research was applied in a bank (Al-Mansour Investment, Business Bay), and a questionnaire was adopted as the tool for collecting data and information from a sample of (15) respondents distributed over the positions of Department Manager, Division Officer, and Unit Officer. The statistical program SPSS was used to calculate the standard deviation, arithmetic mean, percentages, regression analysis, F-test, coefficient of determination (R2), and chi-square coefficient. The research reached a number of conclusions, the mos
The catalytic activity of faujasite-type NaY catalysts prepared from local clay (kaolin) with different Si/Al ratios was studied using cumene cracking as a model reaction for the catalytic cracking process, in the temperature range of 450-525 °C, at a weight hourly space velocity (WHSV) of 5-20 h-1, a particle size of ≤75 μm, and atmospheric pressure. The catalytic activity was investigated in an experimental laboratory-scale fluidized bed reactor.
It was found that the cumene conversion increases with increasing temperature and decreasing WHSV. At 525 °C and a WHSV of 5 h-1, the conversion was 42.36 and 35.43 mol% for the catalyst with a Si/Al ratio of 3.54 and the catalyst with a Si/Al ratio of 5.75, respectively, while at 450 °C and the same WHSV, the conversion w
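For reference, the quantities quoted above follow the standard working definitions below; these are general usage, not expressions reproduced from the paper's methods section.

```latex
% Standard working definitions assumed for WHSV and conversion.
\[
  \mathrm{WHSV}\ (\mathrm{h^{-1}}) = \frac{\dot{m}_{\text{cumene feed}}\ (\mathrm{g\,h^{-1}})}{m_{\text{catalyst}}\ (\mathrm{g})},
  \qquad
  X_{\text{cumene}}\ (\mathrm{mol\%}) = \frac{n_{\text{cumene,in}} - n_{\text{cumene,out}}}{n_{\text{cumene,in}}} \times 100 .
\]
```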
Stars whose initial masses are between (0.89 - 8.0) M☉ go through an Asymptotic Giant Branch (AGB) phase at the end of their lives, having evolved from the main sequence to the AGB. The calculations, performed with an adopted synthetic model, showed the following results: 1- mass loss on the AGB consists of two regimes, one for periods P < 500 days and one for P > 500 days; 2- the mass loss rate increases exponentially with the pulsation period; 3- the expansion velocity V_AGB of our stars was calculated according to the three assumptions; 4- the terminal velocity depends on several factors such as metallicity and luminosity. The calculations indicated that a super wind (S.W) phase developed on the A
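A widely used prescription in synthetic AGB models with exactly this two-regime structure is that of Vassiliadis & Wood (1993); the abstract does not state which relations were adopted, so the expressions below are shown only as a representative example of the behaviour described, not as the paper's formulas.

```latex
% Representative two-regime AGB mass-loss law (Vassiliadis & Wood 1993);
% the paper's adopted relations may differ.
\[
  \log \dot{M}\ [M_\odot\,\mathrm{yr^{-1}}] = -11.4 + 0.0123\,P\,[\mathrm{days}],
  \qquad P < 500~\mathrm{d},
\]
\[
  \dot{M} = \frac{L}{c\,v_{\mathrm{exp}}},
  \qquad P \gtrsim 500~\mathrm{d}\ \ \text{(superwind regime)} .
\]
```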
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
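The abstract does not give the model details; as a concrete illustration of Gibbs-sampler-based variable selection, the sketch below implements a minimal spike-and-slab (SSVS-style) sampler in Python. The simulated data, priors, and hyperparameters (spike, slab, pi_incl, a0, b0) are assumptions for illustration and are not the paper's specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# --- Simulated data: only the first 3 of 8 predictors are active -----------
n, p = 200, 8
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0, 1.5, 0, 0, 0, 0, 0])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

# --- Spike-and-slab Gibbs sampler (SSVS-style sketch) -----------------------
spike, slab, pi_incl = 0.01, 10.0, 0.5     # assumed prior sd's and inclusion prob
a0, b0 = 2.0, 1.0                          # inverse-gamma prior on sigma^2
n_iter, burn = 4000, 1000

beta = np.zeros(p)
gamma = np.ones(p, dtype=int)
sigma2 = 1.0
gamma_draws = np.zeros((n_iter - burn, p))

XtX, Xty = X.T @ X, X.T @ y
for it in range(n_iter):
    # 1) beta | gamma, sigma2, y  ~  multivariate normal
    tau2 = np.where(gamma == 1, slab**2, spike**2)
    A = XtX / sigma2 + np.diag(1.0 / tau2)
    A_inv = np.linalg.inv(A)
    A_inv = (A_inv + A_inv.T) / 2          # symmetrise against round-off
    beta = rng.multivariate_normal(A_inv @ Xty / sigma2, A_inv)

    # 2) gamma_j | beta_j  ~  Bernoulli (spike vs. slab density ratio)
    like1 = pi_incl * stats.norm.pdf(beta, 0.0, slab)
    like0 = (1 - pi_incl) * stats.norm.pdf(beta, 0.0, spike)
    gamma = rng.binomial(1, like1 / (like1 + like0))

    # 3) sigma^2 | beta, y  ~  inverse gamma
    resid = y - X @ beta
    sigma2 = stats.invgamma.rvs(a0 + n / 2, scale=b0 + resid @ resid / 2,
                                random_state=rng)

    if it >= burn:
        gamma_draws[it - burn] = gamma

print("posterior inclusion probabilities:", gamma_draws.mean(axis=0).round(2))
```

Predictors with posterior inclusion probability near 1 are retained, which is how a Gibbs-based selector is typically read off; the true variables 1-3 should dominate in this simulated example.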
In this work, an analytical study was carried out to simulate a Fabry-Perot bistable etalon (F-P cavity) filled with a dispersive, optimized nonlinear optical material of the Kerr type, namely the semiconductor indium antimonide (InSb). Because of a trade-off between the etalon finesse and the driving terms, optimization procedures were applied to the InSb etalon/CO laser parameters using the critical switching irradiance (Ic) obtained from simulations of the optical cavity. In order to achieve the minimum switching power and a faster switching time, the effect of the finesse and driving-term parameters on optical bistability and switching dynamics must be studied.
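As background to the dispersive bistability studied here, the etalon transmission follows the Airy function, and the Kerr medium makes the round-trip phase depend on the intracavity irradiance, which is what produces the bistable response. The textbook relations below illustrate the mechanism; they are standard forms, not expressions quoted from the paper.

```latex
% Standard dispersive-bistability relations for a Kerr-filled Fabry-Perot etalon
% (textbook form; the paper's own parameterisation may differ).
\[
  T(\phi) = \frac{I_t}{I_i} = \frac{1}{1 + F \sin^2(\phi/2)}, \qquad
  F = \frac{4R}{(1-R)^2},
\]
\[
  \phi = \phi_0 + \frac{4\pi n_2 L}{\lambda}\, I_{\mathrm{cav}},
\]
% where I_cav is the intracavity irradiance; the feedback of T on I_cav
% yields the bistable switching characterised by the critical irradiance Ic.
```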
The current study examined the impact of using PowerPoint presentations on EFL students' attendance, achievement, and engagement. To achieve the aim of this study, three null hypotheses were posed: there is no statistically significant difference between the mean attendance score of the experimental group and that of the control group; there is no statistically significant difference between the mean achievement score of the experimental group and that of the control group; and there is no statistically significant difference between the mean engagement score of the experimental group and that of the control group. To verify these hypotheses, a sample of sixty students was chosen randomly from the third year, Department of English,
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to process big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
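As a concrete illustration of the Map-Reduce pattern referred to above, the self-contained Python sketch below emulates the mapper, shuffle/sort, and reducer stages in memory. On the Hadoop deployment described in the work these stages would run distributed across the cluster, and the per-channel mean amplitude used here is only a placeholder EEG analytic, not the study's actual processing pipeline.

```python
from collections import defaultdict
import random

# Toy stand-in for distributed EEG records: (channel_name, sample_value) pairs.
# In the study these would be read from file splits across a Hadoop cluster.
random.seed(0)
records = [(f"ch{c}", random.gauss(0.0, 50.0)) for c in range(4) for _ in range(1000)]

def mapper(record):
    """Emit (key, value) pairs; the key is the EEG channel and the value is
    (sample, 1) so the reducer can compute a mean (placeholder analytic)."""
    channel, sample = record
    yield channel, (sample, 1)

def reducer(key, values):
    """Aggregate all values sharing one key; here: per-channel mean amplitude."""
    total = sum(v for v, _ in values)
    count = sum(c for _, c in values)
    return key, total / count

# Shuffle/sort stage: group mapper output by key (handled by the framework in Hadoop).
groups = defaultdict(list)
for rec in records:
    for key, value in mapper(rec):
        groups[key].append(value)

results = dict(reducer(k, vs) for k, vs in groups.items())
print(results)
```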