Copper electrodeposition by an electrorefining process was carried out in acidic sulfate media containing 40 g/l of cupric ions and 160 g/l of sulfuric acid, in order to study the influence of the operating parameters on cathode purity, surface morphology, deposition rate, current efficiency, and power consumption. The operating parameters and their ranges were: current density (200, 300, and 400 A/m²), electrolyte temperature (35, 50, and 65 °C), electrode spacing (15, 30, and 45 mm), and electrolyte residence time (2, 4, and 6 h). XRF, SEM, and EDX analyses were performed to characterize the produced cathode.
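As context for two of the responses studied above, the Faraday's-law bookkeeping behind current efficiency and specific power consumption can be sketched as follows. All input values in the example are illustrative assumptions, not results from the paper.

```python
# Hedged sketch: Faraday's-law relations for current efficiency and
# specific power consumption in copper electrodeposition.
# All numeric inputs below are illustrative assumptions.

F = 96485.0    # Faraday constant, C/mol
M_CU = 63.55   # molar mass of copper, g/mol
Z = 2          # electrons per Cu2+ ion reduced

def theoretical_mass_g(current_a, time_s):
    """Copper mass (g) deposited at 100 % current efficiency."""
    return current_a * time_s * M_CU / (Z * F)

def current_efficiency(actual_mass_g, current_a, time_s):
    """Deposited mass as a fraction of the Faradaic maximum."""
    return actual_mass_g / theoretical_mass_g(current_a, time_s)

def specific_energy_kwh_per_kg(cell_voltage_v, efficiency):
    """Specific power consumption, kWh per kg of deposited copper."""
    return Z * F * cell_voltage_v / (3600.0 * M_CU * efficiency)

# Example: 1 A flowing for 1 h deposits about 1.186 g of Cu
# at 100 % current efficiency.
```

Higher current density raises the deposition rate but, through the cell voltage, also raises the specific energy term, which is the trade-off the study's operating-parameter ranges probe.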
ABSTRACT:
Microencapsulation is used to modify and retard drug release, as well as to overcome the unpleasant effects
(gastrointestinal disturbances) associated with repeated and excessive daily doses of ibuprofen.
Therefore, a newly developed microencapsulation method (a modified organic method) was utilized: a
modification of the aqueous colloidal polymer dispersion method, using ethylcellulose and sodium alginate as coating materials, to
prepare sustained-release ibuprofen microcapsules.
The effect of the core:wall ratio on the percent yield and encapsulation efficiency of the prepared microcapsules was low, whereas
the release of drug from the prepared microcapsules was affected by the core:wall ratio and the proportion of coa
The design of reinforced concrete spread foundations mainly depends on the soil bearing capacity, the loading value, and the column size, so each design case requires tedious and time-consuming calculations. In this paper, generalized design charts are presented and plotted according to derivations based on the ACI 318M-2019 Code. These charts can be used directly by structural designers to estimate the column size, foundation thickness, and dimensions, as well as the foundation reinforcement, under a given concentric load, assuming a uniformly distributed contact pressure underneath the foundation. Notably, these charts are oriented toward square isolated footings with a square concentric column, covering reasonable r
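The first step such charts encode can be sketched as below: sizing a square footing from a concentric service load and an allowable soil pressure, under the uniform-contact-pressure assumption stated above. The ACI 318M-19 thickness and reinforcement checks are not reproduced here; the numeric inputs are illustrative assumptions.

```python
# Hedged sketch: required plan size of a square isolated footing under a
# concentric load, assuming uniform contact pressure. Thickness and
# reinforcement design per ACI 318M-19 are NOT reproduced here.
import math

def square_footing_side(load_kn, allowable_pressure_kpa):
    """Side length (m) of a square footing so that q = P / B**2 <= q_allow."""
    area_m2 = load_kn / allowable_pressure_kpa  # required bearing area
    return math.sqrt(area_m2)

# Example (assumed values): a 1000 kN column load on soil with a
# 250 kPa allowable pressure needs a 2.0 m x 2.0 m footing.
```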
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, Bayesian networks, etc., and have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, in which some classes have many more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on the Over-Sampling (O.S.) technique for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
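For context, the standard SMOTE interpolation step that these variants build on can be sketched as follows; this is the classic algorithm, not the paper's "Improved SMOTE", whose modifications are not specified here.

```python
# Hedged sketch of the standard SMOTE step: each synthetic minority sample
# is an interpolation between a real minority sample and one of its k
# nearest minority-class neighbours. Not the paper's improved variant.
import random

def smote(minority, n_new, k=5, seed=0):
    """Generate n_new synthetic samples from a list of feature tuples."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x within the minority class (Euclidean)
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic
```

Because each new point lies on the segment between two real minority samples, oversampling enlarges the minority class without simply duplicating records, which is what reduces the classifier's bias toward the majority class.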
The study aims to build a proposed training program for school leaders in the Sultanate of Oman on the planning practices of the Kaufman model, in light of the needs and challenges of reality. It also aims to identify the challenges facing school leaders in practicing the stages of strategic planning. To achieve these objectives, the study adopted the descriptive approach, due to its suitability to the nature of the study, and a questionnaire was used to collect the needed data. The study sample included (225) individuals from among school principals, their assistants, and senior teachers in post-basic education in the Sultanate of Oman. After processing the data statistically, the study concluded that the reality of planning practices for school lea
A modified version of the Generalized Standard Addition Method (GSAM) was developed. This modified version was used for the quantitative determination of arginine (Arg) and glycine (Gly) in an arginine acetylsalicylate-glycine complex. According to this method, two linear equations were solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the Arg and Gly colored complexes with ninhydrin. The second equation was obtained by measuring the total acid consumed by the amino groups of Arg and Gly; the titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed metho
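The two-measurement scheme above reduces to a 2x2 linear system, which can be sketched as follows. All coefficient and measurement values are hypothetical placeholders, not the paper's calibration data.

```python
# Hedged sketch: solving the two linear equations of the modified GSAM
# for the amounts of Arg and Gly. Every numeric value below is an
# assumed placeholder, not data from the paper.

def solve_2x2(a11, a12, b1, a21, a22, b2):
    """Solve [[a11, a12], [a21, a22]] @ [x, y] = [b1, b2] by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("singular system: the two measurements are not independent")
    x = (b1 * a22 - a12 * b2) / det
    y = (a11 * b2 - b1 * a21) / det
    return x, y

# Eq. 1: total absorbance of the ninhydrin complexes
#   A_total = eps_arg * m_arg + eps_gly * m_gly   (hypothetical eps values)
# Eq. 2: total perchloric acid consumed by the amino groups
#   V_acid  = f_arg * m_arg + f_gly * m_gly       (hypothetical f values)
eps_arg, eps_gly = 0.8, 0.5      # absorbance per mg (assumed)
f_arg, f_gly = 0.04, 0.03        # ml titrant per mg (assumed)
A_total, V_acid = 1.55, 0.085    # measured totals (assumed)

m_arg, m_gly = solve_2x2(eps_arg, eps_gly, A_total, f_arg, f_gly, V_acid)
```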
Most medical datasets suffer from missing data, due to the expense of some tests or human faults while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, a specific type of method is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
In this paper, an efficient new procedure is proposed to modify the third-order iterative method obtained by Rostom and Fuad [Saeed, R. K. and Khthr, F. W. New third-order iterative method for solving nonlinear equations. J. Appl. Sci. 7 (2011): 916-921], using three steps based on the Newton equation, the finite difference method, and linear interpolation. An analysis of convergence is given to show the efficiency and performance of the new method for solving nonlinear equations. The efficiency of the new method is demonstrated by numerical examples.
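The general shape of such a scheme, a Newton predictor with a finite-difference derivative followed by a correction step, can be sketched as below. This is a generic Potra-Ptak-type composite for illustration only, not the exact three-step method of the paper.

```python
# Hedged sketch of a generic multi-step Newton-type iteration with a
# central finite-difference derivative. Illustrative only; NOT the
# exact scheme proposed in the cited paper.

def solve(f, x0, tol=1e-12, max_iter=100, h=1e-7):
    """Find a root of f near x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = (f(x + h) - f(x - h)) / (2 * h)  # central finite difference
        y = x - fx / dfx                       # Newton predictor step
        x = x - (fx + f(y)) / dfx              # Potra-Ptak-type corrector
    return x
```

Composing the Newton step with one extra function evaluation at the predicted point is a standard way to lift a second-order scheme to third order without evaluating higher derivatives.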
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit of the transmitted information has high priority, especially for information such as the address of the receiver. The ability to detect every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods are: 2D-Checksum me
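The classic baselines named above can be sketched as follows; the paper's novel 2D-Checksum combination itself is truncated in this abstract and is not reproduced here.

```python
# Hedged sketch of the baseline detectors discussed above: single parity,
# two-dimensional parity, and a simple modular checksum. The paper's
# proposed "2D-Checksum" method is not reproduced here.

def parity_bit(bits):
    """Even-parity bit: detects any odd number of flipped bits."""
    return sum(bits) % 2

def two_d_parity(block):
    """Row and column parity bits for a matrix of bits (2-D parity)."""
    rows = [parity_bit(r) for r in block]
    cols = [parity_bit(c) for c in zip(*block)]
    return rows, cols

def checksum(words, width=8):
    """Modular checksum: chosen so the padded sum wraps to zero."""
    return (-sum(words)) % (1 << width)
```

Single parity misses any even number of flips in one word; 2-D parity adds a second dimension so that many even-count patterns still disturb a row or column bit, which is the gap the combined methods aim to close further.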