A new concrete rheometer is introduced, covering its innovations, design, working principle, calibration, and reliability. A modified design of the Tattersall two-point device is developed. Some components are purchased from local and foreign markets, while the remaining components are fabricated locally. The matching-viscosity method of determining the mixer-viscometer constants is demonstrated and followed to relate torque and rotational speed to yield stress and plastic viscosity (the Bingham parameters). The calibration procedure and its calculations are explained. Water is used as the Newtonian fluid, while cement paste (cement + water) with a w/c ratio of 0.442 is used as the non-Newtonian fluid. The cement paste is tested at the Petroleum Research and Development Center with an OFITE Model 800 Viscometer. To verify the reliability of the new rheometer, an Artificial Neural Network (ANN) model with a carefully selected data bank is constructed, and sixteen mixes of Self-Compacting Concrete (SCC) are designed, mixed, and tested with the new rheometer. The predicted results from the model and the measured results from the experimental work show a very good degree of correlation and agreement, indicating that the new rheometer can be considered reliable.
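For a Bingham material in a mixer viscometer, torque varies linearly with rotational speed, T = G + H·N, where the intercept G maps to yield stress and the slope H to plastic viscosity once the mixer-viscometer constants are known. A minimal sketch of that linear fit on synthetic data (illustrative only, not the authors' calibration code):

```python
def fit_bingham(speeds, torques):
    """Ordinary least-squares fit of T = G + H*N.

    G (intercept) relates to yield stress and H (slope) to plastic
    viscosity via the mixer-viscometer constants.
    """
    n = len(speeds)
    mean_n = sum(speeds) / n
    mean_t = sum(torques) / n
    sxx = sum((s - mean_n) ** 2 for s in speeds)
    sxy = sum((s - mean_n) * (t - mean_t) for s, t in zip(speeds, torques))
    H = sxy / sxx            # slope
    G = mean_t - H * mean_n  # intercept
    return G, H

# Synthetic torque-speed data generated with G = 2.0, H = 0.5
speeds = [0.2, 0.4, 0.6, 0.8, 1.0]
torques = [2.0 + 0.5 * s for s in speeds]
G, H = fit_bingham(speeds, torques)
```

With noisy measured data the same fit applies; only the residuals change.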
Polyacrylonitrile (PAN) nanofibers (PANFs), made from a well-known polymer, have been extensively employed in the manufacture of carbon nanofibers (CNFs), which have recently gained substantial attention due to excellent features such as spinnability, environmental friendliness, and commercial feasibility. Because of its high carbon yield, its versatility in tailoring the final CNF structure, and the simple formation of ladder structures through nitrile polymerization to yield stable products, PAN has been the focus of extensive research as a potential production precursor for CNFs. For instance, the development of biomedical and high-performance composites has now become achievable. PAN homopolymer or PAN-based precursor copol…
This paper proposes a new structure for a Fractional Order Sliding Mode Controller (FOSMC) to control a Twin Rotor Aerodynamic System (TRAS). The new structure is constructed by defining two 3-dimensional sliding-mode surfaces for the TRAS model and introducing fractional-order derivative and integral terms in the state variables as well as in the control action. The parameters of the controller are determined so as to minimize the Integral of Time multiplied by Absolute Error (ITAE) performance index. Through comparison, this controller outperforms its integer-order counterpart in many specifications, such as reduced delay time, rise time, percentage overshoot, settling time, time to reach the sliding surface, and amplitude of chattering in the control inpu…
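The ITAE index used for tuning is the integral of time multiplied by the absolute error, ∫ t·|e(t)| dt, over the simulation horizon. A minimal trapezoid-rule approximation on sampled data (the function name and sampling scheme are assumptions for illustration):

```python
def itae(ts, errors):
    """Integral of Time multiplied by Absolute Error, trapezoid rule.

    ts     -- monotonically increasing sample times
    errors -- error signal e(t) at those times
    """
    total = 0.0
    for i in range(1, len(ts)):
        f0 = ts[i - 1] * abs(errors[i - 1])
        f1 = ts[i] * abs(errors[i])
        total += 0.5 * (f0 + f1) * (ts[i] - ts[i - 1])
    return total

# Constant unit error over [0, 1]: the integrand is t, so ITAE = 0.5
ts = [0.0, 0.5, 1.0]
errs = [1.0, 1.0, 1.0]
value = itae(ts, errs)
```

The time weighting penalizes errors that persist late in the response, which is why ITAE tuning tends to shorten settling time.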
The performance of a solar-assisted desiccant cooling system for a meeting hall located in the College of Engineering, University of Baghdad, was evaluated theoretically. The system was composed of four components: a solar air heater, a desiccant dehumidifier, a heat exchanger, and an evaporative cooler. A computer simulation was developed in MATLAB to assess the effect of various design and operating conditions on the performance of the system and its components. Actual weather data on recommended days were used to assess the load variation and the system performance during those days. The radiant time series (RTS) method was used to evaluate the hourly variation of the cooling load. Four operation modes were employed for perform…
Most medical datasets suffer from missing data, owing to the expense of some tests or to human error while recording them. This issue degrades the performance of machine-learning models because the values of some features are missing, so dedicated methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B…
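For context, the simplest imputation baseline fills each missing entry with its column mean; the paper instead searches for better fill-in values with SSA and scores them by classifier performance. A sketch of the baseline only (hypothetical helper, not the ISSA algorithm):

```python
def impute_mean(rows, missing=None):
    """Replace every `missing` entry with the mean of its column.

    rows -- list of equal-length feature rows; `missing` marks absent values.
    """
    cols = len(rows[0])
    means = []
    for j in range(cols):
        vals = [r[j] for r in rows if r[j] is not missing]
        means.append(sum(vals) / len(vals))
    return [
        [means[j] if r[j] is missing else r[j] for j in range(cols)]
        for r in rows
    ]

completed = impute_mean([[1.0, None], [3.0, 4.0]])
```

A metaheuristic such as SSA improves on this by treating the fill-in values as decision variables and optimizing them against a downstream objective.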
This study employs evolutionary optimization and Artificial Intelligence algorithms to determine an individual's age using a single facial image as the basis for the identification process. We used the WIKI dataset, widely considered the most comprehensive collection of facial images to date, including descriptions of age and gender attributes. Estimating age from facial images is a recent topic of study, even though much research has been undertaken on establishing chronological age from facial photographs. Retrained artificial neural networks are used for classification after applying preprocessing and optimization techniques to achieve this goal. It is possible that the difficulty of determining age could be reduce…
Computations of the relative permeability curves were made by representing them with two functions, one for the wetting phase and one for the non-wetting phase. Each function contains one parameter that controls the shape of the relative permeability curve. The values of these parameters are chosen to minimize an objective function, represented as a weighted sum of the squared differences between experimentally measured data and the corresponding data calculated by a mathematical model simulating the experiment. These data comprise the pressure drop across core samples and the recovery response of the displacing phase. Two mathematical models are constructed in this study to simulate incompressible, one-dimensional, two-phase flow. The first model d…
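The abstract does not name the one-parameter shape functions; a common choice of that kind is the Corey power-law form, so that form is an assumption here. The weighted least-squares objective over the measured responses (pressure drop and recovery) could be sketched as:

```python
def corey_krw(se, nw):
    # Wetting-phase relative permeability; Corey power law is an
    # assumed example of a one-parameter shape function.
    return se ** nw

def corey_krnw(se, nnw):
    # Non-wetting-phase curve, same one-parameter idea.
    return (1.0 - se) ** nnw

def objective(measured, simulated, weights):
    """Weighted sum of squared differences between measured responses
    (pressure drop, recovery) and those computed by the flow model."""
    return sum(w * (m - s) ** 2 for m, s, w in zip(measured, simulated, weights))
```

Minimizing `objective` over the two shape parameters, with the flow model regenerating `simulated` at each trial point, is the history-matching loop the abstract describes.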
Because image compression significantly reduces the volume of data, the requirement for it is permanent: compressed data can be transferred more quickly over communication channels and stored in less memory space. In this study, an efficient compression system is suggested; it depends on transform coding (the Discrete Cosine Transform or the bi-orthogonal tap-9/7 wavelet transform) combined with the LZW compression technique. The suggested scheme was applied to color and gray-scale models; the transform coding decomposes each color and gray sub-band individually. A quantization process is performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven stand…
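After transform coding and quantization, the coefficient stream is entropy-coded with LZW, which replaces repeated substrings with dictionary codes. A minimal LZW encoder over bytes (an illustrative sketch, not the paper's implementation):

```python
def lzw_encode(data):
    """Encode a bytes object as a list of integer LZW codes.

    The dictionary starts with all 256 single-byte strings and grows
    as longer repeated substrings are seen.
    """
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    out = []
    for b in data:
        wc = w + bytes([b])
        if wc in table:
            w = wc                  # extend the current match
        else:
            out.append(table[w])    # emit code for the longest match
            table[wc] = next_code   # learn the new substring
            next_code += 1
            w = bytes([b])
    if w:
        out.append(table[w])
    return out

codes = lzw_encode(b"ABABABA")
```

Repetitive quantized coefficient runs compress well because the dictionary quickly learns the recurring patterns.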
Choosing antimicrobials is a common dilemma when the expected rate of bacterial resistance is high. The observed resistance values in unequal groups of isolates tested with different antimicrobials can be misleading, and this can affect the decision to recommend one antibiotic over another. We analyzed recalled data with statistical consideration of the unequal sample groups. Data were collected on children suspected of having typhoid fever at Al Alwyia Pediatric Teaching Hospital in Baghdad, Iraq; the study period extended from September 2021 to September 2022. A novel algorithm was developed to compare drug sensitivity among unequal numbers of Salmonella typhi (S. Typhi) isolates tested with different antibacterials.
Breast cancer constitutes about one fourth of the registered cancer cases among the Iraqi population (1) and is the leading cause of death among Iraqi women (2). Each year more women are exposed to the vicious ramifications of this disease, which include death if left unmanaged, or the negative cosmetic and psychological sequelae they would experience after undergoing radical mastectomy. The World Health Organization (WHO) has documented that early detection and screening, when coupled with adequate therapy, could reduce breast cancer mortality, noting that the low survival rates in less developed countries, including Iraq, are mainly attributable to the lack of early detection programs couple…
In data transmission, a change in a single bit of the received data may lead to misunderstanding or to a disaster. Each bit in the sent information has a high priority, especially for information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the field of data transmission.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me…
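A two-dimensional parity scheme, one of the building blocks the suggested methods extend, computes a parity bit for every row and every column of a bit matrix; any mismatch on receipt flags an error, and a single flipped bit is located by its failing row and column. A minimal sketch (the bit-matrix layout is an assumption for illustration):

```python
def parity_2d(block):
    """Even-parity bits for each row and each column of a bit matrix."""
    row_par = [sum(row) % 2 for row in block]
    col_par = [sum(col) % 2 for col in zip(*block)]
    return row_par, col_par

def detect_error(block, row_par, col_par):
    """True if the received block disagrees with the transmitted parities."""
    r, c = parity_2d(block)
    return r != row_par or c != col_par

# Sender computes parities; receiver recomputes and compares.
sent = [[1, 0, 1],
        [0, 1, 1]]
rp, cp = parity_2d(sent)

corrupted = [[1, 0, 1],
             [0, 0, 1]]  # one bit flipped in row 1, column 1
```

Unlike a single parity bit, this catches the even-error patterns that defeat ordinary parity whenever the flips do not cancel in every affected row and column.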