The purpose of the current study is to analyze the content of computer textbooks for the intermediate stage in Iraq according to the theory of multiple intelligences, by answering the following question: "What is the percentage of availability of multiple intelligences in the content of the computer textbooks for the intermediate stage (grades I, II) for the academic year 2017-2018?" The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of registration. The research tool was prepared according to Gardner's classification of multiple intelligences, and its validity and reliability were established. The study found that the percentages of multiple intelligences in the content of the computer textbooks for the intermediate stage (grades I, II) were (40%) and (59%) respectively, and (66.67%) collectively.
Rutting is a crucial concern impacting the stability and long-term performance of asphalt concrete pavements, negatively affecting vehicle drivers' comfort and safety. This research aims to evaluate the permanent deformation of pavement under different traffic and environmental conditions using an Artificial Neural Network (ANN) prediction model. The model was built based on the outcomes of an experimental uniaxial repeated loading test of 306 cylindrical specimens. Twelve independent variables representing the materials' properties, mix design parameters, loading settings, and environmental conditions were implemented in the model, resulting in a total of 3214 data points. The network achieved high prediction accuracy with an R
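The setup described above, twelve input variables feeding a neural network that predicts a deformation value, can be sketched as follows. This is not the authors' model: the data here is synthetic, and the layer sizes, scaling step, and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch of an ANN regression with 12 input features, as in the
# abstract. Synthetic data; architecture and hyperparameters are assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((300, 12))                   # 12 independent variables (stand-ins)
y = X @ rng.random(12) + 0.05 * rng.standard_normal(300)  # synthetic target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)      # standardize inputs before training

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)
r2 = model.score(scaler.transform(X_test), y_test)  # R^2 on held-out data
print(round(r2, 3))
```

In practice the authors' accuracy figure would come from their 3214 experimental data points, not from synthetic data as here.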
The current research aims to determine the levels of intellectual security and psychological resilience among secondary school students and how these two variables are related to each other. The study also examines the extent to which psychological resilience contributes to intellectual security.
The research sample consisted of (420) students from the secondary stage in the Directorate of Education of Baghdad / Rusafa III. Two scales were administered to the participants to collect the needed data. Pearson's correlation coefficient, the t-test, and regression analysis were employed to analyze the data. The results revealed:
- The members of the sample possess intellectual security.
- The members of the sample have
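The analyses named above, a Pearson correlation between the two scale scores and a regression of one on the other, can be sketched in a few lines. The data below is synthetic (the real study used 420 students' scale scores), and the effect size is an arbitrary assumption.

```python
# Sketch of the named analyses on synthetic data: Pearson correlation
# between resilience and security scores, then a simple linear regression.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
resilience = rng.normal(50, 10, 420)                 # 420 hypothetical scores
security = 0.6 * resilience + rng.normal(0, 8, 420)  # correlated by construction

r, p = stats.pearsonr(resilience, security)          # strength of association
result = stats.linregress(resilience, security)      # contribution via slope
print(round(r, 2), round(result.slope, 2))
```

A significant positive slope is what would support the claim that resilience contributes to intellectual security.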
The aim of this paper is to estimate the concentrations of some heavy metals along the Mohammed AL-Qassim Highway in Baghdad city at different distances, using polynomial interpolation for functions passing through the data, implemented in MATLAB. The soil samples in this study were taken from the surface layer (0-25 cm depth) on both sides of the road, at four distances (1.5, 10, 25 and 60 m) on each side. Using this method, the concentrations of heavy metals in the soil can be estimated at any distance and time without laboratory work, so the method reduces the time, effort and cost of laboratory analyses.
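The interpolation idea is simple to illustrate: with measurements at four distances, a degree-3 polynomial passes exactly through all four points and can then be evaluated anywhere in between. The concentration values below are hypothetical, not the paper's data, and Python stands in for the MATLAB implementation.

```python
# Polynomial interpolation through measurements at the four sampling
# distances used in the study (1.5, 10, 25, 60 m). Values are made up.
import numpy as np

distances = np.array([1.5, 10.0, 25.0, 60.0])   # m from the road edge
conc = np.array([85.0, 52.0, 30.0, 12.0])       # hypothetical Pb, mg/kg

# A degree (n-1) polynomial through n points interpolates them exactly.
coeffs = np.polyfit(distances, conc, deg=len(distances) - 1)
poly = np.poly1d(coeffs)

print(poly(17.0))   # estimated concentration at an unsampled distance
```

Note that interpolation is only reliable inside the sampled range (1.5 to 60 m); extrapolating a cubic beyond it can diverge badly.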
Two locally isolated microalgae (Chlorella vulgaris Bejerinck and Nitzschia palea (Kützing) W. Smith) were used in the current study to test their ability to produce biodiesel when stimulated by different nitrogen concentration treatments (0, 2, 4, 8 g/l), and to examine the effect of nitrogen concentration on the quantity of primary products (carbohydrates, proteins) as well as the quantity and quality of lipids. The results revealed that nitrogen starvation led to high lipid yields: in C. vulgaris and N. palea, the lipid content increased from 6.6% to 40% and from 40% to 60% of dry weight (DW), respectively. Also, in C. vulgaris, the highest carbohydrate content was 23% of DW in the zero-nitrate medium and the highest protein content was 50% of DW in the 8 g/l treatment. Whil
This paper presents a novel method for solving nonlinear optimal control problems of the regular type via their equivalent two-point boundary value problems using the non-classical
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of both supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches have been conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results have demonstrated the advantages and usefulness of the proposed methods in feature subset selection in high-dimensional space in terms of the number of selected features and time spe
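The core combination step, taking the union or intersection of the feature sets chosen by a supervised filter and an unsupervised filter before any wrapper search, can be sketched as below. The specific filters (ANOVA F-score and a variance threshold), the value of k, and the dataset are assumptions; the paper's actual filter choices are not stated in this excerpt.

```python
# Sketch of the hybrid filter combination: merge the selections of a
# supervised and an unsupervised filter by union or intersection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, VarianceThreshold

X, y = make_classification(n_samples=200, n_features=50, n_informative=8,
                           random_state=0)

# Supervised filter: top-k features by ANOVA F-score (uses labels y).
sup = SelectKBest(f_classif, k=15).fit(X, y)
sup_idx = set(np.flatnonzero(sup.get_support()))

# Unsupervised filter: keep higher-variance features (ignores labels).
unsup = VarianceThreshold(threshold=np.median(X.var(axis=0))).fit(X)
unsup_idx = set(np.flatnonzero(unsup.get_support()))

union_idx = sorted(sup_idx | unsup_idx)   # combination by union
inter_idx = sorted(sup_idx & inter) if False else sorted(sup_idx & unsup_idx)
print(len(union_idx), len(inter_idx))
```

A wrapper (e.g., sequential search scored by SVM, LDA, or KNN, as in the paper) would then search only within the combined subset, which is what keeps the wrapper's cost low in high-dimensional spaces.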
Studying abiotic stress-induced genes may help in understanding the response and adaptability of plants to salinity and drought stresses. Differential display reverse transcriptase polymerase chain reaction (DDRT-PCR) was used to investigate differences in gene expression between drought- and salinity-stressed plantlets of Ruta graveolens. Genes responsive to direct and stepwise exposure to drought or salt were screened in R. graveolens plantlets using the DDRT technique. Gene expression was investigated in both the control and the salt- or drought-stressed plantlets, and differential banding patterns with different molecular sizes were observed using the primers OPA-01 (646, 770 and 983 bp), OPA-08 (593 and 988 bp), OPA-11 (674 and 831 bp
The gravity method measures relatively noticeable variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the previous Bouguer map of gravity surveys (conducted in 1940-1950) of the last century, for selected areas in the south-western desert of Iraqi territory within the administrative boundaries of the provinces of Najaf and Anbar. Based on the theory of gravity inversion, where gravity values can be related to density-contrast variations with depth, gravity data inversion is utilized to calculate density and velocity models from four selected depth slices: 9.63 km, 1.1 km, 0.682 km and 0.407 km.
To ensure fault tolerance and distributed management, distributed protocols are employed as one of the major architectural concepts underlying the Internet. However, their inefficiency, instability and fragility could potentially be overcome with the help of a novel networking architecture called software-defined networking (SDN). The main property of this architecture is the separation of the control and data planes. To reduce congestion and thus improve latency and throughput, the traffic load must be distributed homogeneously over the different network paths. This paper presents a smart flow steering agent (SFSA) for data flow routing based on current network conditions. To enhance throughput and minimize latency, the SFSA distrib
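The load-spreading idea described above can be illustrated with a toy least-loaded-path policy. This is not the SFSA itself (whose steering logic is not given in this excerpt), just a minimal sketch of assigning each new flow to the currently least-loaded of several candidate paths so traffic stays homogeneously distributed.

```python
# Toy flow steering: send each new flow down the least-loaded path, so
# load spreads evenly and no single path becomes a congestion point.
paths = {"p1": 0.0, "p2": 0.0, "p3": 0.0}   # path -> current load (arbitrary units)

def steer(flow_demand):
    """Pick the currently least-loaded path and account for the new flow."""
    best = min(paths, key=paths.get)
    paths[best] += flow_demand
    return best

for demand in [3, 5, 2, 4, 1]:              # incoming flow demands
    steer(demand)

print(max(paths.values()) - min(paths.values()))  # small spread -> balanced
```

A real SDN controller would additionally base the choice on measured link state (as the SFSA does on "current network conditions") rather than on its own bookkeeping alone.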