This study aims to analyze the content of preparatory-stage computer textbooks with respect to logical thinking. The researcher followed the descriptive analytical approach (content analysis) and adopted the explicit idea as the unit of analysis. A content analysis tool, designed around the mental processes employed during logical thinking, was used to obtain the study results. The findings revealed that logical thinking skills constituted 52% of the content of the fourth preparatory textbook and 47% of the fifth preparatory textbook.
Two locally isolated microalgae, Chlorella vulgaris Beijerinck and Nitzschia palea (Kützing) W. Smith, were used in the current study to test their ability to produce biodiesel when stimulated by different nitrogen concentrations (0, 2, 4 and 8 g/L), and to assess the effect of nitrogen concentration on the quantity of the primary products (carbohydrate, protein) as well as the quantity and quality of lipid. The results revealed that nitrogen starvation led to high lipid yields: in C. vulgaris and N. palea the lipid content increased from 6.6% to 40% and from 40% to 60% of dry weight (DW), respectively. In C. vulgaris, the highest carbohydrate content was 23% of DW in the zero-nitrate medium, and the highest protein content was 50% of DW in the 8 g/L treatment.
This paper presents a novel method for solving nonlinear optimal control problems of the regular type via their equivalent two-point boundary value problems using a non-classical approach.
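As an illustrative sketch only (the toy problem below is assumed, not taken from the paper), the standard reduction of an optimal control problem to its two-point boundary value problem can be set up and solved numerically like this:

```python
# Illustrative sketch (assumed toy problem, not the paper's non-classical method):
# reduce an optimal control problem to its two-point boundary value problem (TPBVP)
# via Pontryagin's conditions and solve it with SciPy.
# Problem: minimise J = integral_0^1 (x^2 + u^2) dt subject to x' = u, x(0) = 1, x(1) free.
# Pontryagin gives u* = -lam/2, so the TPBVP is x' = -lam/2, lam' = -2x, x(0) = 1, lam(1) = 0.
import numpy as np
from scipy.integrate import solve_bvp

def rhs(t, y):
    # y[0] = state x, y[1] = costate lam
    return np.vstack((-y[1] / 2.0, -2.0 * y[0]))

def bc(ya, yb):
    # x(0) = 1 and the free-endpoint (transversality) condition lam(1) = 0
    return np.array([ya[0] - 1.0, yb[1]])

t = np.linspace(0.0, 1.0, 50)
sol = solve_bvp(rhs, bc, t, np.zeros((2, t.size)))
print("x(1) =", sol.y[0, -1])        # ~ 1/cosh(1) = 0.648 for this toy problem
print("u(0) =", -sol.y[1, 0] / 2.0)  # optimal control at t = 0, ~ -tanh(1)
```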
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of supervised and unsupervised filter approaches before applying a wrapper, aiming to obtain low-dimensional feature sets with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are the support vector machine (SVM), linear discriminant analysis (LDA) and K-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional spaces in terms of the number of selected features and time spent.
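As a hedged illustration of this filter-then-wrapper idea (the specific filters, synthetic data and parameter values below are assumptions, not the paper's exact pipeline), one possible realisation with scikit-learn:

```python
# Hypothetical sketch of a hybrid filter-then-wrapper pipeline (assumed filters):
# 1) supervised filter (ANOVA F-score), 2) unsupervised filter (variance ranking),
# 3) combine by union or intersection, 4) wrapper search with an SVM evaluator.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, SequentialFeatureSelector
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=500, n_informative=20, random_state=0)

k = 50  # features kept by each filter (assumed value)
sup_idx = set(SelectKBest(f_classif, k=k).fit(X, y).get_support(indices=True))  # supervised
uns_idx = set(np.argsort(np.var(X, axis=0))[-k:])                               # unsupervised

for name, idx in [("union", sup_idx | uns_idx), ("intersection", sup_idx & uns_idx)]:
    cols = np.sort(list(idx))
    if len(cols) < 2:
        continue  # the intersection can be very small or empty on some data
    # Wrapper stage: greedy forward selection, scored by cross-validated SVM accuracy
    sfs = SequentialFeatureSelector(SVC(kernel="linear"),
                                    n_features_to_select=min(10, len(cols) - 1),
                                    direction="forward", cv=3).fit(X[:, cols], y)
    print(name, "-> final features:", cols[sfs.get_support()])
```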
Abiotic stress-induced genes may help in understanding plant responses and adaptability to salinity and drought stresses. Differential display reverse transcriptase polymerase chain reaction (DDRT-PCR) was used to investigate the differences in gene expression between drought- and salinity-stressed plantlets of Ruta graveolens. Drought- and salt-responsive genes were screened in R. graveolens plantlets under direct and stepwise exposure using the DDRT technique. Gene expression was investigated in both the control and the salt- or drought-stressed plantlets, and differential banding patterns with different molecular sizes were observed using the primers OPA-01 (646, 770 and 983 bp), OPA-08 (593 and 988 bp) and OPA-11 (674 and 831 bp).
The gravity method measures relatively small variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the earlier Bouguer map compiled from gravity surveys conducted between 1940 and 1950, for selected areas of the south-western desert of Iraq within the administrative boundaries of the Najaf and Anbar provinces. Based on the theory of gravity inversion, whereby gravity values can be translated into density-contrast variations with depth, gravity data inversion is used to calculate density and velocity models at four selected depth slices: 9.63 km, 1.1 km, 0.682 km and 0.407 km.
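For intuition only (this is the textbook infinite-slab approximation, not the authors' inversion technique, and the anomaly value is hypothetical), the link between a gravity anomaly and a density contrast can be sketched as:

```python
# Back-of-envelope sketch (textbook infinite-slab approximation, not the authors'
# inversion): relate a gravity anomaly to the density contrast of a slab of
# thickness t via dg = 2 * pi * G * rho * t.
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_density_contrast(delta_g_mgal: float, thickness_m: float) -> float:
    """Density contrast (kg/m^3) of an infinite slab producing the given anomaly."""
    delta_g = delta_g_mgal * 1e-5  # 1 mGal = 1e-5 m/s^2
    return delta_g / (2.0 * math.pi * G * thickness_m)

# Hypothetical 5 mGal anomaly assigned to a 407 m thick slice (depth value from the abstract)
print(slab_density_contrast(5.0, 407.0))  # about +293 kg/m^3
```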
To ensure fault tolerance and distributed management, distributed protocols are employed as one of the major architectural concepts underlying the Internet. However, their inefficiency, instability and fragility could potentially be overcome with the help of a novel networking architecture called software-defined networking (SDN), whose main property is the separation of the control and data planes. To reduce congestion, and thus improve latency and throughput, the traffic load must be distributed homogeneously over the different network paths. This paper presents a smart flow steering agent (SFSA) for data flow routing based on current network conditions. To enhance throughput and minimize latency, the SFSA distributes data flows across the available network paths.
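As a hypothetical sketch of what such flow steering can look like (the path and load data below are invented; this is not the paper's SFSA implementation), a least-bottleneck path chooser:

```python
# Hypothetical flow-steering sketch (not the paper's SFSA): for each new flow,
# pick the candidate path whose most-utilised link (the bottleneck) is least loaded.
from typing import Dict, List

def steer_flow(paths: Dict[str, List[str]], link_load: Dict[str, float]) -> str:
    """Return the name of the path with the least-loaded bottleneck link."""
    def bottleneck(links: List[str]) -> float:
        return max(link_load[l] for l in links)
    return min(paths, key=lambda p: bottleneck(paths[p]))

# Example: two candidate paths between the same pair of hosts (loads are fractions of capacity)
paths = {"p1": ["s1-s2", "s2-s4"], "p2": ["s1-s3", "s3-s4"]}
link_load = {"s1-s2": 0.80, "s2-s4": 0.30, "s1-s3": 0.40, "s3-s4": 0.55}
print(steer_flow(paths, link_load))  # -> "p2" (bottleneck 0.55 < 0.80)
```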
The aim of this paper is to find the Bayes estimator under a new loss function that combines symmetric and asymmetric loss functions, namely the proposed entropy loss function, which merges the entropy loss function with the squared-log error loss function and is quite asymmetric in nature. The Bayes estimators of the exponential distribution under the proposed function are then compared with those under its ingredient loss functions, using the mean square error (MSE) and the bias quantity (Mbias); the random data are generated by simulation to estimate the exponential distribution parameter for different sample sizes (n = 10, 50, 100) with (N = 1000) replications.
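A hedged sketch of this kind of simulation study follows. It uses the classical entropy loss and squared-error loss rather than the paper's proposed merged loss, and the true parameter and prior hyper-parameters are assumed values:

```python
# Hedged Monte Carlo sketch (not the paper's exact proposed loss): Exponential(theta)
# data with a Gamma(a, b) prior on the rate theta, so the posterior is
# Gamma(a + n, b + sum(x)). Bayes estimators under squared-error loss (posterior mean)
# and under the classical entropy loss are compared via MSE and Mbias.
import numpy as np

rng = np.random.default_rng(0)
theta, a, b, N = 2.0, 1.0, 1.0, 1000  # true rate and prior hyper-parameters (assumed)

for n in (10, 50, 100):
    est_self, est_ent = np.empty(N), np.empty(N)
    for i in range(N):
        x = rng.exponential(scale=1.0 / theta, size=n)
        alpha, beta = a + n, b + x.sum()   # posterior Gamma(shape, rate)
        est_self[i] = alpha / beta         # Bayes under squared-error loss
        est_ent[i] = (alpha - 1.0) / beta  # Bayes under entropy loss: 1 / E[1/theta | x]
    for name, est in (("SELF", est_self), ("entropy", est_ent)):
        print(f"n={n:4d} {name:8s} MSE={np.mean((est - theta) ** 2):.4f} "
              f"Mbias={np.mean(est - theta):+.4f}")
```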
Survival analysis is one of the modern methods of analysis; it is based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time and a non-parametric function that does depend on the survival times, which is why the Cox model is defined as a semi-parametric model, in contrast to the class of fully parametric models that depend on the parameters of the time-to-event distribution.
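Concretely, the Cox hazard factorises as h(t | x) = h0(t) * exp(b'x), where the baseline hazard h0(t) is the non-parametric, time-dependent part and exp(b'x) is the parametric, time-free part. A minimal fitting sketch with the lifelines package (its bundled Rossi recidivism dataset is used purely for illustration, not data from this study):

```python
# Minimal Cox proportional-hazards sketch with lifelines (illustrative data only).
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()  # example time-to-event data shipped with lifelines
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")  # survival time and event indicator
cph.print_summary()  # estimated coefficients b; h0(t) is left non-parametric
```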