New trends in teaching and learning theory are regarded as a theoretical axis from which a background emerges, one that rests on a given source, practice sample, or teaching plan, and whose accuracy and simplicity prevent the development of the teaching process. Many attempts have appeared on the scene to illuminate this teaching background, but they have not gone beyond those familiar patterns and methods. Thus, the emergence of a theory of teaching has been hindered.
This has created the need for research and development in the field of teaching in order to arrive at a specific teaching theory consistent with modern trends and concepts.
Teaching is regarded as a human process that aims at helping those who wish to acquire knowledge, since teaching is an intentional activity. Learning, in turn, is the process by which the person who wishes to learn acquires knowledge, skills, and attitudes for himself or herself. Accordingly, learning is a principal branch of teaching, because it is one of the various means by which the teaching process is carried out. From this fact arises the need for a sound theory to guide subsequent research; the value of such a theory depends largely on the studies and research it produces and on how well it helps the researcher find a path that directs him toward discovering new aspects.
* The Research Purpose:
This research aims at identifying the new trends and methods in the theory of teaching and learning.
* The Research Boundaries:
This research is limited to: examining the theory of teaching and learning and drawing a comparison between them; identifying the features and methods of building a teaching theory; and delimiting the role played by theory in the teaching and learning process.
* Specifying Terminology:
Terms related to teaching and learning are defined within the research itself.