A three-stage learning algorithm for a deep multilayer perceptron (DMLP), with effective weight initialisation based on a sparse auto-encoder, is proposed in this paper. It aims to overcome the difficulty of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while the weights of its feature extraction layers, obtained at the first stage, are kept fixed. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures and the values of learning parameters are determined through cross-validation, and test datasets unseen in the cross-validation are used to evaluate the performance of the DMLP trained with the three-stage learning algorithm. Experimental results show that the proposed method is effective in combating overfitting when training deep neural networks.
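As a rough illustration of the three stages, the following minimal NumPy sketch pre-trains a single hidden layer as a sparse auto-encoder, trains the output layer with that layer frozen, and then fine-tunes all weights. The toy data, network sizes, sparsity target and learning rates are hypothetical stand-ins, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 2-D inputs with XOR-like labels, a hypothetical stand-in
# for the paper's limited, high-dimensional training data.
X = rng.uniform(-1, 1, size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)
m = len(X)

n_in, n_hid, n_out = 2, 8, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid));  b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

# Stage 1: unsupervised sparse auto-encoder pre-training of W1.
# The decoder (Wd, bd) is discarded afterwards; rho is the sparsity target.
Wd = rng.normal(0, 0.5, (n_hid, n_in)); bd = np.zeros(n_in)
rho, beta, lr = 0.1, 0.1, 0.5
for _ in range(300):
    H = sigmoid(X @ W1 + b1)                     # encoder activations
    Xr = H @ Wd + bd                             # linear decoder
    rho_hat = H.mean(axis=0)                     # mean activation per unit
    d_out = (Xr - X) / m                         # reconstruction gradient
    d_sparse = beta * (-rho / rho_hat + (1 - rho) / (1 - rho_hat)) / m
    dZ1 = (d_out @ Wd.T + d_sparse) * H * (1 - H)
    Wd -= lr * H.T @ d_out; bd -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ dZ1;   b1 -= lr * dZ1.sum(axis=0)

def train_supervised(epochs, update_w1):
    global W1, b1, W2, b2
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)
        P = sigmoid(H @ W2 + b2)
        dZ2 = (P - y) / m                        # sigmoid + cross-entropy
        dZ1 = (dZ2 @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dZ2; b2 -= lr * dZ2.sum(axis=0)
        if update_w1:                            # stage 3 refines all weights
            W1 -= lr * X.T @ dZ1; b1 -= lr * dZ1.sum(axis=0)

train_supervised(300, update_w1=False)   # Stage 2: feature layer frozen
train_supervised(1500, update_w1=True)   # Stage 3: refine all weights

P = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
acc = float(((P > 0.5) == (y > 0.5)).mean())
print(f"training accuracy after three-stage learning: {acc:.2f}")
```

In a real DMLP the first stage would be applied layer by layer to several feature extraction layers rather than to a single hidden layer.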
This paper aims to improve the voltage profile at all weak buses of the power system in the Kurdistan Region using the Static Synchronous Compensator (STATCOM). The Power System Simulation for Engineers (PSS/E) software, version 33.0, was used to apply the Newton-Raphson (NR) method. All bus voltages were recorded and compared against the Kurdistan region grid index (0.95 ≤ V ≤ 1.05); the power system was simulated to find the optimal size and suitable location of the STATCOM for bus voltage improvement at the weakest buses. The results show that the Soran and New Koya substations are the best placements for adding STATCOM, with sizes of 20 MVAR and 40 MVAR. After adding STATCOM with the sizes [20 MVAR and 40 MV
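The NR power-flow iteration that PSS/E applies can be illustrated, in much-simplified form, on a single two-bus feeder. The line impedance and load below are invented for illustration only and are unrelated to the Kurdistan grid data:

```python
import numpy as np

# One slack bus feeding one PQ bus over a line with impedance z (p.u.).
# All values below are hypothetical.
z = 0.02 + 0.08j
y = 1 / z
Ybus = np.array([[y, -y], [-y, y]])
G, B = Ybus.real, Ybus.imag

V1, th1 = 1.0, 0.0               # slack bus: fixed voltage and angle
P2_spec, Q2_spec = -0.5, -0.2    # PQ-bus injections (negative = load)

def mismatch(x):
    """Active/reactive power mismatch at the PQ bus for state x = (th2, V2)."""
    th2, V2 = x
    V, th = np.array([V1, V2]), np.array([th1, th2])
    P2 = V2 * sum(V[k] * (G[1, k] * np.cos(th2 - th[k]) +
                          B[1, k] * np.sin(th2 - th[k])) for k in range(2))
    Q2 = V2 * sum(V[k] * (G[1, k] * np.sin(th2 - th[k]) -
                          B[1, k] * np.cos(th2 - th[k])) for k in range(2))
    return np.array([P2 - P2_spec, Q2 - Q2_spec])

x = np.array([0.0, 1.0])         # flat start: th2 = 0, V2 = 1 p.u.
for _ in range(20):              # Newton-Raphson iterations
    f = mismatch(x)
    if np.max(np.abs(f)) < 1e-10:
        break
    J = np.zeros((2, 2)); h = 1e-7   # numerical Jacobian for brevity
    for j in range(2):
        xp = x.copy(); xp[j] += h
        J[:, j] = (mismatch(xp) - f) / h
    x -= np.linalg.solve(J, f)

th2, V2 = x
print(f"bus-2 voltage: {V2:.4f} p.u., inside 0.95-1.05 index: {0.95 <= V2 <= 1.05}")
```

A STATCOM at the weak bus would be modelled by adding a controllable reactive injection to `Q2_spec` and re-solving until the bus voltage re-enters the index.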
The study aims at investigating the effectiveness of virtual library technology in developing achievement in English language skills at the Center of Development and Continuous Education, in comparison with individual learning via personal computer, and at investigating the students' attitudes towards the use of both approaches. The population of the study includes the participants in the English language course arranged at the Center. The sample includes 60 students randomly chosen from the whole population (participants in English courses for the year 2009-2010) and divided into two experimental groups. The first group has learned through classroom technology, while the other group has l
The effects of T-shaped fins on the improvement of phase change material (PCM) melting are numerically investigated in a vertical triple-tube storage containment. The PCM is held in the middle pipe of a triple-pipe heat exchanger, while the heat transfer fluid flows through the internal and external pipes. The effects of the T-shaped fins' dimensions on the melting process of the PCM are investigated to determine the optimum case. Results indicate that while T-shaped fins improve the melting performance of the PCM, the improvement potential is mainly governed by the fin's body rather than its head. Hence, the proposed T-shaped fin did not noticeably improve melting at the bottom of the PCM domain; additionally, a flat fin is ad
A hexapod robot is a flexible mechanical robot with six legs that has the ability to walk over terrain. The hexapod robot resembles an insect, so it uses the same gaits: the tripod, wave and ripple gaits. The hexapod robot needs to remain statically stable at all times during each gait, with three or more legs continuously in contact with the ground, in order not to fall. The margin of safe static stability during walking is called the stability margin. In this paper, the forward and inverse kinematics are derived for each of the hexapod's legs in order to simulate the hexapod robot model walking, using MATLAB R2010a, for all gaits, and the geometry is used to derive the equations of the sub-constraint workspaces for each
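For a standard three-joint (coxa, femur, tibia) hexapod leg, the forward and inverse kinematics take the following generic textbook form; the link lengths are hypothetical and the derivation is not the paper's exact model:

```python
import numpy as np

# Hypothetical link lengths (coxa, femur, tibia) in mm; the paper's
# actual leg geometry is not reproduced here.
L1, L2, L3 = 30.0, 60.0, 90.0

def forward(theta1, theta2, theta3):
    """Foot position of a 3-DOF leg, expressed in the leg's base frame."""
    r = L1 + L2 * np.cos(theta2) + L3 * np.cos(theta2 + theta3)
    return np.array([r * np.cos(theta1),
                     r * np.sin(theta1),
                     L2 * np.sin(theta2) + L3 * np.sin(theta2 + theta3)])

def inverse(x, y, z):
    """Joint angles reaching foot position (x, y, z); knee-down branch."""
    theta1 = np.arctan2(y, x)
    r = np.hypot(x, y) - L1                       # planar reach after the coxa
    c3 = (r * r + z * z - L2**2 - L3**2) / (2 * L2 * L3)  # law of cosines
    theta3 = -np.arccos(np.clip(c3, -1.0, 1.0))
    theta2 = np.arctan2(z, r) - np.arctan2(L3 * np.sin(theta3),
                                           L2 + L3 * np.cos(theta3))
    return np.array([theta1, theta2, theta3])

# Round trip: forward kinematics then inverse kinematics recovers the angles.
q = np.array([0.3, 0.4, -0.8])                   # sample joint angles (rad)
print(np.allclose(inverse(*forward(*q)), q))
```

Applying the inverse kinematics to a sequence of desired foot positions is what lets each gait (tripod, wave, ripple) be simulated as joint-angle trajectories.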
Solid waste is a major issue in today's world and can be a contributing factor to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is utilized to determine the level of discarded solid waste and reclaimed solid waste. SSA is a recent optimization technique for finding the optimal solution of a mathematical relationship, based on leaders and followers. It takes a lot of random solutions, as well as their outward or inward fluctuations, t
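The leader-follower mechanics of SSA can be sketched as follows. The cost function here is a simple sphere function standing in for the study's waste-management cost model, and the swarm size, bounds and iteration count are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def ssa(obj, lb, ub, n_salps=30, dim=2, iters=200):
    """Minimal salp swarm algorithm minimising obj over [lb, ub]^dim."""
    X = rng.uniform(lb, ub, (n_salps, dim))
    food = X[np.argmin([obj(x) for x in X])].copy()   # best solution so far
    for t in range(1, iters + 1):
        c1 = 2 * np.exp(-(4 * t / iters) ** 2)        # exploration -> exploitation
        for i in range(n_salps):
            if i == 0:                                # leader moves around the food
                c2 = rng.uniform(0, 1, dim)
                c3 = rng.uniform(0, 1, dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                                     # followers chain behind
                X[i] = (X[i] + X[i - 1]) / 2
            X[i] = np.clip(X[i], lb, ub)
            if obj(X[i]) < obj(food):
                food = X[i].copy()
    return food, obj(food)

# Hypothetical stand-in for the waste-management cost model.
cost = lambda x: float(np.sum(x ** 2))
best_x, best_f = ssa(cost, lb=-10.0, ub=10.0)
print(best_x, best_f)
```

In the study, `cost` would instead encode the recycling and management cost as a function of the discarded and reclaimed waste levels.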
The nurse scheduling problem is a combinatorial optimization problem and one of the NP-hard problems for which an optimal solution is difficult to find. In this paper, we propose a hybrid simulated annealing algorithm to solve the nurse scheduling problem, developed from the simulated annealing algorithm and the genetic algorithm. The proposed algorithm (hybrid simulated annealing algorithm, GS-h) is the best method among the others used in this paper because it achieved the minimum average total cost and the maximum number of solved, best and optimal problems. The ratios of optimal solutions are 77% for the proposed algorithm (GS-h), 28.75% for Si
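A plain (non-hybrid) simulated annealing core for a toy nurse scheduling instance can be sketched as follows; the instance size, penalty weights and cooling schedule are illustrative choices, and the paper's GS-h additionally hybridises this loop with genetic operators:

```python
import math
import random

random.seed(42)
N_NURSES, DAYS, SHIFTS = 4, 7, 3       # shifts: 0=morning, 1=evening, 2=night

def cost(sched):
    """Soft-constraint penalty: double shifts, night-then-morning, imbalance."""
    load = [0] * N_NURSES
    pen = 0
    for d in range(DAYS):
        for s in range(SHIFTS):
            load[sched[d][s]] += 1
        pen += 5 * (SHIFTS - len(set(sched[d])))   # same nurse twice in a day
        if d > 0 and sched[d - 1][2] == sched[d][0]:
            pen += 10                   # night shift followed by a morning
    pen += max(load) - min(load)        # spread the workload evenly
    return pen

def neighbour(sched):
    """Reassign one randomly chosen shift to a random nurse."""
    new = [row[:] for row in sched]
    d, s = random.randrange(DAYS), random.randrange(SHIFTS)
    new[d][s] = random.randrange(N_NURSES)
    return new

sched = [[random.randrange(N_NURSES) for _ in range(SHIFTS)] for _ in range(DAYS)]
T, alpha = 10.0, 0.995                 # initial temperature, cooling rate
cur, cur_c = sched, cost(sched)
best, best_c = cur, cur_c
for _ in range(5000):                  # annealing loop
    cand = neighbour(cur)
    cand_c = cost(cand)
    # accept improvements always, worse moves with Boltzmann probability
    if cand_c <= cur_c or random.random() < math.exp((cur_c - cand_c) / T):
        cur, cur_c = cand, cand_c
        if cur_c < best_c:
            best, best_c = cur, cur_c
    T *= alpha
print("best penalty:", best_c)
```

A genetic-algorithm hybrid in the spirit of GS-h would maintain a population of such schedules and apply crossover and mutation between annealing steps.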
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by the maximum a posteriori (MAP) and maximum entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One of the advantages of the described algorithm is that it is less sensitive to noise than the techniques of Marr and Geuen, which are considered to be the best edge detection algorithms in terms of matching human visual contour perception.