The dynamic behavior of laced reinforced concrete (LRC) T-beams can provide high energy-absorption capacity without significantly affecting cost, through a combination of high strength and ductile response. In this paper, LRC T-beams, which contain inclined continuous reinforcement on each side of the beam, were investigated for their ability to sustain the large deformations expected under blast loading. The beams were tested under four-point loading to create pure bending zones and obtain the ultimate flexural capacities. Transverse reinforcement using lacing reinforcement was compared with conventional vertical stirrups in terms of the deformation, strain, and toughness of the tested beams. The inclination angles of the lacing reinforcement with respect to the longitudinal reinforcement were 45° and 60°. The lacing reinforcement was efficient and participated actively in resisting bending moments and shear forces at the same time. For the same lacing bar diameter, the 60° inclination angle produced more ductility before failure than the 45° inclination angle. Moreover, the lacing bar diameter was more effective in improving the load-carrying capacity at the 45° inclination angle. A finite element (FE) model was developed, validated against the experimental results based on the measured deformations and strains, and used to conduct a parametric study. The investigated parameters included the arrangement of the applied loads, laced rebar diameter, inclination angle, tension reinforcement ratio, and concrete strength.
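The four-point configuration described above produces a constant-moment (pure bending) region between the two load points. A minimal sketch of the resulting moment diagram, with hypothetical load and span values, is:

```python
def four_point_moment(p_total, span, shear_span, x):
    """Bending moment at position x for symmetric four-point loading.

    Two equal loads p_total/2 act at distance shear_span from each support;
    the region between the loads carries a constant (pure) moment of
    (p_total/2) * shear_span, which is why this setup isolates flexure.
    """
    r = p_total / 2.0            # each support reaction
    a, l = shear_span, span
    if x <= a:                   # left shear span: moment rises linearly
        return r * x
    if x >= l - a:               # right shear span: moment falls linearly
        return r * (l - x)
    return r * a                 # pure-bending zone: constant moment
```

For example, a 100 kN total load on a 3 m span with 1 m shear spans gives a constant 50 kN·m moment between the load points.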
In this paper, the transfer function model in time series analysis was estimated using different methods: a parametric approach, represented by the conditional likelihood function method, and two nonparametric approaches, the local linear regression method and the cubic smoothing spline method. This research aims to compare these estimators for the nonlinear transfer function model by means of simulation, studying two models as the output variable and one model as the input variable, in addition t
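Of the estimators named above, local linear regression is the simplest to sketch. The Gaussian kernel and bandwidth below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear regression estimate of E[y | x = x0].

    Fits a weighted straight line around x0, with Gaussian kernel
    weights of bandwidth h; the fitted intercept is the estimate.
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local design matrix
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                                  # value at x0
```

A useful property (and a quick sanity check) is that local linear regression reproduces exactly linear data regardless of the bandwidth.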
The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for the suppression of noise in images that fuses the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied by usin
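One plausible reading of the fusion scheme above, Wiener filtering on the approximation and soft thresholding on the details, can be sketched as follows. A one-level Haar stationary transform is hand-rolled so the sketch stays self-contained; the paper's wavelet, decomposition level, and threshold rule are not specified here, so the universal threshold is an assumption:

```python
import numpy as np
from scipy.signal import wiener

def haar_swt2(img):
    """One-level undecimated (stationary) 2-D Haar-style transform."""
    down = np.roll(img, -1, axis=0)
    lo_r, hi_r = (img + down) / 2.0, (img - down) / 2.0
    right_lo, right_hi = np.roll(lo_r, -1, axis=1), np.roll(hi_r, -1, axis=1)
    ll, lh = (lo_r + right_lo) / 2.0, (lo_r - right_lo) / 2.0
    hl, hh = (hi_r + right_hi) / 2.0, (hi_r - right_hi) / 2.0
    return ll, lh, hl, hh

def haar_iswt2(ll, lh, hl, hh):
    # The averaging/differencing pairs above invert by simple addition.
    return (ll + lh) + (hl + hh)

def fused_denoise(noisy):
    ll, lh, hl, hh = haar_swt2(noisy)
    ll = wiener(ll, mysize=3)                        # adaptive Wiener, approximation only
    sigma = np.median(np.abs(hh)) / 0.6745           # MAD noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(noisy.size))  # universal threshold (assumed)
    soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)
    return haar_iswt2(ll, soft(lh), soft(hl), soft(hh))
```

The two branches are combined simply by inverse-transforming the filtered approximation together with the thresholded details.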
Forest fires continue to increase during the dry season, and they are difficult to stop. High temperatures in the dry season raise the drought index, which could potentially lead to forest fires at any time, so the government must conduct surveillance throughout the dry season. Continuous surveillance without focusing on particular times becomes ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch-Byram Drought Index (KBDI), the drought factor formulation calculates only today's drought, based on current weather conditions and yesterday's drought index. However, to find out the drought factors a day ahead, the data
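The daily KBDI update described above can be sketched in its standard U.S.-units form. The constants below follow the original Keetch-Byram formulation; the rainfall interception allowance is assumed to be handled by the caller, so this is a simplified sketch rather than a full implementation:

```python
import math

def kbdi_drought_factor(q, t_max_f, annual_rain_in):
    """Daily KBDI drought factor (standard U.S.-units constants).

    q              -- yesterday's moisture deficiency, 0..800 (hundredths of an inch)
    t_max_f        -- today's maximum temperature, deg F
    annual_rain_in -- mean annual rainfall, inches
    """
    num = (800.0 - q) * (0.968 * math.exp(0.0486 * t_max_f) - 8.30) * 1e-3
    den = 1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in)
    return max(num / den, 0.0)       # no negative drying on cold days

def kbdi_update(q_yesterday, t_max_f, net_rain_in, annual_rain_in):
    # Net rainfall reduces the deficiency (index counts hundredths of an inch),
    # then today's drought factor adds the day's drying.
    q = max(q_yesterday - 100.0 * net_rain_in, 0.0)
    return min(q + kbdi_drought_factor(q, t_max_f, annual_rain_in), 800.0)
```

A hot, dry day raises the index; significant rainfall lowers it, with the result clamped to the 0-800 scale.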
Steganography is a technique of concealing secret data within other everyday files of the same or different types. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. In this work, a video steganography model is proposed in which a model is trained to hide a video (or images) within another video using convolutional neural networks (CNNs). By using a CNN in this approach, two main goals of any steganographic method can be achieved: increased security (difficulty of detection and breaking by steganalysis programs), which was achieved in this work because the weights and architecture are randomized. Thus,
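For contrast with the CNN approach above (which is not reproduced here), the classical least-significant-bit (LSB) image-in-image scheme, a different and much weaker technique, can be sketched in a few lines: the cover frame's low bits are replaced with the secret frame's high bits.

```python
import numpy as np

def lsb_hide(cover, secret):
    """Hide the secret image's 4 most-significant bits in the cover's 4 LSBs."""
    return (cover & 0xF0) | (secret >> 4)

def lsb_reveal(stego):
    """Recover the secret's top 4 bits (lower 4 bits are lost)."""
    return (stego & 0x0F) << 4
```

Because only the low nibble of each cover pixel changes, the stego frame differs from the cover by at most 15 gray levels per pixel, but a steganalysis tool can detect the pattern easily, which is the weakness the CNN approach aims to avoid.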
The use of a neural network as a type of associative memory is introduced in this paper through the problem of mobile position estimation, where a mobile station estimates its location from the signal strengths reaching it from several surrounding base stations; the neural network can be implemented inside the mobile. The traditional methods of time of arrival (TOA) and received signal strength (RSS) are used and compared with two analytical methods, the optimal positioning method and the average positioning method. The data used for training are ideal, since they can be obtained from the geometry of the CDMA cell topology. The tests of the two methods, TOA and RSS, cover many cases along a nonlinear path that the MS can move through tha
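The TOA baseline mentioned above is commonly solved by linearized trilateration: subtracting one station's range equation from the others cancels the quadratic terms, leaving a linear least-squares problem. The station layout in the example is an illustrative assumption, not the paper's cell geometry:

```python
import numpy as np

def toa_position(stations, ranges):
    """Least-squares position from TOA range estimates to known stations.

    For station i: |x - s_i|^2 = r_i^2.  Subtracting the first station's
    equation gives 2*(s_i - s_0) . x = r_0^2 - r_i^2 + |s_i|^2 - |s_0|^2,
    a linear system solved for the unknown position x.
    """
    s0, r0 = stations[0], ranges[0]
    A = 2.0 * (stations[1:] - s0)
    b = (r0 ** 2 - ranges[1:] ** 2
         + np.sum(stations[1:] ** 2, axis=1) - np.sum(s0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With noise-free ranges the solution is exact; with noisy RSS-derived ranges the same least-squares machinery yields the best linear fit.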
Abstract
Due to the continuing demand for larger bandwidth, optical transport is becoming common in the access network. Using optical fiber technologies, the communications infrastructure becomes powerful, providing very high speeds for transferring large volumes of data. Existing telecommunications infrastructure currently makes wide use of the Passive Optical Network (PON), which applies Wavelength Division Multiplexing (WDM) and is expected to play an important role in the future Internet, supporting a large diversity of services and next-generation networks. This paper presents a design of a WDM-PON network, with simulation and analysis of the transmission parameters in the Optisystem 7.0 environment for bidirectional traffic. The sim
Abstract
The black paint laser peening (bPLP) technique is currently applied to many engineering materials, especially aluminum alloys, due to the high improvement in fatigue life and strength. Constant- and variable-amplitude bending fatigue tests were performed at room temperature and a stress ratio R = -1. The results of the present work show the significance of the surface work hardening, which generated high negative (compressive) residual stresses in the bPLP specimens. The fatigue life improvement factor (FLIF) for bPLP constant-amplitude fatigue behavior ranged from 2.543 to 3.3 relative to the untreated condition, and the increase in fatigue strength at 10⁷ cycles was 21%. The bPLP cumulative fatigue life behav
The Gaussian orthogonal ensemble (GOE) version of random matrix theory (RMT) has been used to study the level density following proton interactions with ⁴⁴Ca, ⁴⁸Ti, and ⁵⁶Fe.
A promising analysis method has been implemented based on the available resonance-spacing data, where the widths are associated with the Porter-Thomas distribution. The calculated level density for the compound nuclei ⁴⁵Sc, ⁴⁹V, and ⁵⁷Co shows a parity and spin dependence; for ⁴⁵Sc, a discrepancy in the level density was distinguished in this analysis, probably due to spin misassignment. The present results show acceptable agreement with the combinatorial method of level density.
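The two RMT ingredients used above can be illustrated numerically: nearest-neighbour spacings of 2x2 GOE matrices follow the Wigner surmise, and Porter-Thomas reduced widths are squared Gaussians (chi-squared with one degree of freedom). Sample sizes and the seed below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def goe_spacings(n_trials=20000):
    """Eigenvalue spacings of 2x2 GOE matrices, normalized to unit mean.

    For 2x2 GOE the spacing density is the Wigner surmise
    P(s) = (pi*s/2) * exp(-pi*s**2/4).
    """
    a = rng.standard_normal(n_trials)              # diagonal entries, variance 1
    b = rng.standard_normal(n_trials)
    c = rng.standard_normal(n_trials) / np.sqrt(2)  # off-diagonal, variance 1/2
    s = np.sqrt((a - b) ** 2 + 4 * c ** 2)         # gap of [[a, c], [c, b]]
    return s / s.mean()

def porter_thomas_widths(n=20000):
    """Reduced widths: squared standard normals (chi-squared, 1 dof)."""
    return rng.standard_normal(n) ** 2
```

The normalized spacings should reproduce the Wigner-surmise variance 4/pi - 1 ≈ 0.273, and the widths should average to one, matching the distributions assumed in the resonance analysis.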
Cloud storage provides scalable and low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. In order to protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication scheme to maximize
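The basic (unencrypted) deduplication idea referred to above can be sketched as a content-addressed chunk store: identical chunks, within or across users' files, are stored once and referenced by their hash. Chunk size and hash choice are illustrative, and this sketch omits the encryption and compressive-sensing aspects of the proposal:

```python
import hashlib

class DedupStore:
    """Content-addressed chunk store: each distinct chunk is kept exactly once."""

    def __init__(self):
        self.chunks = {}                 # sha256 hex digest -> chunk bytes

    def put(self, data, chunk_size=4096):
        """Split data into fixed-size chunks; store new chunks, return references."""
        refs = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            key = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(key, chunk)   # duplicate chunks cost nothing
            refs.append(key)
        return refs

    def get(self, refs):
        """Reassemble a file from its chunk references."""
        return b"".join(self.chunks[k] for k in refs)
```

On encrypted data this breaks down: semantically secure encryption maps identical plaintext chunks to different ciphertexts, so their hashes no longer collide, which is exactly the challenge the paper addresses.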