Laced reinforced concrete (LRC) T-beams can provide high energy-absorption capacity without significantly increasing cost, owing to their combination of high strength and ductile response. In this paper, LRC T-beams, in which inclined continuous lacing reinforcement runs along each side of the beam, were investigated for their ability to sustain the large deformations expected in blast-resistant design. The beams were tested under four-point loading to create a pure bending zone and obtain the ultimate flexural capacities. Transverse reinforcement using lacing and transverse reinforcement using conventional vertical stirrups were compared in terms of the deformation, strain, and toughness of the tested beams. The lacing reinforcement was inclined at 45° and 60° with respect to the longitudinal reinforcement. The lacing reinforcement was efficient and participated actively in resisting bending moments and shear forces simultaneously. For the same lacing bar diameter, the 60° inclination angle produced more ductility before failure than the 45° inclination angle. Moreover, the lacing bar diameter was more effective in improving the load-carrying capacity at the 45° inclination angle. A finite element (FE) model was developed, validated against the measured deformations and strains, and used to conduct a parametric study. The investigated parameters included the arrangement of the applied loads, the laced rebar diameter, the inclination angle, the tension reinforcement ratio, and the concrete strength.
Conference paper: Audio Compression Using Transform Coding with LZW and Double Shift Coding, Zainab J. Ahmed & Loay E. George, in New Trends in Information and Communications Technology Applications (Communications in Computer and Information Science, volume 1511), first online 11 January 2022. Abstract: The need for audio compression is still a vital issue because of its significance in reducing the data size of one of the most commonly exchanged digital media between distant parties. In this paper, the efficiencies of two audio compression modules were investigated; the first module is based on the discrete cosine transform and the second module is based on the discrete wavelet transform…
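As a rough illustration of the transform-coding idea behind such modules (not the paper's actual pipeline), the sketch below applies an orthonormal DCT-II to a signal block, discards the high-frequency coefficients, and inverts the transform; the function names and the `keep` parameter are illustrative assumptions:

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis: row k holds cos(pi*(2t+1)*k/(2n)), scaled.
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0] /= np.sqrt(2)
    return m * np.sqrt(2.0 / n)

def compress_block(x, keep):
    # Transform, zero all but the `keep` lowest-frequency coefficients,
    # then invert (the inverse of an orthonormal matrix is its transpose).
    m = dct_matrix(len(x))
    c = m @ x
    c[keep:] = 0.0
    return m.T @ c

x = np.exp(-((np.arange(64) - 32) / 10.0) ** 2)   # smooth test signal
y = compress_block(x, keep=8)
err = np.sqrt(np.mean((x - y) ** 2))              # small: energy compacts into few coefficients
```

Keeping only 8 of 64 coefficients reconstructs a smooth signal with small error, which is the energy-compaction property that makes transform coding effective before an entropy stage such as LZW.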
This research uses a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated with the RMSE and NCC metrics, show that the spline method is the most accurate compared with the other statistical methods.
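For reference, the two quality metrics named above can be computed as follows (a minimal numpy sketch; the exact NCC normalization used in the research may differ from this common mean-subtracted definition):

```python
import numpy as np

def rmse(a, b):
    # Root-mean-square error between two images (lower is better).
    a = a.astype(float)
    b = b.astype(float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def ncc(a, b):
    # Normalized cross-correlation (values near 1 mean high similarity).
    a = a.astype(float).ravel()
    b = b.astype(float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

NCC in this form is invariant to brightness and contrast shifts, which is why it complements the purely intensity-based RMSE when comparing an enhanced image to a reference.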
The purpose of this research is to test the ability of the True Strength Index (TSI) to time and manage trading in the financial market, selecting the best stocks and achieving a higher return than a simple buy-and-hold strategy. To achieve the research objectives, it relied on the main hypothesis that by using the True Strength Index to manage buying and selling decisions, higher returns can be achieved than with the buy-and-hold strategy. The research population comprises all stocks listed on the Iraq Stock Exchange. Implementing the financial tests required selecting a sample from the research population that fulfills the test requirements according to a number of conditions; accordingly, (38) companies were selected…
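The True Strength Index itself is a standard double-smoothed momentum oscillator; a minimal sketch using the common (25, 13) smoothing spans follows (the exact parameters used in the research are not stated here, so these are assumptions):

```python
import numpy as np

def ema(x, span):
    # Exponential moving average with smoothing factor 2 / (span + 1).
    alpha = 2.0 / (span + 1)
    out = np.empty(len(x), dtype=float)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

def tsi(close, slow=25, fast=13):
    # TSI = 100 * double-smoothed momentum / double-smoothed |momentum|.
    m = np.diff(np.asarray(close, dtype=float))
    num = ema(ema(m, slow), fast)
    den = ema(ema(np.abs(m), slow), fast)
    return 100.0 * num / den
```

The index is bounded between -100 and +100; crossings of a signal line or the zero level are the usual triggers for the buy and sell decisions the hypothesis refers to.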
Due to the continuing demand for larger bandwidth, optical transport is becoming common in the access network. Using optical fiber technologies, the communications infrastructure becomes powerful, providing very high speeds for transferring large volumes of data. Existing telecommunications infrastructure currently makes wide use of the Passive Optical Network, which applies Wavelength Division Multiplexing (WDM) and is expected to play an important role in the future Internet, supporting a large diversity of services and next-generation networks. This paper presents a design of a WDM-PON network, with simulation and analysis of the transmission parameters in the Optisystem 7.0 environment for bidirectional traffic. The simulation shows the behavior of the optical…
Image quality plays a vital role in improving and assessing image compression performance. Image compression maps big image data to a new image with a smaller size suitable for storage and transmission. This paper aims to evaluate the implementation of hybrid techniques based on the tensor product mixed transform. Compression and quality metrics such as compression ratio (CR), rate distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are utilized to evaluate the hybrid techniques. A comparison between the techniques is then made according to these metrics to identify the best one. The main contribution is improving the hybrid techniques, which consist of the discrete wavelet…
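The named metrics have standard closed forms; a minimal numpy sketch follows (the paper's exact SC convention may differ, so treat the definitions as common-usage assumptions):

```python
import numpy as np

def compression_ratio(original_bytes, compressed_bytes):
    # CR = original size / compressed size; larger means stronger compression.
    return original_bytes / compressed_bytes

def psnr(original, reconstructed, peak=255.0):
    # Peak signal-to-noise ratio in dB; higher means less distortion.
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return float(10.0 * np.log10(peak ** 2 / mse))

def structural_content(original, reconstructed):
    # SC = sum(original^2) / sum(reconstructed^2); values near 1 indicate
    # similar overall energy between the two images.
    o = original.astype(float)
    r = reconstructed.astype(float)
    return float(np.sum(o ** 2) / np.sum(r ** 2))
```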
Fingerprints are commonly utilized as a key technique for personal recognition and in identification systems for personal security affairs. The most widely used fingerprint systems rely on the distribution of minutiae points for fingerprint matching and representation. These techniques become unsuccessful when partial fingerprint images are captured, or when the finger ridges suffer from many cuts, injuries, or skin sickness. This paper suggests a fingerprint recognition technique that utilizes local features for fingerprint representation and matching. The adopted local features are determined using Haar wavelet subbands. The system was tested experimentally using the FVC2004 databases, which consist of four datasets, each of which holds…
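A one-level 2-D Haar decomposition, which produces the four subbands (LL, LH, HL, HH) that such local features are typically drawn from, can be sketched as follows (an illustration only, not necessarily the paper's filter convention; assumes even image dimensions):

```python
import numpy as np

def haar2d(img):
    # One-level 2-D Haar decomposition into LL, LH, HL, HH subbands,
    # using averaging/differencing along columns, then along rows.
    x = img.astype(float)
    a = (x[:, 0::2] + x[:, 1::2]) / 2.0   # horizontal approximation
    d = (x[:, 0::2] - x[:, 1::2]) / 2.0   # horizontal detail
    ll = (a[0::2, :] + a[1::2, :]) / 2.0  # low-low: smoothed image
    lh = (a[0::2, :] - a[1::2, :]) / 2.0  # low-high: horizontal edges
    hl = (d[0::2, :] + d[1::2, :]) / 2.0  # high-low: vertical edges
    hh = (d[0::2, :] - d[1::2, :]) / 2.0  # high-high: diagonal detail
    return ll, lh, hl, hh
```

Feature vectors are then commonly built from subband statistics (e.g. energies), which remain usable even when only a partial ridge pattern is available.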
In modern technology, ownership of electronic data is the key to securing its privacy and protecting its holder's identity from any tracing or interference. Therefore, a new identity management system called Digital Identity Management, implemented throughout recent years, acts as a holder of identity data to maintain the holder's privacy and prevent identity theft. An overwhelming number of users face two major problems: users who own data that third-party applications will handle, and users who have no ownership of their data at all. Maintaining these identities is a present-day challenge. This paper proposes a system that solves the problem using blockchain technology for Digital Identity Management systems. Blockchain is a powerful technique…
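To make the tamper-evidence property that motivates blockchain here concrete, the following is a minimal hash-chain sketch (an illustration only; a real blockchain adds signatures, consensus, and distribution, and the record fields shown are hypothetical):

```python
import hashlib
import json

GENESIS = "0" * 64

def make_block(identity_record, prev_hash):
    # A block binds an identity record to the previous block's hash, so
    # tampering with any earlier record invalidates every later hash.
    body = {"record": identity_record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"body": body, "hash": digest}

def verify_chain(chain):
    # Recompute each hash and check the prev-hash links back to genesis.
    prev = GENESIS
    for block in chain:
        if block["body"]["prev"] != prev:
            return False
        expected = hashlib.sha256(
            json.dumps(block["body"], sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
prev = GENESIS
for rec in ({"user": "alice", "attr": "email"}, {"user": "bob", "attr": "passport"}):
    block = make_block(rec, prev)
    chain.append(block)
    prev = block["hash"]
```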
The esterification reaction of ethyl alcohol and acetic acid catalyzed by the ion-exchange resin Amberlyst 15 was investigated. The experimental study was implemented in an isothermal batch reactor. Catalyst loading, initial molar ratio, mixing time, and temperature, being the most effective parameters, were extensively studied and discussed. A maximum final conversion of 75% was obtained at 70 °C, an acid to ethyl alcohol mole ratio of 1/2, and a 10 g catalyst loading. The kinetics of the reaction were correlated with the Langmuir–Hinshelwood model (LHM). The total rate constant and the adsorption equilibrium of water were calculated as functions of temperature. The activation energies were found to be 113876.9 and -49474.95 kJ per kmol of ac…
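Although the pre-exponential factors are not quoted above, an activation energy alone fixes the relative change of a rate constant with temperature through the Arrhenius law, since the pre-exponential factor cancels in a ratio. A small sketch using the reported forward value (units assumed to be kJ/kmol, with R = 8.314 kJ/(kmol·K); the temperature pair is illustrative):

```python
import math

R = 8.314  # gas constant, kJ/(kmol*K)

def arrhenius_ratio(Ea, T1, T2):
    # k(T2)/k(T1) = exp(-Ea/R * (1/T2 - 1/T1)); the pre-exponential
    # factor A cancels, so only the activation energy Ea is needed.
    return math.exp(-Ea / R * (1.0 / T2 - 1.0 / T1))

Ea_fwd = 113876.9  # reported forward activation energy, kJ/kmol
speedup = arrhenius_ratio(Ea_fwd, T1=323.15, T2=343.15)  # 50 C -> 70 C
```

With this activation energy a 20 °C rise roughly an order of magnitude increase in the forward rate constant, consistent with running the esterification at the upper tested temperature of 70 °C.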
Data mining has a most important role in healthcare for discovering hidden relationships in big datasets, especially in breast cancer diagnostics; breast cancer is one of the most common causes of death in the world. In this paper, two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. In the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while entropy gives an accuracy of 86.77%. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimum…
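The Gini index and entropy criteria compared above have simple closed forms; a minimal sketch of how a candidate split such as Age < 49.5 would be scored by impurity decrease (illustrative only, not the paper's implementation):

```python
import math
from collections import Counter

def gini(labels):
    # Gini impurity: 1 - sum(p_i^2); 0 for a pure node.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    # Shannon entropy in bits; also 0 for a pure node.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def split_gain(labels, left, right, impurity=gini):
    # Impurity decrease for splitting `labels` into `left` and `right`
    # (e.g. by thresholding a feature such as Age at 49.5).
    n = len(labels)
    return (impurity(labels)
            - (len(left) / n) * impurity(left)
            - (len(right) / n) * impurity(right))
```

A decision tree greedily picks the feature/threshold pair with the largest gain at each node, which is how Age < 49.5 can surface as the most effective split under both criteria.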
In digital images, protecting sensitive visual information against unauthorized access is considered a critical issue; robust encryption methods are the best solution for preserving such information. This paper introduces a model designed to enhance the performance of the Tiny Encryption Algorithm (TEA) in encrypting images. Two approaches are suggested for the image cipher process as a preprocessing step before applying TEA. This step aims to de-correlate and weaken adjacent pixel values in preparation for the encryption process. The first approach applies an Affine transformation for image encryption in two layers, utilizing a different key set for each layer. …
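For reference, the core TEA block cipher that the model builds on (64-bit block, 128-bit key, 32 rounds) can be written as follows; this is a standard textbook implementation, not the paper's modified image pipeline:

```python
def tea_encrypt(v, key, rounds=32):
    # v: 64-bit block as two 32-bit words; key: four 32-bit words.
    v0, v1 = v
    k0, k1, k2, k3 = key
    delta, s, mask = 0x9E3779B9, 0, 0xFFFFFFFF
    for _ in range(rounds):
        s = (s + delta) & mask
        v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & mask
        v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & mask
    return v0, v1

def tea_decrypt(v, key, rounds=32):
    # Runs the round function in reverse, unwinding the key schedule sum.
    v0, v1 = v
    k0, k1, k2, k3 = key
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    s = (delta * rounds) & mask
    for _ in range(rounds):
        v1 = (v1 - (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & mask
        v0 = (v0 - (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & mask
        s = (s - delta) & mask
    return v0, v1
```

Because TEA alone encrypts each 64-bit block independently, highly correlated neighboring pixels can leak structure, which is exactly what a de-correlating preprocessing layer such as the Affine transformation is meant to counteract.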