This research investigates the relationships of both ultrasonic pulse velocity (UPV) and rebound number (hammer test) with cube compressive strength, and also studies the effect of steel reinforcement on these relationships.
A study was carried out on 32 scale-model reinforced concrete elements, and a non-destructive testing campaign (mainly ultrasonic and rebound hammer tests) was conducted on the same elements. About 72 concrete cubes (15 × 15 × 15 cm) were taken from the concrete mixes to check compressive strength. The data were analyzed, including the possible correlations between the non-destructive testing (NDT) results and the compressive strength from destructive testing (DT); a statistical approach was used for this purpose. New relationships were obtained from the correlation results.
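The kind of strength-velocity correlation described above can be sketched as an ordinary least-squares fit. The (UPV, strength) pairs below are synthetic placeholders for illustration only, not the study's measurements or its published relationship.

```python
# Illustrative sketch: fitting a linear strength-velocity relationship by
# ordinary least squares. The data points are invented placeholders.
def fit_line(xs, ys):
    """Return intercept a and slope b minimising sum((y - (a + b*x))^2)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical calibration data: pulse velocity (km/s) vs cube strength (MPa).
upv      = [3.8, 4.0, 4.2, 4.4, 4.6]
strength = [22.0, 26.5, 30.0, 34.5, 38.0]
a, b = fit_line(upv, strength)
predicted = a + b * 4.3  # estimated strength at UPV = 4.3 km/s
```

In practice such a calibration would be refit per mix and corrected for the presence of reinforcement, which is exactly the effect the study examines.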
This paper presents the application of a framework for fast and efficient compressive sampling based on the concept of random sampling of a sparse audio signal. It provides four important features: (i) it is universal across a variety of sparse signals; (ii) the number of measurements required for exact reconstruction is nearly optimal, much less than the sampling frequency and below the Nyquist rate; (iii) it has very low complexity and fast computation; (iv) it is developed on a provable mathematical model from which we are able to quantify trade-offs among streaming capability, computation/memory requirements, and the quality of reconstruction of the audio signal. Compressed sensing (CS) is an attractive compression scheme due to its universality.
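The random sub-Nyquist sampling idea can be illustrated with a minimal toy case, not the paper's algorithm: a signal that is 1-sparse in the frequency domain is observed at a few random time instants, and the active frequency is recovered by correlating the measurements against each candidate sinusoid (a single step of matching pursuit).

```python
import cmath
import random

# Minimal compressed-sensing sketch on a 1-sparse signal (illustrative only).
random.seed(0)
N = 128          # nominal full-rate signal length
k_true = 17      # the one active frequency bin (the sparse support)
M = 16           # number of random measurements, far fewer than N

t = random.sample(range(N), M)                                # random instants
y = [cmath.exp(2j * cmath.pi * k_true * ti / N) for ti in t]  # measurements

# Correlate against every candidate frequency; the true one dominates.
scores = []
for k in range(N):
    atom = [cmath.exp(2j * cmath.pi * k * ti / N) for ti in t]
    scores.append(abs(sum(yi * ai.conjugate() for yi, ai in zip(y, atom))))
k_hat = max(range(N), key=lambda k: scores[k])
```

With more active frequencies one would iterate this greedy step (orthogonal matching pursuit) or solve an l1-minimisation problem, which is where the framework's complexity and measurement-count guarantees matter.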
Machine learning offers significant advantages for many problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. The workflow methodology is clarified and presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates were vague, and the methods they present are obsolete and do not rigorously address the actual permeability computation.
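As a hedged illustration of permeability prediction from log features, the sketch below uses a tiny k-nearest-neighbour regressor on two synthetic inputs (porosity and gamma ray). The numbers are invented placeholders, not Bazirgan field data, and kNN is only one of many models such a workflow might use.

```python
import math

# Toy training table: (porosity fraction, gamma ray API, permeability mD).
# All values are synthetic placeholders for illustration.
train = [
    (0.05, 90.0, 0.1),
    (0.10, 75.0, 1.0),
    (0.15, 60.0, 10.0),
    (0.20, 50.0, 80.0),
    (0.25, 40.0, 300.0),
]

def predict_perm(phi, gr, k=2):
    """Average the permeability of the k nearest training samples.

    Porosity is scaled by 100 so both features contribute comparably
    to the distance -- a crude stand-in for proper feature scaling.
    """
    scored = sorted(train, key=lambda r: math.hypot((r[0] - phi) * 100,
                                                    r[1] - gr))
    return sum(r[2] for r in scored[:k]) / k

perm = predict_perm(0.18, 55.0)  # falls between the 0.15 and 0.20 samples
```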
Secure storage of confidential medical information is critical for healthcare organizations seeking to protect patients' privacy and comply with regulatory requirements. This paper presents a new scheme for secure storage of medical data using Chaskey cryptography and blockchain technology. The system uses Chaskey encryption to ensure the integrity and confidentiality of medical data, and blockchain technology to provide a scalable, decentralized storage solution. The system also uses Bflow segmentation and vertical segmentation technologies to enhance scalability and manage the stored data. In addition, the system uses smart contracts to enforce access control policies and other security measures.
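The tamper-evidence that a blockchain gives stored records can be sketched with a plain hash chain. This is illustrative only: SHA-256 from Python's standard library stands in for the paper's Chaskey-based primitive, and the "chain" is an in-memory list rather than a distributed ledger.

```python
import hashlib
import json

# Hash-chained record storage: each block commits to its payload and to the
# previous block's hash, so any change to stored data breaks verification.
def block_hash(body):
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_record(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "record": record, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain):
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_record(chain, {"patient": "anon-001", "note": "encrypted-blob-1"})
append_record(chain, {"patient": "anon-002", "note": "encrypted-blob-2"})
ok_before = verify(chain)                    # chain is consistent
chain[0]["record"]["note"] = "tampered"      # simulate an attacker's edit
ok_after = verify(chain)                     # tampering is now detectable
```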
... Show MoreIn this paper, we propose a method using continuous wavelets to study the multivariate fractional Brownian motion through the deviations of the transformed random process to find an efficient estimate of Hurst exponent using eigenvalue regression of the covariance matrix. The results of simulations experiments shown that the performance of the proposed estimator was efficient in bias but the variance get increase as signal change from short to long memory the MASE increase relatively. The estimation process was made by calculating the eigenvalues for the variance-covariance matrix of Meyer’s continuous wavelet details coefficients.
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images, and it is a challenging task to abstract the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries, and shapes. In this paper, an effective CBIC technique is presented, which uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
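The color-moment feature vector described above can be sketched directly: the five moments are computed per channel and packed into one flat array. The pixel lists below are a toy stand-in for real image channels.

```python
import math

# Per-channel statistical colour moments: mean, standard deviation,
# skewness, kurtosis, variance (the five moments listed in the abstract).
def moments(channel):
    n = len(channel)
    mean = sum(channel) / n
    var = sum((p - mean) ** 2 for p in channel) / n
    std = math.sqrt(var)
    skew = sum((p - mean) ** 3 for p in channel) / (n * std ** 3) if std else 0.0
    kurt = sum((p - mean) ** 4 for p in channel) / (n * var ** 2) if var else 0.0
    return [mean, std, skew, kurt, var]

# Toy RGB "image": three channels of pixel intensities.
image = {
    "R": [10, 12, 11, 13, 240, 12],
    "G": [100, 102, 98, 101, 99, 100],
    "B": [5, 200, 6, 198, 7, 199],
}
features = []
for ch in ("R", "G", "B"):
    features.extend(moments(image[ch]))  # 5 moments x 3 channels = 15 values
```

In the full pipeline, a vector like `features` (together with texture descriptors) is what the GA operates on when grouping images into clusters.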
Abstract
The objective of image fusion is to merge multiple source images in such a way that the final representation contains a higher amount of useful information than any single input. In this paper, a weighted-average fusion method is proposed. It depends on weights that are extracted from the source images using the contourlet transform. The extraction is done by setting the approximation coefficients of the transform to zero and then taking the inverse contourlet transform to obtain the details of the images to be fused. The performance of the proposed algorithm has been verified on several grayscale and color test images and compared with some existing methods.
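The weighted-average idea can be sketched with a crude stand-in for the contourlet details: here the "detail" layer is simply the residual after subtracting a 3 × 3 local mean, and each pixel's weight is proportional to its local detail energy, so the more detailed source dominates the fused result. This is an illustration of the weighting principle, not the paper's transform.

```python
# Weighted-average fusion with detail-energy weights (illustrative sketch;
# a local-mean residual stands in for contourlet detail coefficients).
def details(img):
    """Residual after subtracting a 3x3 local mean -- a crude detail layer."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            nb = [img[a][b]
                  for a in range(max(0, i - 1), min(h, i + 2))
                  for b in range(max(0, j - 1), min(w, j + 2))]
            out[i][j] = img[i][j] - sum(nb) / len(nb)
    return out

def fuse(img1, img2, eps=1e-9):
    d1, d2 = details(img1), details(img2)
    h, w = len(img1), len(img1[0])
    fused = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            w1 = d1[i][j] ** 2 + eps   # detail-energy weights
            w2 = d2[i][j] ** 2 + eps
            fused[i][j] = (w1 * img1[i][j] + w2 * img2[i][j]) / (w1 + w2)
    return fused

# Toy sources: img_a contains a sharp edge, img_b is flat.
img_a = [[0, 0, 255, 255] for _ in range(4)]
img_b = [[128] * 4 for _ in range(4)]
fused = fuse(img_a, img_b)
```

Near the edge, `img_a` carries all the detail energy, so the fused image follows it there; in flat regions the two sources are averaged.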
The conventional FCM algorithm does not fully utilize the spatial information in the image. In this research, we use an FCM algorithm that incorporates spatial information into the membership function for clustering. The spatial function is the summation of the membership functions in the neighborhood of each pixel under consideration. The advantages of the method are that it is less sensitive to noise than other techniques, and it yields more homogeneous regions than other methods. This technique is a powerful method for noisy image segmentation.
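A minimal sketch of this spatial FCM variant is given below, under stated assumptions: a 1-D "image" of intensities keeps the example short, the spatial function is the neighbourhood sum of memberships exactly as described, and the exponents `p`, `q` and window radius `R` are illustrative choices, not values from the source.

```python
# FCM with a spatial membership term (illustrative sketch): after each
# standard membership update, each pixel's membership is re-weighted by the
# sum of memberships in its neighbourhood, suppressing isolated noisy pixels.
pixels = [10.0] * 20 + [200.0] * 20   # two homogeneous regions
pixels[5] = 120.0                     # a noisy outlier in the dark region
C, m, p, q, R = 2, 2.0, 1, 1, 2       # clusters, fuzzifier, exponents, radius

centers = [0.0, 255.0]
n = len(pixels)
u = [[0.0] * n for _ in range(C)]
for _ in range(30):
    # Standard FCM membership update from distances to the centres.
    for j, x in enumerate(pixels):
        d = [max(abs(x - c), 1e-9) for c in centers]
        for i in range(C):
            u[i][j] = 1.0 / sum((d[i] / dk) ** (2 / (m - 1)) for dk in d)
    # Spatial function: summation of memberships over the neighbourhood.
    h = [[sum(u[i][k] for k in range(max(0, j - R), min(n, j + R + 1)))
          for j in range(n)] for i in range(C)]
    for j in range(n):
        z = sum(u[i][j] ** p * h[i][j] ** q for i in range(C))
        for i in range(C):
            u[i][j] = u[i][j] ** p * h[i][j] ** q / z
    # Centre update with fuzzified memberships.
    for i in range(C):
        w = [u[i][j] ** m for j in range(n)]
        centers[i] = sum(wj * x for wj, x in zip(w, pixels)) / sum(w)

labels = [max(range(C), key=lambda i: u[i][j]) for j in range(n)]
```

The noisy pixel's neighbours all belong to the dark region, so the spatial term pulls its membership back toward that cluster, which is precisely the noise-robustness claimed above.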