Background/Objectives: This research aims to develop a modified image representation framework for Content-Based Image Retrieval (CBIR) based on grayscale input images, Zernike Moments (ZMs), Local Binary Patterns (LBP), the Y color space, the Slantlet Transform (SLT), and the Discrete Wavelet Transform (DWT). Methods/Statistical analysis: The study evaluated three standard datasets: WANG V1.0, WANG V2.0, and Caltech 101. Caltech 101 contains images of objects belonging to 101 classes, with approximately 40-800 images per category. The proposed framework describes and operationalizes the CBIR system through an automated feature extraction scheme built on a CNN architecture. Findings: The results obtained with the investigated CBIR system, set against the benchmarked results, clearly indicate that the proposed technique achieved the best performance, with an overall accuracy of 88.29%, compared with the other datasets adopted in the experiments. These results show that the proposed method was effective on all of the datasets. Improvements/Applications: The study found that the multiple image representation was redundant for extraction accuracy, and that automatically extracted features are capable of generating accurate and reliable outcomes.
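As a hedged illustration of one of the hand-crafted descriptors named above, the sketch below computes a basic 8-neighbour LBP histogram for a grayscale image with scikit-image; the radius, number of points, and histogram size are illustrative assumptions and not parameters taken from the paper.

    # Minimal LBP feature sketch (assumed parameters; not the paper's exact pipeline).
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray_image, n_points=8, radius=1):
        """Return a normalized uniform-LBP histogram for a 2-D grayscale array."""
        codes = local_binary_pattern(gray_image, n_points, radius, method="uniform")
        n_bins = n_points + 2  # uniform patterns plus one "non-uniform" bin
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        return hist

    # Example: descriptor for a random 64x64 grayscale image.
    features = lbp_histogram(np.random.randint(0, 256, (64, 64)).astype(np.uint8))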
The Hartley transform generalizes to the fractional Hartley transform (FRHT), which has found various uses in image encryption. Unfortunately, the available literature on the fractional Hartley transform does not provide its inversion theorem, so the original function cannot be recovered directly, which restricts its applications. The intention of this paper is to propose an inversion theorem for the fractional Hartley transform to overcome this drawback. Moreover, some properties of the fractional Hartley transform are discussed in this paper.
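For reference, the classical Hartley transform on which the FRHT builds can be written with the cas kernel; the symmetric normalization below is one common convention and is an assumption, since the paper's own definition is not reproduced here.

    % Classical Hartley transform with the cas kernel (symmetric normalization assumed).
    H(u) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} f(x)\, \operatorname{cas}(ux)\, dx,
    \qquad \operatorname{cas}(\theta) = \cos\theta + \sin\theta .
    % With this normalization the transform is an involution, so applying it again recovers f:
    f(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} H(u)\, \operatorname{cas}(ux)\, du .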
Over the years, the prediction of penetration rate (ROP) has played a key role for drilling engineers due to its effect on the optimization of various parameters related to substantial cost savings. Many researchers have continually worked to optimize the penetration rate. A major issue with most published studies is that no simple model is currently available to guarantee ROP prediction.
The main objective of this study is to further improve ROP prediction using two predictive methods: multiple regression analysis (MRA) and artificial neural networks (ANNs). A field case study in SE Iraq was conducted to predict the ROP from a large number of parame
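As a hedged sketch of the simpler of the two methods, the snippet below fits a multiple linear regression mapping drilling parameters to ROP with scikit-learn; the feature set, array shapes, and values are hypothetical placeholders, not the field data used in the study.

    # Hypothetical multiple regression analysis (MRA) sketch for ROP prediction.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Placeholder drilling parameters, e.g. weight on bit, rotary speed, flow rate, mud weight.
    X = np.random.rand(200, 4)          # 200 hypothetical drilling records, 4 features
    y = np.random.rand(200) * 30.0      # hypothetical ROP values (m/hr)

    model = LinearRegression().fit(X, y)
    print("Coefficients:", model.coef_, "Intercept:", model.intercept_)
    print("Predicted ROP for a new record:", model.predict(X[:1]))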
Quadrupole moments (Q) and effective charges are calculated for the exotic nuclei 9C, 11C, 17C and 19C using shell model calculations. Excitations out of the major shell space are taken into account through a microscopic theory known as core-polarization effects. A simple harmonic oscillator potential is used to generate the single-particle matrix elements of 9,11,17,19C. The present calculations with core-polarization effects reproduce the experimental and theoretical data very well.
A genetic algorithm model coupled with an artificial neural network model was developed to find the optimal values of the upstream and downstream cutoff lengths, the floor length, and the length of downstream protection required for a hydraulic structure. These were obtained for a given maximum head difference, depth of impervious layer, and degree of anisotropy. The objective function to be minimized was the cost function, with relative cost coefficients for the different dimensions. The constraints enforced a factor of safety of 2 against uplift pressure failure and 3 against piping failure.
About 1200 different cases were modeled and analyzed using GeoStudio, with different values of the input variables. The soil wa
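The sketch below illustrates, under assumed names and penalty weights, how such a constrained cost minimization could be posed as a genetic-algorithm fitness function; the cost coefficients, safety-factor inputs, and penalty scheme are hypothetical stand-ins for the models described in the abstract.

    # Hypothetical GA fitness for the hydraulic-structure design problem (penalty method).
    import numpy as np

    COST_COEFF = np.array([1.0, 1.0, 0.8, 0.6])   # assumed relative cost coefficients

    def fitness(design, fos_uplift, fos_piping):
        """design = [upstream cutoff, downstream cutoff, floor length, protection length]."""
        cost = float(np.dot(COST_COEFF, design))
        penalty = 0.0
        if fos_uplift < 2.0:                       # required factor of safety against uplift
            penalty += 1e6 * (2.0 - fos_uplift)
        if fos_piping < 3.0:                       # required factor of safety against piping
            penalty += 1e6 * (3.0 - fos_piping)
        return cost + penalty                      # GA minimizes this value

    # Example evaluation with assumed dimensions (m) and safety factors from an external model.
    print(fitness(np.array([6.0, 4.0, 20.0, 10.0]), fos_uplift=2.3, fos_piping=3.4))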
Rainfall in Nigeria is highly dynamic and variable on temporal and spatial scales, a situation made more pronounced by climate change. In this study, the Standard Precipitation Index (SPI) and the Mann-Kendall test were employed to analyze rainfall trends and patterns in the Gombe metropolis between 1990 and 2020, and an ARIMA model was used to produce a ten (10) year forecast. Daily rainfall data covering 31 years, obtained from the Nigerian Meteorological Agency (NIMET), were used for the study and subjected to several analyses. The SPI showed that alternating wet and dry conditions have been experienced in the study area. The result obtained showed that there is an u
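As a hedged illustration of the trend test named above, the snippet below computes the Mann-Kendall S statistic and Kendall's tau for an annual rainfall series with SciPy; the sample data are placeholders, not the NIMET records used in the study.

    # Minimal Mann-Kendall-style trend check using Kendall's tau (illustrative data only).
    import numpy as np
    from scipy.stats import kendalltau

    years = np.arange(1990, 2021)                      # 31 hypothetical years
    rainfall = 900 + 5 * (years - 1990) + np.random.normal(0, 50, years.size)

    # S statistic: sum of signs of all pairwise differences taken in time order.
    s_stat = sum(np.sign(rainfall[j] - rainfall[i])
                 for i in range(len(rainfall)) for j in range(i + 1, len(rainfall)))
    tau, p_value = kendalltau(years, rainfall)         # tau > 0 with small p suggests an upward trend
    print(f"S = {s_stat:.0f}, tau = {tau:.3f}, p = {p_value:.3f}")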
The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. In order to decrease storage costs and make ECG signals suitable for transmission through common communication channels, the ECG data
volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals in which different transforms are used. First, the 1-D ECG data were segmented and aligned into a 2-D data array, then a 2-D mixed transform was implemented to compress the
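The snippet below sketches the general idea of arranging a 1-D ECG signal into a 2-D array of fixed-length segments and applying a 2-D transform, with a 2-D DCT used as a stand-in for the paper's mixed transform; the segment length, threshold, and transform choice are assumptions.

    # Illustrative 1-D -> 2-D ECG compression sketch (2-D DCT as a stand-in transform).
    import numpy as np
    from scipy.fft import dctn, idctn

    ecg = np.random.randn(1024)             # placeholder ECG samples
    seg_len = 128                            # assumed segment length (samples per row)
    matrix = ecg[: (ecg.size // seg_len) * seg_len].reshape(-1, seg_len)

    coeffs = dctn(matrix, norm="ortho")      # 2-D transform of the aligned segments
    coeffs[np.abs(coeffs) < 0.5] = 0.0       # crude thresholding to discard small coefficients

    reconstructed = idctn(coeffs, norm="ortho").ravel()
    kept = np.count_nonzero(coeffs) / coeffs.size
    print(f"Nonzero coefficients kept: {kept:.1%}")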
This paper is concerned with the concept of openness in the contemporary learning environment, ranging from its physical characteristics to its relation with learning efficiency and outcomes. Previous literature differs on the effect of openness on the engagement of learners with one another and with this kind of space. Engagement here means active participation, the ability to engage in dialogue, self-reflection, and the ability to explore and communicate with others and
within the learning space. The research problem was the lack of knowledge about the effect of openness on learner engagement with learning spaces. The two concepts were applied to three types of learning spaces in the Department of the Architectu
In this study, multi-objective optimization of an aluminum oxide nanofluid in a mixture of water and ethylene glycol (40:60) is carried out. In order to reduce the viscosity and increase the thermal conductivity of the nanofluid, the NSGA-II algorithm is used to vary the temperature and the volume fraction of nanoparticles. Neural network modeling of experimental data is used to obtain viscosity and thermal conductivity as functions of temperature and nanoparticle volume fraction. To evaluate the optimization objective functions, the neural network model is connected to the NSGA-II algorithm and is called at every evaluation of the fitness function. Finally, the Pareto front and the corresponding optimum points are provided and
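The sketch below shows how a surrogate model can be called inside a two-objective evaluation loop; a made-up analytic surrogate replaces the trained neural network, and a simple non-dominated filter stands in for the NSGA-II search, so variable ranges, trends, and objective signs are all illustrative assumptions.

    # Surrogate-in-the-loop multi-objective evaluation sketch; a plain non-dominated
    # filter stands in here for the NSGA-II search described in the abstract.
    import numpy as np

    def surrogate(temperature, volume_fraction):
        """Placeholder for the trained neural network: returns (viscosity, conductivity)."""
        visc = 1.0 / (1.0 + 0.01 * temperature) + 5.0 * volume_fraction   # made-up trends
        cond = 0.35 + 0.004 * temperature + 8.0 * volume_fraction
        return visc, cond

    # Sample candidate designs: temperature in C, nanoparticle volume fraction.
    rng = np.random.default_rng(0)
    designs = np.column_stack([rng.uniform(20, 60, 500), rng.uniform(0.0, 0.05, 500)])
    objs = np.array([[v, -c] for v, c in (surrogate(t, f) for t, f in designs)])  # minimize both

    # Keep non-dominated points (approximate Pareto front).
    pareto = [i for i, fi in enumerate(objs)
              if not any(np.all(fj <= fi) and np.any(fj < fi) for fj in objs)]
    print(f"{len(pareto)} non-dominated designs out of {len(designs)}")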
One of the important differences between multiwavelets and scalar wavelets is that each channel in the filter bank has a vector-valued input and a vector-valued output. A scalar-valued input signal must therefore be converted into a suitable vector-valued signal. This conversion is called preprocessing; it is a mapping carried out by a prefilter, and a postfilter does the opposite.
The most obvious way to get two input rows from a given signal is to repeat the signal, so that two identical rows go into the multifilter bank. This procedure is called "Repeated Row"; it introduces oversampling of the data by a factor of 2.
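A minimal sketch of the repeated-row prefilter idea, assuming a plain NumPy array as the scalar signal; the 2x2 multifilter itself is not shown, only the vectorization step and the factor-2 oversampling it implies.

    # Repeated-row preprocessing sketch: duplicate a scalar signal into two rows so it
    # can feed a 2-channel multiwavelet filter bank (oversampling by a factor of 2).
    import numpy as np

    def repeated_row(signal):
        """Map a length-N scalar signal to a 2 x N vector-valued signal by repetition."""
        signal = np.asarray(signal, dtype=float)
        return np.vstack([signal, signal])

    x = np.arange(8, dtype=float)
    v = repeated_row(x)
    print(v.shape)                     # (2, 8): twice as many samples as the input
    assert v.size == 2 * x.size        # the factor-2 oversampling introduced by repetition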
For data compression, where one is trying to find compact transform representations for a