This paper presents a parametric audio compression scheme intended for scalable audio coding applications; it is particularly well suited to operation at low rates, in the vicinity of 5 to 32 kbps. The model consists of two complementary components: Sines plus Noise (SN). The principal component of the system is an overlap-add analysis-by-synthesis sinusoidal model based on conjugate matching pursuits. Perceptual information about human hearing is explicitly incorporated into the model by psychoacoustically weighting the pursuit metric. Once analyzed, the SN parameters are efficiently quantized and coded. Our informal listening tests demonstrated that our coder performs competitively with the state-of-the-art Helix™ Producer Plus 9 from RealNetworks®, offering on average a 20 percent lower bitrate for the same audio quality. The audio coder gives a much wider range of scalability than previous sinusoidal coders as well as existing commercial audio coders. Moreover, it degrades gracefully from high fidelity down to reasonable quality at a very low bitrate of 5 kbps. The most obvious applications for the SN coder are scalable, high-fidelity audio coding and signal modification.
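The greedy atom-selection idea behind matching pursuit can be sketched as follows. This is a minimal, plain (unweighted) matching pursuit over a toy dictionary of sampled sinusoids; the paper's coder additionally uses conjugate pairs and a psychoacoustically weighted metric, which are not reproduced here. All names and the toy dictionary are illustrative assumptions.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iters=10):
    """Greedy matching pursuit: at each step pick the dictionary atom
    most correlated with the residual and subtract its projection.
    Atoms are assumed to be unit-norm rows of `dictionary`."""
    residual = signal.astype(float).copy()
    coeffs = np.zeros(len(dictionary))
    for _ in range(n_iters):
        corr = dictionary @ residual          # correlation with each atom
        k = np.argmax(np.abs(corr))           # best-matching atom
        coeffs[k] += corr[k]
        residual -= corr[k] * dictionary[k]   # remove its contribution
    return coeffs, residual

# Toy dictionary: unit-norm sampled cosines at a few integer frequencies.
N = 64
t = np.arange(N)
atoms = np.array([np.cos(2 * np.pi * f * t / N) for f in (2, 5, 9, 14)])
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)

# A signal built from two atoms; the pursuit should recover both weights.
sig = 3.0 * atoms[1] + 0.5 * atoms[3]
c, r = matching_pursuit(sig, atoms, n_iters=4)
```

Because the toy atoms are orthogonal, the pursuit recovers the exact weights in two steps; with the overcomplete, non-orthogonal dictionaries used in real sinusoidal coders, convergence is only asymptotic.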
Over the last few years, the literature has shown great interest in studying the feasibility of using memristive devices for computing. Memristive devices are important to the structure, dynamics, and functionality of artificial neural networks (ANNs) because the switching characteristics of their resistance resemble biological learning in synapses and neurons. A memristive architecture consists of a number of metastable switches (MSSs). Although the literature covers a variety of memristive applications for general-purpose computation, the effect of the low or high conductance of each MSS has remained unclear. This paper focuses on finding a potential criterion to calculate the conductance of each MSS rather t
The most significant task in oil exploration is determining the reservoir facies, which are based mostly on the primary features of rocks. Porosity, water saturation, and shale volume, as well as the sonic log and bulk density, are the input data used in the Interactive Petrophysics software to compute rock facies. These data are used to create 15 clusters and four groups of rock facies. Furthermore, accurate matching between core and well-log data is established by the neural network technique. In the current study, to evaluate the applicability of the cluster analysis approach, the rock facies derived from cluster analysis of 29 wells were used to redistribute the petrophysical properties for six units of Mishri
The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of aggregate planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably more difficult and slow as the number of resources, products and periods considered increases. A number of studies have been carried out to understand these impediments and to formulate algorithms to optimise the production planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm called the Genetic Algorithm, a swarm-intelligence method called the Gravitational Search Algorithm (GSA), the Bat Algorithm (BAT), T
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), characterized by a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ poses challenges due to discrepancies between theoretical calculation and practical implementation. This paper introduces two experiments. The first involves theoretical calculation of µ using several filters to generate the WCPs. The second uses a variable attenuator to generate the WCPs, with µ estimated from the photons detected by the BB
Optimizing Access Point (AP) deployment plays a great role in wireless applications, owing to the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a significant parameter and objective to be considered alongside AP placement, as is the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs of (coverage, AP deployment) weights, signal thresholds and received s
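The BPSO idea underlying such placement algorithms can be sketched in a few lines: each particle is a bit string (here, bit i = 1 meaning "place an AP at candidate site i"), and a sigmoid of the velocity gives the probability of each bit being set. This is a generic textbook BPSO with an invented toy fitness, not the WOAIP objective; the coverage map and cost weight below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(bits):
    """Hypothetical objective: reward APs at 'good' candidate sites,
    minus a per-AP deployment cost."""
    good = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # assumed coverage map
    return np.sum(bits & good) - 0.3 * np.sum(bits)

def bpso(n_bits=8, n_particles=12, n_iters=40, w=0.7, c1=1.5, c2=1.5):
    pos = rng.integers(0, 2, size=(n_particles, n_bits))
    vel = rng.uniform(-1, 1, size=(n_particles, n_bits))
    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_fit)].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # Sigmoid transfer: velocity sets the probability of a bit being 1.
        pos = (rng.random(vel.shape) < 1 / (1 + np.exp(-vel))).astype(int)
        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()
    return gbest, pbest_fit.max()

best, score = bpso()
```

In a real deployment problem the fitness would evaluate multi-floor signal coverage against the thresholds and weight pairs the study describes, which is far more expensive than this toy objective.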
In this work, an optical-fiber biomedical sensor for detecting the hemoglobin ratio in blood is presented. A surface plasmon resonance (SPR)-based coreless optical fiber was developed and implemented using single-mode and multi-mode optical fibers. The sensor is also used to evaluate the refractive indices and hemoglobin concentrations of blood samples, with a 40 nm coating (20 nm Au and 20 nm Ag) to increase the sensitivity. It is found in practice that as the sensed refractive index increases, the resonant wavelength increases due to the decrease in energy.
In combinatorial testing development, the construction of covering arrays is the key challenge because of the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods deal with the tuples that may be left over after the greedy strategy's redundancy elimination; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, using both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly. T
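The hill-climbing phase described above can be sketched on a toy pairwise covering-array instance: count the (column-pair, value-pair) tuples a test suite leaves uncovered, then repeatedly mutate one cell and keep the change only when it does not increase that count. This is a minimal illustrative sketch, not the paper's generator; the instance size and acceptance rule are assumptions.

```python
import itertools, random

random.seed(1)

def uncovered_pairs(suite, v):
    """Count pairwise value tuples not yet covered by the suite.
    suite: list of rows; each cell holds a symbol from range(v)."""
    k = len(suite[0])
    missing = 0
    for c1, c2 in itertools.combinations(range(k), 2):
        seen = {(row[c1], row[c2]) for row in suite}
        missing += v * v - len(seen)
    return missing

def hill_climb(suite, v, n_iters=2000):
    """Mutate one random cell at a time; keep the change only if it does
    not increase the number of uncovered pairwise tuples."""
    suite = [list(row) for row in suite]
    best = uncovered_pairs(suite, v)
    for _ in range(n_iters):
        if best == 0:
            break                      # full pairwise coverage reached
        r = random.randrange(len(suite))
        c = random.randrange(len(suite[0]))
        old = suite[r][c]
        suite[r][c] = random.randrange(v)
        cand = uncovered_pairs(suite, v)
        if cand <= best:
            best = cand                # accept improving/plateau moves
        else:
            suite[r][c] = old          # revert worsening moves
    return suite, best

# Toy instance: 4 binary factors, 6 rows, starting from an all-zero suite.
start = [[0] * 4 for _ in range(6)]
final, missing = hill_climb(start, v=2)
```

In the combined system the abstract describes, a greedy construction would first produce a near-complete suite and HC would only mop up the leftover tuples, rather than starting from a degenerate suite as this toy does.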