Grabisch and Labreuche have recently proposed a generalization of capacities, called bi-capacities. More recently, the author proposed a new approach for studying bi-capacities by introducing the notion of ternary-element sets. In this paper, we present several results based on this approach, including the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities.
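For reference, a minimal LaTeX sketch of one common definition of the bipolar Möbius transform of a bi-capacity v on pairs of disjoint subsets; the exact form and notation used in this paper may differ.

```latex
% One common definition of the bipolar Möbius transform b of a bi-capacity v
% on Q(N) = {(A,B) : A,B \subseteq N,\ A \cap B = \emptyset};
% the formulation adopted in the paper may differ.
b(A,B) \;=\; \sum_{A' \subseteq A}\ \sum_{B' \subseteq B}
      (-1)^{|A \setminus A'| + |B \setminus B'|}\, v(A',B'),
\qquad
v(A,B) \;=\; \sum_{A' \subseteq A}\ \sum_{B' \subseteq B} b(A',B').
```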
The aim of the current research is to reveal the effect of using brain-based learning theory strategies on the achievement of Art Education students in the subject of Teaching Methods. An experimental design with two independent, equal groups (experimental and control) was used. The research sample totaled (60) male and female students: (30) represented the experimental group and (30) represented the control group. The researcher prepared the research tool, a cognitive achievement test consisting of (20) questions, which was checked for validity and reliability, and the experiment lasted (6) weeks.
Features are descriptions of image content, which may be corners, blobs, or edges. Corners are among the most important features for describing an image, and therefore many algorithms have been proposed to detect them, such as Harris, FAST, and SUSAN. Harris is an efficient and accurate corner detection method. Harris corner detection is rotation invariant, but it is not scale invariant. This paper presents an efficient Harris corner detector that is invariant to scale; the improvement is achieved by applying the Gaussian function at different scales. The experimental results illustrate that using the Gaussian function at multiple scales is very useful for dealing with the scale weakness of the Harris detector.
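As an illustration of the general idea (a sketch only, not the authors' exact implementation), the following Python/OpenCV snippet runs the standard Harris detector on Gaussian-smoothed copies of the image at several scales; the scale values, block size, and threshold are illustrative assumptions.

```python
import cv2
import numpy as np

def multiscale_harris(gray, sigmas=(1.0, 2.0, 4.0), k=0.04, thresh=0.01):
    """Run the Harris detector over a Gaussian scale stack and pool the responses.
    A rough sketch of scale-aware Harris detection; parameters are illustrative."""
    gray = np.float32(gray)
    corners = []
    for sigma in sigmas:
        # Smooth the image with a Gaussian of the current scale.
        smoothed = cv2.GaussianBlur(gray, (0, 0), sigma)
        # Standard Harris response on the smoothed image.
        response = cv2.cornerHarris(smoothed, blockSize=2, ksize=3, k=k)
        ys, xs = np.where(response > thresh * response.max())
        corners.append(np.stack([xs, ys, np.full_like(xs, sigma, dtype=float)], axis=1))
    return np.concatenate(corners, axis=0)  # (x, y, sigma) triples

# Usage: pts = multiscale_harris(cv2.imread('frame.png', cv2.IMREAD_GRAYSCALE))
```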
In this paper, a fast lossless image compression method for medical images is introduced. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method can lead to promising performance.
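A minimal Python sketch of the residue and run-length stages, assuming a per-block mean as a stand-in for the paper's polynomial approximation; the block size is an illustrative assumption and the Huffman stage is omitted.

```python
import numpy as np

def block_residuals(img, block=8):
    """Approximate each block by its mean (a degree-0 polynomial) and return
    the integer residuals; a stand-in for the paper's polynomial stage."""
    h, w = img.shape  # assumes a 2D grayscale image
    coeffs, residual = [], np.zeros_like(img, dtype=np.int16)
    for y in range(0, h, block):
        for x in range(0, w, block):
            blk = img[y:y + block, x:x + block].astype(np.int16)
            mean = int(round(blk.mean()))
            coeffs.append(mean)
            residual[y:y + block, x:x + block] = blk - mean
    return coeffs, residual

def run_length_encode(values):
    """Simple (value, count) run-length coding of a flattened residual signal."""
    runs, prev, count = [], None, 0
    for v in values:
        if v == prev:
            count += 1
        else:
            if prev is not None:
                runs.append((prev, count))
            prev, count = v, 1
    if prev is not None:
        runs.append((prev, count))
    return runs

# Usage: coeffs, res = block_residuals(image); runs = run_length_encode(res.flatten())
```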
In this work we present a technique to extract heart contours from noisy echocardiographic images. Our technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do so, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and to enhance the low contrast of echocardiographic images. After applying these techniques, traditional edge detection methods yield legible detection of the heart boundaries and valve movement.
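A minimal Python/OpenCV sketch of this kind of pre-processing pipeline (filtering, morphology, contrast adjustment) followed by a traditional edge detector; the specific filters, kernel sizes, and thresholds are illustrative assumptions rather than the exact settings used in this work.

```python
import cv2

def extract_heart_contours(gray):
    """Illustrative pre-processing chain followed by Canny edge detection."""
    # Filtering: median blur suppresses the heavy speckle noise of echocardiography.
    denoised = cv2.medianBlur(gray, 5)
    # Morphological opening removes small bright artifacts while keeping structures.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    opened = cv2.morphologyEx(denoised, cv2.MORPH_OPEN, kernel)
    # Contrast adjustment: CLAHE enhances the low contrast of the cavity boundaries.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(opened)
    # Traditional edge detection on the improved image.
    edges = cv2.Canny(enhanced, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return edges, contours
```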
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap, which is the output of the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of these images. The selection of the image bitmap to be compressed with this codebook is based on the criterion of the average bitmap replacement error (ABPRE). The proposed scheme is suitable for reducing bit rates.
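A minimal Python sketch of AMBTC block encoding and a bitmap replacement error measure, under simple assumptions; the codebook generation step and the exact ABPRE criterion of the paper are not reproduced.

```python
import numpy as np

def ambtc_encode_block(block):
    """AMBTC-style encoding of one image block: a binary bitmap plus two
    reconstruction levels (the means of the two pixel groups)."""
    block = block.astype(np.float64)
    mean = block.mean()
    bitmap = block >= mean                      # 1 bits: pixels at or above the mean
    high = block[bitmap].mean()                 # reconstruction level for the 1 bits
    low = block[~bitmap].mean() if (~bitmap).any() else high
    return bitmap.astype(np.uint8), low, high

def bitmap_replacement_error(bitmap, codeword):
    """Fraction of mismatched bits when a codebook vector replaces the bitmap;
    an illustrative stand-in for the ABPRE criterion described in the paper."""
    return np.mean(bitmap != codeword)
```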
Various speech enhancement algorithms (SEA) have been developed over the last few decades. Each algorithm has its advantages and disadvantages because the speech signal is affected by environmental conditions. Distortion of speech results in the loss of important features, making the signal difficult to understand. SEA aims to improve the intelligibility and quality of speech that has been degraded by different types of noise. In most applications, quality improvement is highly desirable because it can reduce listener fatigue, especially when the listener is exposed to high noise levels for extended periods (e.g., in manufacturing). SEA reduces or suppresses the background noise to some degree and is therefore sometimes called noise suppression algorithms.
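For illustration only, a minimal Python sketch of one classical speech enhancement algorithm (magnitude spectral subtraction), not a method proposed in this work; the noise estimate from the leading frames, the over-subtraction factor, and the spectral floor are illustrative assumptions.

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_subtraction(x, fs, noise_frames=10, alpha=2.0, floor=0.02):
    """Classical magnitude spectral subtraction: estimate the noise spectrum
    from the first few frames and subtract it from every frame."""
    f, t, X = stft(x, fs=fs, nperseg=512)
    mag, phase = np.abs(X), np.angle(X)
    noise_mag = mag[:, :noise_frames].mean(axis=1, keepdims=True)  # noise estimate
    clean_mag = np.maximum(mag - alpha * noise_mag, floor * mag)   # spectral floor
    _, y = istft(clean_mag * np.exp(1j * phase), fs=fs, nperseg=512)
    return y
```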
A quantum chromodynamics approach was used to study photon emission from quark-gluon interactions at high energy in bremsstrahlung processes. The strength coupling, quark charge e_q, flavor number n_F, thermal energy T of the system, gluon fugacity λ_g, quark fugacity λ_q, critical temperature T_C, and photon energy E are used to calculate the photon rate of the quantum system. The photon emission rate is studied and calculated at high energies of 400 MeV to 650 MeV, using flavor numbers 3 and 7, for the ūg → d̄gγ and cg → sgγ systems in bremsstrahlung processes with critical temperatures (T_c = 190 and 196) MeV and photon energies of (1-10) GeV. The confinement and de-confinement ...
In this article, we develop a new loss function as a simplification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive the scale parameter, reliability, and hazard functions based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function using a Monte Carlo simulation, we compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability, and hazard functions.
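For reference, the standard LINEX loss and the corresponding Bayes estimator are sketched below in LaTeX, together with one common parameterization of the Lomax density; the weighted LINEX modification proposed in the article and its exact conventions may differ.

```latex
% Standard LINEX loss in the estimation error \Delta = \hat{\theta} - \theta
% (the weighted variant proposed in the article modifies this form):
L(\Delta) \;=\; e^{a\Delta} - a\Delta - 1, \qquad a \neq 0 .

% Bayes estimator of \theta under LINEX loss, given the data x:
\hat{\theta}_{\mathrm{BL}} \;=\; -\frac{1}{a}\,
      \ln \mathbb{E}\!\left[\, e^{-a\theta} \mid x \,\right].

% One common parameterization of the Lomax density with shape \alpha and
% scale \lambda (the article's convention may differ):
f(x;\alpha,\lambda) \;=\; \frac{\alpha}{\lambda}
      \left(1 + \frac{x}{\lambda}\right)^{-(\alpha+1)}, \qquad x > 0 .
```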