In this paper, a fusion of K models of full-rank weighted nonnegative tensor factor two-dimensional deconvolution (K-wNTF2D) is proposed to separate acoustic sources that have been mixed in an underdetermined reverberant environment. The model is adapted in an unsupervised manner under a hybrid framework of the generalized expectation-maximization and multiplicative update algorithms. The derivation of the algorithm and the development of the proposed full-rank K-wNTF2D are presented. The algorithm also encodes a set of variable sparsity parameters, derived from a Gibbs distribution, into the K-wNTF2D model. This optimizes each sub-model in K-wNTF2D with the sparsity required to model the time-varying variances of the sources in the spectrogram. In addition, an initialization method is proposed to initialize the parameters in the K-wNTF2D. Experimental results on an underdetermined reverberant mixing environment show that the proposed algorithm is effective at separating the mixture, with an average signal-to-distortion ratio of 3 dB.
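As a simplified illustration of the multiplicative-update side of the hybrid scheme described above, the sketch below applies standard multiplicative updates to a plain two-factor nonnegative factorization of a magnitude spectrogram. It is a minimal stand-in only: the convolutive two-dimensional structure, the K-model fusion, and the Gibbs-derived sparsity terms of the actual K-wNTF2D are not reproduced, and all names and sizes are illustrative.

    import numpy as np

    def nmf_multiplicative(V, rank=8, n_iter=200, eps=1e-12):
        """Plain NMF with Euclidean multiplicative updates.

        V: nonnegative matrix (e.g. a magnitude spectrogram, freq x time).
        Returns W (freq x rank) and H (rank x time) with V ~ W @ H.
        """
        F, T = V.shape
        rng = np.random.default_rng(0)
        W = rng.random((F, rank)) + eps
        H = rng.random((rank, T)) + eps
        for _ in range(n_iter):
            # Multiplicative updates keep both factors nonnegative by construction.
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        return W, H

    # Toy example: factorize a random nonnegative "spectrogram" into 8 components.
    V = np.abs(np.random.randn(513, 100))
    W, H = nmf_multiplicative(V)
    print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))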
Abstract
In this paper, several semi-parametric spatial models are estimated: the semi-parametric spatial error model (SPSEM), which suffers from the problem of spatial error dependence, and the semi-parametric spatial autoregressive model (SPSAR). The maximum likelihood method was used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, while non-parametric methods were used to estimate the smoothing function m(x) for these two models; these non-parametric methods are the local linear estimator (LLE), which requires finding the smoo
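As a rough illustration of the local linear estimator (LLE) mentioned above, the sketch below estimates m(x0) by weighted least squares on a local design (intercept plus slope) with Gaussian kernel weights. The spatial components (λ, ρ) and the paper's bandwidth-selection rule are not reproduced, and the bandwidth value here is an arbitrary placeholder.

    import numpy as np

    def local_linear(x, y, x0, h=0.5):
        """Local linear estimate of m(x0) using a Gaussian kernel with bandwidth h."""
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights around x0
        X = np.column_stack([np.ones_like(x), x - x0])  # local design: intercept + slope
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)      # weighted least squares
        return beta[0]                                   # the intercept estimates m(x0)

    # Example: recover a smooth function from noisy observations.
    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 10, 300))
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
    grid = np.linspace(0, 10, 50)
    m_hat = np.array([local_linear(x, y, g) for g in grid])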
The region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled
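As a minimal sketch of the second-stage idea of testing whether inferred haplotype groups are unevenly distributed between cases and controls, the snippet below runs a chi-square test on a haplotype-by-status count table. The first-stage haplotype inference and co-classification, and the paper's actual test statistic, are not reproduced; the counts are invented for illustration.

    from scipy.stats import chi2_contingency

    # Rows: inferred haplotype groups; columns: counts in cases vs. controls (invented numbers).
    counts = [[120,  80],   # haplotype group 1
              [ 60, 110],   # haplotype group 2
              [ 20,  10]]   # haplotype group 3
    chi2, p_value, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")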
Abstract
Travel time estimation and reliability measurement are important issues for improving the operational efficiency and safety of road networks. The aim of this research is the estimation of total travel time and distribution analysis for three selected links of Palestine Arterial Street in Baghdad city. A higher buffer time index indicates worse reliability conditions. Link (2), from Bab Al Mutham intersection to Al-Sakara intersection, produced a buffer index of about 36%, compared with 26% for Link (1), from Al-Mawall intersection to Bab Al-Mutham intersection, and 24% for Link (3). These results illustrate that reliability is worst for link
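For reference, the buffer time index reported above is commonly computed from the 95th-percentile and mean travel times on a link. A minimal sketch under that standard definition follows (the paper's exact percentile convention is not stated here, and the travel times are invented):

    import numpy as np

    def buffer_index(travel_times):
        """Buffer time index: extra time a traveller should budget, as a share of the mean.

        BI = (95th-percentile travel time - mean travel time) / mean travel time * 100%.
        """
        tt = np.asarray(travel_times, dtype=float)
        return (np.percentile(tt, 95) - tt.mean()) / tt.mean() * 100.0

    # Example with invented link travel times (minutes).
    print(f"Buffer index: {buffer_index([12, 14, 13, 15, 22, 13, 14, 16, 25, 13]):.1f}%")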
In this paper, a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. in spectral efficiency. This non-conventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in Bit Error Rate performance, and keeps bandwidth efficiency and spectrum shape as good as conventional Fast Fourier Transform based OFDM. The new structure was tested and compared with conventional OFDM for Additive White Gaussian Noise, flat, and multi-path selective fading channels. Simulation tests were generated for different channels
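As background for the comparison with conventional FFT-based OFDM, the sketch below shows a minimal IFFT-plus-cyclic-prefix OFDM modulator with 4-QAM mapping. The proposed N-Radon constellation mapping itself is not reproduced, and the subcarrier count and prefix length are arbitrary placeholders.

    import numpy as np

    def ofdm_symbol(bits, n_sub=64, cp_len=16):
        """Map bits to 4-QAM, place them on subcarriers, apply the IFFT, and prepend a cyclic prefix."""
        assert len(bits) == 2 * n_sub
        b = np.asarray(bits).reshape(n_sub, 2)
        qam = ((2 * b[:, 0] - 1) + 1j * (2 * b[:, 1] - 1)) / np.sqrt(2)  # unit-energy 4-QAM symbols
        time = np.fft.ifft(qam) * np.sqrt(n_sub)                         # time-domain OFDM symbol
        return np.concatenate([time[-cp_len:], time])                    # cyclic prefix + symbol

    rng = np.random.default_rng(2)
    tx = ofdm_symbol(rng.integers(0, 2, 128))
    print(tx.shape)  # (80,) = 16 cyclic-prefix samples + 64 subcarrier samples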
Survival analysis is one of the modern analysis methods based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the scientist David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time and a non-parametric function that depends on survival times, which is why the Cox model is described as a semi-parametric model. The set of parametric models that depend on the time-to-event distribution parameters, such as
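For context, the Cox model referred to above specifies the hazard as h(t | x) = h0(t) * exp(b1*x1 + ... + bp*xp), where the baseline hazard h0(t) is left non-parametric and the coefficients b are parametric, which is why the model is called semi-parametric. A minimal fitting sketch, assuming the lifelines package and an invented data frame with duration, event, and covariate columns:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Invented example data: survival time, event indicator (1 = event observed), and two covariates.
    df = pd.DataFrame({
        "time":  [5, 8, 12, 3, 9, 15, 7, 11],
        "event": [1, 1,  0, 1, 0,  1, 1,  0],
        "age":   [60, 55, 48, 70, 65, 50, 58, 62],
        "dose":  [1.0, 0.5, 0.5, 1.5, 1.0, 0.5, 1.0, 1.5],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")  # partial-likelihood fit of the coefficients
    cph.print_summary()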
In this paper, generalized autoregressive conditional heteroscedasticity models with a seasonal component are studied, for application to high-frequency daily financial data characterized by seasonal conditional heteroscedasticity. The study relies on multiplicative seasonal generalized autoregressive conditional heteroscedastic models, denoted by the acronym SGARCH, which have proven to express the seasonal phenomenon effectively, as opposed to the usual GARCH models. The empirical work studies the daily data of the dinar exchange rate against the dollar; the autocorrelation function was first used to detect seasonality, then was diagnosed wi
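For reference, the usual GARCH(1,1) model against which SGARCH is contrasted specifies the conditional variance as sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2; the multiplicative seasonal terms that SGARCH adds are not reproduced here. A minimal simulation sketch of the standard model, with invented parameter values:

    import numpy as np

    def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=0):
        """Simulate returns from a standard GARCH(1,1): sigma2[t] = omega + alpha*r[t-1]**2 + beta*sigma2[t-1]."""
        rng = np.random.default_rng(seed)
        r = np.zeros(n)
        sigma2 = np.zeros(n)
        sigma2[0] = omega / (1 - alpha - beta)     # unconditional variance as the starting value
        r[0] = rng.normal(scale=np.sqrt(sigma2[0]))
        for t in range(1, n):
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
            r[t] = rng.normal(scale=np.sqrt(sigma2[t]))
        return r, sigma2

    returns, variances = simulate_garch11(1000)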
In light of developments in computer science and modern technologies, the impersonation crime rate has increased. Consequently, face recognition technology and biometric systems have been employed for security purposes in a variety of applications, including human-computer interaction, surveillance systems, etc. Building a sophisticated model to tackle impersonation-related crimes is essential. This study proposes classification Machine Learning (ML) and Deep Learning (DL) models, utilizing Viola-Jones, Linear Discriminant Analysis (LDA), Mutual Information (MI), and Analysis of Variance (ANOVA) techniques. The two proposed facial classification systems are J48 with the LDA feature extraction method as input, and a one-dimen
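As a rough sketch of the kind of pipeline described above, the snippet below detects a face with OpenCV's Viola-Jones cascade and feeds flattened face crops into scikit-learn's LDA. The J48 classifier, the MI/ANOVA feature selection, and the deep model are not reproduced, and the image paths and labels are hypothetical placeholders.

    import cv2
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Viola-Jones face detector shipped with OpenCV.
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def face_vector(path, size=(64, 64)):
        """Detect the first face in an image and return it as a flattened grayscale vector."""
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        x, y, w, h = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)[0]
        return cv2.resize(gray[y:y + h, x:x + w], size).flatten()

    # Placeholder training data: image paths and identity labels.
    paths, labels = ["person1_a.jpg", "person1_b.jpg", "person2_a.jpg", "person2_b.jpg"], [0, 0, 1, 1]
    X = np.stack([face_vector(p) for p in paths])

    lda = LinearDiscriminantAnalysis()   # LDA as feature extractor / classifier
    lda.fit(X, labels)
    print(lda.predict(X[:1]))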
This study examines the many methods used in the risk assessment procedures applied in the construction industry today. As a result of the slow adoption of novel assessment methods, professionals frequently resort to strategies that have previously been validated as successful. A precise analytical tool that uses the cost of risk as a measure and draws on the knowledge of professionals could help bridge the gap between theory and practice in risk assessment. This step will examine relevant literature, sort articles according to their publication year, and identify domains and qualities. Consequently, the most significant findings have been presented in a manne