Markov Models in SP

Chair: Petar M. Djuric, State University of New York at Stony Brook, USA



Efficient Multiscale Stochastic Realization

Authors:

Austin B Frakt, MIT (U.S.A.)
Alan S Willsky, MIT (U.S.A.)

Volume 4, Page 2249, Paper number 1195

Abstract:

Few fast statistical signal processing algorithms exist for large problems involving non-stationary processes and irregular measurements. A recently introduced class of multiscale autoregressive models indexed by trees admits signal processing algorithms which can efficiently deal with problems of this type. In this paper we provide a novel and efficient algorithm for translating any second-order prior model to a multiscale autoregressive prior model so that these efficient signal processing algorithms may be applied.

ic981195.pdf (From Postscript)




Fast, Non-Iterative Estimation of Hidden Markov Models

Authors:

Hakan Hjalmarsson, KTH (Sweden)
Brett M. Ninness, University of Newcastle (Australia)

Volume 4, Page 2253, Paper number 1389

Abstract:

The solution of many important signal processing problems depends on the estimation of the parameters of a Hidden Markov Model (HMM). Unfortunately, to date the only known methods for performing this estimation have been iterative, and therefore computationally demanding. By way of contrast, this paper presents a new fast and non-iterative method that utilizes certain recent 'state space subspace system identification' (4SID) ideas from the control theory literature. A short simulation example presented here indicates this new technique to be almost as accurate as Maximum-Likelihood estimation, but an order of magnitude less computationally demanding than the Baum-Welch (EM) algorithm.

ic981389.pdf (From Postscript)




A Reversible Jump Sampler for Autoregressive Time Series

Authors:

Paul T. Troughton, University of Cambridge (U.K.)
Simon J. Godsill, University of Cambridge (U.K.)

Volume 4, Page 2257, Paper number 1513

Abstract:

We use reversible jump Markov chain Monte Carlo (MCMC) methods to address the problem of model order uncertainty in autoregressive (AR) time series within a Bayesian framework. Efficient model jumping is achieved by proposing model space moves from the full conditional density for the AR parameters, which is obtained analytically. This is compared with an alternative method, for which the moves are cheaper to compute, in which proposals are made only for new parameters in each move. Results are presented for both synthetic and audio time series.
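The flavour of Bayesian AR order selection by MCMC can be conveyed with a deliberately simplified sketch. Below, the AR coefficients are integrated out analytically under a Gaussian prior and a plain Metropolis chain moves on the model order alone; the true order, the prior variances, and the candidate order range are all illustrative assumptions. The paper's reversible jump sampler is richer, jointly proposing parameter values from their full conditional during each model move, which this toy collapsed version does not attempt to reproduce.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(2) series: y_t = 0.75 y_{t-1} - 0.5 y_{t-2} + e_t (stable)
n = 300
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.75 * y[t - 1] - 0.5 * y[t - 2] + rng.normal()

p_max = 6                 # largest candidate order (illustrative choice)
target = y[p_max:]        # shared prediction target for all orders

def design(k):
    # Lagged-value regressors; the sample range is common to every order
    return np.column_stack([y[p_max - i : n - i] for i in range(1, k + 1)])

def log_marginal(k, sigma2=1.0, delta2=10.0):
    # AR coefficients integrated out under a N(0, delta2 I) prior:
    # target ~ N(0, sigma2 I + delta2 X X^T), evaluated directly
    X = design(k)
    C = sigma2 * np.eye(len(target)) + delta2 * X @ X.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (len(target) * np.log(2 * np.pi) + logdet
                   + target @ np.linalg.solve(C, target))

lm = {k: log_marginal(k) for k in range(1, p_max + 1)}

# Metropolis on the order alone: propose a uniformly chosen different order
k = 1
counts = np.zeros(p_max, dtype=int)
for _ in range(2000):
    prop = rng.choice([j for j in range(1, p_max + 1) if j != k])
    if np.log(rng.random()) < lm[prop] - lm[k]:
        k = prop
    counts[k - 1] += 1

best = int(np.argmax(counts)) + 1
print("posterior-modal order:", best)
```

Integrating the coefficients out turns each order move into a simple marginal-likelihood ratio, which is why collapsed proposals of this kind mix well; the cost is the dense covariance evaluation that the paper's analytic full-conditional proposals avoid.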

ic981513.pdf (From Postscript)




A New Maximum Likelihood Gradient Algorithm for On-Line Hidden Markov Model Identification

Authors:

Iain B Collings, University of Melbourne (Australia)
Tobias Ryden, Lund University (Sweden)

Volume 4, Page 2261, Paper number 1884

Abstract:

This paper presents a new algorithm for on-line identification of hidden Markov model (HMM) parameters. The scheme is gradient based, and provides parameter estimates which recursively maximise the likelihood function. It is therefore a recursive maximum likelihood (RML) algorithm, and it has optimal asymptotic properties. The only current on-line HMM identification algorithm with anything other than a suboptimal rate of convergence is based on a prediction error (PE) cost function. As well as presenting a new algorithm, this paper also highlights and explains a counter-intuitive convergence problem for the current recursive PE (RPE) algorithm, when operating in low noise conditions. Importantly, this problem does not exist for the new RML algorithm. Simulation studies demonstrate the superior performance of the new algorithm, compared to current techniques.

ic981884.pdf (From Postscript)




Quasi-Newton Method for Maximum Likelihood Estimation of Hidden Markov Model

Authors:

Olivier Cappé, ENST / CNRS (France)
Vincent Buchoux, ENST / CNET (France)
Eric Moulines, ENST / CNRS (France)

Volume 4, Page 2265, Paper number 1944

Abstract:

Hidden Markov models (HMMs) are used in many signal processing applications, including speech recognition and blind equalization of digital communications channels. The most widely used method for maximum likelihood estimation of HMM parameters is the forward-backward (or Baum-Welch) algorithm, which is an early example of application of the Expectation-Maximization (EM) principle. In this contribution, an alternative fast-converging approach for maximum likelihood estimation of HMM parameters is described. This new technique is based on the use of general purpose quasi-Newton optimization methods as well as on an efficient purely recursive algorithm for computing the log-likelihood and its derivative.
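The two ingredients can be sketched in a few lines: the log-likelihood of a toy two-state, unit-variance Gaussian HMM is evaluated with the scaled forward recursion and handed to a general-purpose quasi-Newton routine (SciPy's BFGS with numerical gradients here). The model, the logit parameterisation of the transition probabilities, and all numerical values are illustrative assumptions; the paper's purely recursive computation of the derivative is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def hmm_nll(params, y):
    # Toy 2-state Gaussian HMM with unit emission variance.
    # params = [logit a00, logit a11, mu0, mu1]
    a00 = 1.0 / (1.0 + np.exp(-params[0]))
    a11 = 1.0 / (1.0 + np.exp(-params[1]))
    A = np.array([[a00, 1 - a00], [1 - a11, a11]])
    mu = params[2:4]
    alpha = np.array([0.5, 0.5])   # uniform initial state distribution
    ll = 0.0
    for yt in y:                   # scaled forward recursion
        b = np.exp(-0.5 * (yt - mu) ** 2) / np.sqrt(2 * np.pi)
        alpha = (alpha @ A) * b
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return -ll

# Simulate a persistent two-state chain with means -2 and +2
rng = np.random.default_rng(0)
states = [0]
for _ in range(199):
    s = states[-1]
    states.append(s if rng.random() < 0.9 else 1 - s)
y = np.array([(-2.0, 2.0)[s] for s in states]) + rng.normal(size=200)

# Quasi-Newton maximisation of the likelihood (BFGS on the negative ll)
res = minimize(hmm_nll, x0=[0.0, 0.0, -1.0, 1.0], args=(y,), method="BFGS")
mu_hat = np.sort(res.x[2:4])
print("estimated state means:", mu_hat)
```

Because each BFGS step reuses curvature information from previous iterations, far fewer likelihood sweeps are typically needed than with EM's repeated forward-backward passes, which is the convergence advantage the abstract claims.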

ic981944.pdf (From Postscript)




Detection and Estimation of Signals by Reversible Jump Markov Chain Monte Carlo Computations

Authors:

Petar M. Djuric, SUNY, Stony Brook (U.S.A.)
Simon J. Godsill, University of Cambridge (U.K.)
William J. Fitzgerald, University of Cambridge (U.K.)
Peter J.W. Rayner, University of Cambridge (U.K.)

Volume 4, Page 2269, Paper number 2007

Abstract:

Markov Chain Monte Carlo (MCMC) samplers have been a very powerful methodology for estimating signal parameters. With the introduction of the reversible jump MCMC sampler, which is a Metropolis-Hastings method adapted to general state spaces, the potential of the MCMC methods has risen to a new level. Consequently, the MCMC methods currently play a major role in many research activities. In this paper we propose a reversible jump MCMC sampler based on predictive densities obtained by integrating out unwanted parameters. The proposal densities are approximations of the posterior distributions of the remaining parameters obtained by sampling importance resampling (SIR). We apply the method to the problem of detecting signals and estimating their parameters. To illustrate the procedure, we present an example of sinusoids embedded in noise.

ic982007.pdf (From Postscript)




Modeling and Detection in Hyperspectral Imagery

Authors:

Susan M Schweizer, Carnegie Mellon University (U.S.A.)
José M.F. Moura, Carnegie Mellon University (U.S.A.)

Volume 4, Page 2273, Paper number 2058

Abstract:

One aim of using hyperspectral imaging sensors is to discriminate man-made objects from dominant clutter environments. Sensors like AVIRIS or HYDICE simultaneously collect hundreds of contiguous and narrowly spaced spectral band images for the same scene. The challenge lies in processing the corresponding large volume of data that is collected by the sensors. Usual implementations of the Maximum-Likelihood (ML) detector are precluded because they require the inversion of large data covariance matrices. We apply a Gauss-Markov random field (GMRF) model to derive a computationally efficient ML-detector implementation that avoids inversion of the covariance matrix. The paper details the structure of the GMRF model, presents an estimation algorithm to fit the GMRF to the hyperspectral sensor data, and finally, develops the structure of the ML-detector.
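The computational point, that a Gauss-Markov random field is specified by a sparse precision (inverse covariance) matrix, so Gaussian quadratic forms never require inverting a large covariance, can be shown on a toy 1-D example. The first-order chain structure, the interaction parameter beta, and the field size are illustrative assumptions, not the paper's 2-D hyperspectral model:

```python
import numpy as np

def gmrf_quadratic(x, beta):
    # First-order 1-D GMRF with tridiagonal precision Q = I - beta*(S + S^T):
    # x^T Q x computed in O(n) from neighbouring samples, with no inversion
    return np.dot(x, x) - 2.0 * beta * np.dot(x[:-1], x[1:])

rng = np.random.default_rng(1)
n, beta = 50, 0.4          # beta < 0.5 keeps this precision positive definite
x = rng.normal(size=n)

# Dense cross-check: the same value via the explicit precision matrix
Q = np.eye(n) - beta * (np.eye(n, k=1) + np.eye(n, k=-1))
q_fast = gmrf_quadratic(x, beta)
q_dense = x @ Q @ x
print(q_fast, q_dense)
```

Because the precision matrix couples only neighbouring sites, the quadratic form in the Gaussian log-likelihood costs O(n) rather than the O(n^3) of inverting a dense covariance; this locality is what makes the GMRF-based ML detector tractable for large hyperspectral data volumes.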

ic982058.pdf (From Postscript)




Simplified Wavelet-domain Hidden Markov Models Using Contexts

Authors:

Matthew S Crouse, Rice University (U.S.A.)
Richard G Baraniuk, Rice University (U.S.A.)

Volume 4, Page 2277, Paper number 2092

Abstract:

Wavelet-domain Hidden Markov Models (HMMs) are a potent new tool for modeling the statistical properties of wavelet transforms. In addition to characterizing the statistics of individual wavelet coefficients, HMMs capture the salient interactions between wavelet coefficients. However, as we model an increasing number of wavelet coefficient interactions, HMM-based signal processing becomes increasingly complicated. In this paper, we propose a new approach to HMMs based on the notion of context. By modeling wavelet coefficient inter-dependencies via contexts, we retain the approximation capabilities of HMMs, yet substantially reduce their complexity. To illustrate the power of this approach, we develop new algorithms for signal estimation and for efficient synthesis of non-Gaussian, long-range-dependent network traffic.

ic982092.pdf (From Postscript)
