ICASSP '98
Abstract - SSAP11


 
SSAP11.1

   
Efficient Multiscale Stochastic Realization
A. Frakt, A. Willsky  (MIT, USA)
Few fast statistical signal processing algorithms exist for large problems involving non-stationary processes and irregular measurements. A recently introduced class of multiscale autoregressive models indexed by trees admits signal processing algorithms which can efficiently deal with problems of this type. In this paper we provide a novel and efficient algorithm for translating any second-order prior model to a multiscale autoregressive prior model so that these efficient signal processing algorithms may be applied.
 
SSAP11.2

   
Fast, Non-Iterative Estimation of Hidden Markov Models
H. Hjalmarsson  (S3-Automatic Control, KTH, Stockholm, Sweden);   B. Ninness  (University of Newcastle, Australia)
The solution of many important signal processing problems depends on the estimation of the parameters of a Hidden Markov Model (HMM). Unfortunately, to date the only known methods for performing this estimation have been iterative, and therefore computationally demanding. By way of contrast, this paper presents a new fast and non-iterative method that utilizes certain recent `state-space subspace system identification' (4SID) ideas from the control theory literature. A short simulation example presented here indicates this new technique to be almost as accurate as Maximum-Likelihood estimation, but an order of magnitude less computationally demanding than the Baum-Welch (EM) algorithm.
 
SSAP11.3

   
A Reversible Jump Sampler for Autoregressive Time Series
P. Troughton, S. Godsill  (University of Cambridge, England, UK)
We use reversible jump Markov chain Monte Carlo (MCMC) methods to address the problem of model order uncertainty in autoregressive (AR) time series within a Bayesian framework. Efficient model jumping is achieved by proposing model space moves from the full conditional density for the AR parameters, which is obtained analytically. This is compared with an alternative method, for which the moves are cheaper to compute, in which proposals are made only for new parameters in each move. Results are presented for both synthetic and audio time series.
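Under a Gaussian likelihood with a flat prior, the analytical full conditional for the AR parameters is a Gaussian centred on the least-squares fit for the candidate order. A minimal sketch of that ingredient (the helper name and least-squares formulation are ours, not the authors' code):

```python
import numpy as np

def ar_ls_fit(x, k):
    """Least-squares AR(k) fit of a series x.

    Returns the coefficient vector and residual sum of squares; under a
    Gaussian likelihood and flat prior these define the mean of the full
    conditional density that a reversible-jump move can propose from.
    (Hypothetical helper for illustration only.)"""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Row t (t = k..n-1) regresses x[t] on x[t-1], ..., x[t-k].
    X = np.column_stack([x[k - j - 1:n - j - 1] for j in range(k)])
    y = x[k:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ a
    return a, float(resid @ resid)
```

Given the fit and residual sum of squares for each candidate order k, the Gaussian proposal for the model-jumping move can be formed directly.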
 
SSAP11.4

   
A New Maximum Likelihood Gradient Algorithm for On-Line Hidden Markov Model Identification
I. Collings  (University of Melbourne, Australia);   T. Ryden  (Lund University, Sweden)
This paper presents a new algorithm for on-line identification of hidden Markov model (HMM) parameters. The scheme is gradient based, and provides parameter estimates which recursively maximise the likelihood function. It is therefore a recursive maximum likelihood (RML) algorithm, and it has optimal asymptotic properties. The only current on-line HMM identification algorithm with anything other than suboptimal rate of convergence is based on a prediction error (PE) cost function. As well as presenting a new algorithm, this paper also highlights and explains a counter-intuitive convergence problem for the current recursive PE (RPE) algorithm, when operating in low noise conditions. Importantly, this problem does not exist for the new RML algorithm. Simulation studies demonstrate the superior performance of the new algorithm, compared to current techniques.
 
SSAP11.5

   
Quasi-Newton Method for Maximum Likelihood Estimation of Hidden Markov Models
O. Cappé  (ENST / CNRS, France);   V. Buchoux  (ENST / CNET, France);   E. Moulines  (ENST / CNRS, France)
Hidden Markov models (HMMs) are used in many signal processing applications including speech recognition, blind equalization of digital communications channels, etc. The most widely used method for maximum likelihood estimation of HMM parameters is the forward-backward (or Baum-Welch) algorithm, which is an early example of application of the Expectation-Maximization (EM) principle. In this contribution, an alternative fast-converging approach for maximum likelihood estimation of HMM parameters is described. This new technique is based on the use of general purpose quasi-Newton optimization methods as well as on an efficient purely recursive algorithm for computing the log-likelihood and its derivative.
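The log-likelihood that the quasi-Newton search maximizes can be computed recursively with the standard scaled forward recursion. A generic sketch, assuming a discrete-output HMM (not the authors' implementation):

```python
import math

def hmm_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete-output
    HMM, via the scaled forward recursion: pi is the initial distribution,
    A[i][j] the transition probabilities, B[i][o] the emission
    probabilities.  This is the objective a quasi-Newton optimizer would
    maximize over the (suitably parameterized) model parameters."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    ll = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                     for j in range(n)]
        c = sum(alpha)              # scaling factor, accumulated in the log
        ll += math.log(c)
        alpha = [a / c for a in alpha]
    return ll
```

Feeding this quantity (and its derivative, which the paper computes by an analogous recursion) to a general-purpose quasi-Newton routine yields the kind of fast-converging scheme the abstract describes.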
 
SSAP11.6

   
Detection and Estimation of Signals by Reversible Jump Markov Chain Monte Carlo Computations
P. Djuric  (SUNY, Stony Brook, USA);   S. Godsill, W. Fitzgerald, P. Rayner  (University of Cambridge, UK)
Markov Chain Monte Carlo (MCMC) samplers have been a very powerful methodology for estimating signal parameters. With the introduction of the reversible jump MCMC sampler, which is a Metropolis-Hastings method adapted to general state spaces, the potential of the MCMC methods has risen to a new level. Consequently, the MCMC methods currently play a major role in many research activities. In this paper we propose a reversible jump MCMC sampler based on predictive densities obtained by integrating out unwanted parameters. The proposal densities are approximations of the posterior distributions of the remaining parameters obtained by sampling importance resampling (SIR). We apply the method to the problem of signal detection and parameter estimation of signals. To illustrate the procedure, we present an example of sinusoids embedded in noise.
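The sampling importance resampling (SIR) step used to approximate the posterior of the remaining parameters can be sketched generically as follows (a minimal illustration of the SIR mechanism, not the authors' code):

```python
import math
import random

def sir_resample(samples, log_weights, n_out, rng):
    """Sampling importance resampling: resample the input draws with
    probability proportional to their (exponentiated) importance weights,
    so the output approximates the target distribution."""
    m = max(log_weights)                         # subtract max for stability
    w = [math.exp(l - m) for l in log_weights]
    total = sum(w)
    cdf, c = [], 0.0
    for wi in w:
        c += wi / total
        cdf.append(c)
    out = []
    for _ in range(n_out):                       # inverse-CDF resampling
        u = rng.random()
        i = 0
        while u > cdf[i] and i < len(cdf) - 1:
            i += 1
        out.append(samples[i])
    return out
```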
 
SSAP11.7

   
Modeling and Detection in Hyperspectral Imagery
S. Schweizer, J. Moura  (Carnegie Mellon University, USA)
One aim of using hyperspectral imaging sensors is in discriminating man-made objects from dominant clutter environments. Sensors like Aviris or Hydice simultaneously collect hundreds of contiguous and narrowly spaced spectral band images for the same scene. The challenge lies in processing the corresponding large volume of data that is collected by the sensors. Usual implementations of the Maximum-Likelihood (ML) detector are precluded because they require the inversion of large data covariance matrices. We apply a Gauss-Markov random field (GMRF) model to derive a computationally efficient ML-detector implementation that avoids inversion of the covariance matrix. The paper details the structure of the GMRF model, presents an estimation algorithm to fit the GMRF to the hyperspectral sensor data, and finally, develops the structure of the ML-detector.
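The computational point is that a GMRF specifies the precision (inverse covariance) matrix directly and sparsely, so the detector's quadratic form never requires inverting a covariance matrix. A minimal sketch, assuming a first-order 1-D field (the paper's model is for 2-D hyperspectral data; the helper name is ours):

```python
import numpy as np

def gmrf_quadratic_form(x, beta):
    """Quadratic form x^T Q x for a first-order 1-D Gauss-Markov random
    field whose precision matrix Q is tridiagonal, with Q[i,i] = 1 and
    Q[i,i+1] = Q[i+1,i] = -beta.  Evaluating it in O(n) from the sparse
    precision is how the GMRF model avoids inverting the covariance."""
    x = np.asarray(x, dtype=float)
    return float(np.dot(x, x) - 2.0 * beta * np.dot(x[:-1], x[1:]))
```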
 
SSAP11.8

   
Simplified Wavelet-domain Hidden Markov Models Using Contexts
M. Crouse, R. Baraniuk  (Rice University, USA)
Wavelet-domain Hidden Markov Models (HMMs) are a potent new tool for modeling the statistical properties of wavelet transforms. In addition to characterizing the statistics of individual wavelet coefficients, HMMs capture the salient interactions between wavelet coefficients. However, as we model an increasing number of wavelet coefficient interactions, HMM-based signal processing becomes increasingly complicated. In this paper, we propose a new approach to HMMs based on the notion of context. By modeling wavelet coefficient inter-dependencies via contexts, we retain the approximation capabilities of HMMs, yet substantially reduce their complexity. To illustrate the power of this approach, we develop new algorithms for signal estimation and for efficient synthesis of non-Gaussian, long-range-dependent network traffic.
 
