2019 article

Analysis of Approximate Message Passing With Non-Separable Denoisers and Markov Random Field Priors

Ma, Y., Rush, C., & Baron, D. (2019, November). IEEE TRANSACTIONS ON INFORMATION THEORY, Vol. 65, pp. 7367–7389.

By: Y. Ma, C. Rush & D. Baron

co-author countries: United States of America 🇺🇸
author keywords: Approximation algorithms; Task analysis; Approximate message passing; non-separable denoiser; Markov random field; finite sample analysis
Source: Web Of Science
Added: February 3, 2020

Approximate message passing (AMP) is a class of low-complexity, scalable algorithms for high-dimensional linear regression tasks, where one wishes to recover an unknown signal from noisy linear measurements. AMP proceeds iteratively, updating an estimate of the unknown signal at each iteration, and its performance (quantified, for example, by the mean squared error of its estimates) depends on the choice of the “denoiser” function used to produce these signal estimates. An attractive feature of AMP is that its performance can be tracked by a scalar recursion referred to as state evolution. Previous theoretical analyses of the accuracy of the state evolution predictions have been limited to separable or block-separable denoisers, a class of denoisers that underperforms when sophisticated dependencies exist between signal entries. Since signals with entrywise dependencies are common in image- and video-processing applications, in this work we study the high-dimensional linear regression task when the dependence structure of the input signal is modeled by a Markov random field prior distribution. We provide a rigorous analysis of the performance of AMP, demonstrating the accuracy of the state evolution predictions, when a class of non-separable sliding-window denoisers is applied. Moreover, we provide numerical examples where AMP with sliding-window denoisers successfully captures local dependencies in images.
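The iteration described above can be sketched in a few lines of NumPy. The following is a minimal, illustrative sketch, not the paper's exact construction: it assumes a Gaussian measurement matrix, uses a hypothetical moving-average filter as the non-separable sliding-window denoiser, and estimates the divergence in the Onsager correction by a Monte Carlo probe (a common trick when the denoiser is not separable, so its divergence has no simple closed form). All function names and parameters here are assumptions for illustration.

```python
import numpy as np

def sliding_window_denoiser(s, tau, half_width=1):
    # Hypothetical non-separable denoiser: each entry is estimated from a
    # local window of its neighbors (a simple moving average), so the output
    # at one coordinate depends on several input coordinates. `tau` (the
    # effective noise level) is accepted for interface consistency but is
    # unused by this simple linear example.
    k = 2 * half_width + 1
    padded = np.pad(s, half_width, mode="edge")
    return np.convolve(padded, np.ones(k) / k, mode="valid")

def amp(y, A, n_iter=20, half_width=1, seed=0):
    # AMP for y = A x + w: alternate between forming pseudo-data
    # s = x + A^T z and applying the denoiser, with an Onsager correction
    # in the residual update so that s behaves like the signal plus
    # additive Gaussian noise (the property state evolution tracks).
    rng = np.random.default_rng(seed)
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(n)      # effective noise level
        s = x + A.T @ z                            # pseudo-data
        x_new = sliding_window_denoiser(s, tau, half_width)
        # Monte Carlo estimate of the denoiser's divergence for the
        # Onsager term, since the denoiser is non-separable.
        eps = tau / 1000 + 1e-8
        probe = rng.choice([-1.0, 1.0], size=N)
        div = probe @ (
            sliding_window_denoiser(s + eps * probe, tau, half_width) - x_new
        ) / eps
        z = y - A @ x_new + (z / n) * div          # Onsager-corrected residual
        x = x_new
    return x
```

A real application would replace the moving average with a denoiser matched to the Markov random field prior; the point of the sketch is only the shape of the iteration and where the non-separable denoiser and its divergence enter.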