
Proceedings Paper • Open Access
A helicopter view of the self-consistency framework for wavelets and other signal extraction methods in the presence of missing and irregularly spaced data
Paper Abstract
A common frustration in signal processing and, more generally, information recovery is the presence of irregularities in the data. At best, the standard software or methods will no longer be directly applicable when data are missing, incomplete, or irregularly spaced (e.g., as with wavelets). Self-consistency is a very general and powerful statistical principle for dealing with such problems. Conceptually it is extremely appealing, for it is essentially a mathematical formalization of iterating common-sense "trial-and-error" methods until no more improvement is possible. Mathematically it is elegant, with one fixed-point equation to solve and a general projection theorem to establish optimality. Practically it is straightforward to program because it directly uses the regular/complete-data method for iteration. Its major disadvantage is that it can be computationally intensive. However, increasingly efficient (approximate) implementations are being discovered, such as for wavelet de-noising with hard and soft thresholding. This brief overview summarizes the author's keynote presentation on those points, based on joint work with Thomas Lee on wavelet applications and with Zhan Li on the theoretical properties of the self-consistent estimators.
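
The fixed-point idea described in the abstract is easy to state in code. The following is a minimal sketch, not the authors' actual algorithm, of one self-consistency iteration for wavelet de-noising with missing samples: complete the data with the current estimate, apply the ordinary complete-data de-noiser (here, soft thresholding via PyWavelets), and repeat until the estimate stops changing. The function name, mask convention, and all parameter values (wavelet, decomposition level, threshold, tolerance) are illustrative assumptions.

```python
import numpy as np
import pywt


def self_consistent_denoise(y, observed, wavelet="db4", level=4,
                            thr=0.5, max_iter=200, tol=1e-8):
    """Sketch of a self-consistency iteration for de-noising a signal y
    whose entries are observed only where the boolean mask `observed` is True.
    Fixed point sought: estimate = denoise(complete_data_with(estimate))."""
    y = np.asarray(y, dtype=float)
    # Hypothetical starting point: fill missing entries with the observed mean.
    est = np.where(observed, y, y[observed].mean())
    for _ in range(max_iter):
        # "Trial" step: complete the data using the current estimate.
        completed = np.where(observed, y, est)
        # Complete-data step: standard wavelet soft-threshold de-noising.
        coeffs = pywt.wavedec(completed, wavelet, level=level)
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        new_est = pywt.waverec(coeffs, wavelet)[: len(y)]
        # Stop once the iteration has (numerically) reached its fixed point.
        if np.max(np.abs(new_est - est)) < tol:
            est = new_est
            break
        est = new_est
    return est
```

In practice the threshold would be chosen by a data-driven rule (e.g., a universal-threshold-type choice) and hard thresholding could be substituted in the same place; those refinements, and the efficient approximate implementations mentioned above, are beyond this sketch.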
Paper Details
Date Published: 27 September 2007
PDF: 10 pages
Proc. SPIE 6701, Wavelets XII, 670124 (27 September 2007); doi: 10.1117/12.735291
Published in SPIE Proceedings Vol. 6701:
Wavelets XII
Dimitri Van De Ville; Vivek K. Goyal; Manos Papadakis, Editor(s)
Author Affiliations:
Xiao-Li Meng, Harvard Univ. (United States)
© SPIE.
