
Perturbation Bounds for Low-Rank Inverse Approximations under Noise

Conference: NeurIPS 2025
arXiv: 2510.25571
Code: None
Area: Numerical Linear Algebra / Matrix Theory
Keywords: Low-rank approximation, pseudoinverse, perturbation bounds, spectral norm, eigengap

TL;DR

This paper provides the first non-asymptotic spectral norm perturbation bounds for low-rank inverse approximations, \(\|(\tilde{A}^{-1})_p - A_p^{-1}\|\), under additive noise. Using contour integration, the authors derive sharp bounds that depend on the eigengap, spectral decay, and noise alignment, improving on classical full-inverse bounds by up to a factor of \(\sqrt{n}\).

Background & Motivation

Background: Low-rank pseudoinverses are widely used to approximate matrix inverses cheaply in ML, optimization, and scientific computing. Classical perturbation theory, however, bounds only the error of the full inverse, \(\|A^{-1} - \tilde{A}^{-1}\|\).

Key Challenge: Classical bounds ignore the structure introduced by truncation to rank \(p\), neglect the interaction between noise and the eigengap, and frequently yield overly pessimistic estimates.

Core Idea: Contour integration is applied to the non-entire function \(f(z) = 1/z\) via a localized resolvent expansion, enabling precise control of Riesz projection perturbations near the smallest eigenvalues.

Method

Problem Formulation

Given an \(n \times n\) real symmetric positive definite matrix \(A\) with eigenvalues \(\lambda_1 \geq \cdots \geq \lambda_n > 0\) and corresponding orthonormal eigenvectors \(u_1, \dots, u_n\), and a symmetric perturbation \(E\), let \(\tilde{A} = A + E\). The central quantity of interest is \(\|(\tilde{A}^{-1})_p - A_p^{-1}\|\), where \(A_p^{-1} = \sum_{i=n-p+1}^{n} \lambda_i^{-1} u_i u_i^\top\) is the best rank-\(p\) approximation to \(A^{-1}\), obtained by inverting only the \(p\) smallest eigenvalues.
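
To make these objects concrete, here is a minimal NumPy sketch (the matrix size, spectrum, and noise level are illustrative choices, not taken from the paper) that builds \(A_p^{-1}\) from an eigendecomposition and measures the perturbed discrepancy:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5

# Synthetic SPD matrix with a deliberate eigengap: p small eigenvalues in
# [0.5, 1], the remaining n - p eigenvalues in [2, 10].
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.concatenate([np.linspace(10.0, 2.0, n - p), np.linspace(1.0, 0.5, p)])
A = Q @ np.diag(lam) @ Q.T

def low_rank_inverse(M, p):
    """Best rank-p approximation of M^{-1}: invert only the p smallest eigenvalues."""
    w, U = np.linalg.eigh(M)                # eigenvalues in ascending order
    return (U[:, :p] / w[:p]) @ U[:, :p].T  # sum of u_i u_i^T / lambda_i

# Symmetric additive perturbation with controlled spectral norm ||E|| = 1e-3.
E = rng.standard_normal((n, n))
E = 1e-3 * (E + E.T) / np.linalg.norm(E + E.T, 2)
A_tilde = A + E

err = np.linalg.norm(low_rank_inverse(A_tilde, p) - low_rank_inverse(A, p), 2)
print(f"spectral-norm error of the rank-p inverse: {err:.3e}")
```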

Key Results

  1. Main Theorem (Theorem 2.1): Under the noise condition \(4\|E\| \leq \min\{\lambda_n, \delta_{n-p}\}\), \(\|(\tilde{A}^{-1})_p - A_p^{-1}\| \leq 5\left(\frac{\|E\|}{\lambda_n^2} + \frac{\|E\|}{\lambda_{n-p}\,\delta_{n-p}}\right)\), where \(\delta_{n-p} = \lambda_{n-p} - \lambda_{n-p+1}\) denotes the eigengap (a numerical check follows after this list).

  2. Comparison with Classical Bounds: The classical EYM-N bound is \(\frac{8\|E\|}{3\lambda_n^2} + \frac{2}{\lambda_{n-p}}\), whose second term does not vanish as \(\|E\| \to 0\), so it predicts nonzero error even in the noiseless case. The proposed bound correctly tends to zero as \(\|E\| \to 0\).
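
Continuing the sketch from the problem formulation (same variables, same illustrative spectrum), the lines below verify the noise condition and check that the right-hand side of Theorem 2.1 dominates the observed error:

```python
# Reuses n, p, lam, E, err from the problem-formulation sketch above.
lam_asc = np.sort(lam)                  # lambda_n <= ... <= lambda_1
lam_n   = lam_asc[0]                    # smallest eigenvalue lambda_n
lam_nmp = lam_asc[p]                    # lambda_{n-p}
delta   = lam_asc[p] - lam_asc[p - 1]   # eigengap delta_{n-p}
E_norm  = np.linalg.norm(E, 2)

# Noise condition of Theorem 2.1.
assert 4 * E_norm <= min(lam_n, delta), "noise condition violated"

bound = 5 * (E_norm / lam_n**2 + E_norm / (lam_nmp * delta))
print(f"observed error:  {err:.3e}")
print(f"Theorem 2.1 RHS: {bound:.3e}")  # should dominate the observed error
```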

Technical Contributions

  1. Extension of Contour Integration: The technique is extended from entire functions (e.g., matrix exponential \(e^z\)) to the non-entire function \(f(z) = 1/z\).
  2. Localized Resolvent Expansion: A Cauchy contour is constructed around the smallest eigenvalues, allowing the local spectral structure to be exploited directly (a numerical illustration follows this list).
  3. Riesz Projection Perturbation Control: The influence of noise on eigenprojections in low-curvature subspaces is precisely characterized.
  4. Extension to General Symmetric Matrices: The framework accommodates rank-deficient cases via the pseudoinverse.
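
The core mechanism can be reproduced numerically: the Cauchy integral \(\frac{1}{2\pi i}\oint_\Gamma z^{-1}(zI - A)^{-1}\,dz\) over a contour \(\Gamma\) that encloses only the \(p\) smallest eigenvalues, while avoiding the pole of \(1/z\) at the origin, recovers \(A_p^{-1}\) exactly. The sketch below (with an illustrative synthetic spectrum and contour, not the paper's code) discretizes the circle with the trapezoidal rule:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 8, 3

# SPD matrix whose p smallest eigenvalues lie in [0.5, 1], the rest in [5, 10].
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.concatenate([np.linspace(5.0, 10.0, n - p), np.linspace(0.5, 1.0, p)])
A = Q @ np.diag(eigs) @ Q.T

# Circle covering [0.15, 1.35] on the real axis: encloses the p smallest
# eigenvalues, excludes the origin and the rest of the spectrum.
center, radius, N = 0.75, 0.6, 200
Ap_inv = np.zeros((n, n), dtype=complex)
for k in range(N):
    z = center + radius * np.exp(2j * np.pi * k / N)
    resolvent = np.linalg.inv(z * np.eye(n) - A)
    # Trapezoidal rule for (1/(2*pi*i)) \oint f(z) (zI - A)^{-1} dz, f(z) = 1/z;
    # the factor (z - center) = r*e^{i*theta} comes from dz = i*r*e^{i*theta} dtheta.
    Ap_inv += (1.0 / z) * (z - center) * resolvent
Ap_inv = Ap_inv.real / N

# Reference: rank-p truncation of A^{-1} from the eigendecomposition.
w, U = np.linalg.eigh(A)
ref = (U[:, :p] / w[:p]) @ U[:, :p].T
print(np.linalg.norm(Ap_inv - ref, 2))  # agrees to roughly machine precision
```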

Application: Improved PCG Convergence Rates

The proposed bounds yield tighter condition number estimates for preconditioned conjugate gradient (PCG) methods with low-rank-plus-regularization preconditioners, improving the provable iteration-count guarantees by a factor of \(n^{1/4}\).
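
The following sketch illustrates the effect. The preconditioner used here (exact inversion on the span of the \(p\) smallest eigenpairs, a scaled identity on the orthogonal complement) is one representative instance of a low-rank-plus-regularization preconditioner, assumed for illustration rather than taken from the paper:

```python
import numpy as np

def pcg(A, b, apply_Minv, tol=1e-8, maxit=1000):
    """Preconditioned conjugate gradient; returns (solution, iterations used)."""
    x = np.zeros_like(b)
    r = b.copy()                         # residual for the zero initial guess
    z = apply_Minv(r)
    d = z.copy()
    rz = r @ z
    for it in range(1, maxit + 1):
        Ad = A @ d
        alpha = rz / (d @ Ad)
        x = x + alpha * d
        r = r - alpha * Ad
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, it
        z = apply_Minv(r)
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return x, maxit

rng = np.random.default_rng(2)
n, p = 200, 10
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.concatenate([np.linspace(1.0, 20.0, n - p), np.linspace(0.05, 0.1, p)])
A = Q @ np.diag(lam) @ Q.T
b = rng.standard_normal(n)

# Low-rank-plus-regularization preconditioner: invert A exactly on the span of
# the p smallest eigenpairs, scale the complement by 1/lambda_{n-p}.
w, U = np.linalg.eigh(A)
Up, wp, tau = U[:, :p], w[:p], w[p]
def apply_Minv(r):
    c = Up.T @ r
    return Up @ (c / wp) + (r - Up @ c) / tau

_, its_plain = pcg(A, b, lambda r: r)    # plain CG (identity preconditioner)
_, its_prec = pcg(A, b, apply_Minv)
print(f"CG iterations: {its_plain} (plain) vs {its_prec} (preconditioned)")
```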

Key Experimental Results

Main Results

| Matrix type | Proposed bound vs. true error | Classical bound vs. true error |
| --- | --- | --- |
| Sample covariance | Tracks within a small constant factor | Overestimates by 1–2 orders of magnitude |
| Discretized elliptic operator | Tracks within a small constant factor | Overestimates by 1–2 orders of magnitude |
| BCSSTK09 stiffness | Tracks within a small constant factor | Overestimates by 1–2 orders of magnitude |

Noise Condition Feasibility

Experiments on the 1990 US Census covariance matrix and the BCSSTK09 stiffness matrix confirm that the noise condition of Theorem 2.1 holds with a comfortable safety margin at noise levels typical of differential privacy and structural engineering applications.

Highlights & Insights

  • Low-rank inverse approximations can be more robust than full inverses when the eigengap is sufficiently large—a result that is counterintuitive, as truncation is typically regarded as introducing additional error.
  • Spectrally aware perturbation bounds are far more informative in practice than worst-case bounds, as they exploit the spectral structure of the matrix.
  • PCG convergence rates can be improved by a factor of \(n^{1/4}\) using the proposed bounds, offering direct algorithmic impact.
  • Technically, extending contour integration from entire to non-entire functions constitutes a non-trivial mathematical contribution.

Limitations & Future Work

  • The noise condition \(4\|E\| \leq \min\{\lambda_n, \delta_{n-p}\}\) may be restrictive in certain settings.
  • Only the spectral norm is considered; Frobenius norm bounds remain to be developed.
  • Structured noise (e.g., low-rank perturbations) may admit tighter bounds.

Comparison with Prior Approaches

  • vs. Classical Neumann Series Bounds: Classical bounds do not account for low-rank truncation, and their second term \(2/\lambda_{n-p}\) does not vanish as \(\|E\| \to 0\).
  • vs. Schatten Norm Results: Existing Schatten-norm results do not provide spectral norm guarantees.
  • vs. Sherman–Morrison–Woodbury: Applies only to structured low-rank updates, not to general additive noise.

Rating

  • Novelty: ⭐⭐⭐⭐⭐ First non-asymptotic spectral norm bound for low-rank inverse perturbation, filling an important gap.
  • Experimental Thoroughness: ⭐⭐⭐⭐ Validated on diverse real-world and synthetic matrices with high tracking accuracy.
  • Writing Quality: ⭐⭐⭐⭐⭐ Rigorous and well-structured, with clear proof organization.
  • Value: ⭐⭐⭐⭐⭐ Foundational contribution to numerical computation and ML optimization.