Lightweight Transformer for Image Interpolation Via Unrolling of Multiple Learned Graph Laplacian Regularizers

Tam Thuc Do, Parham Eftekhar, Gene Cheung

Published in the 2025 IEEE International Conference on Image Processing Workshops (ICIPW)

ABSTRACT

We build an interpretable and lightweight transformer-like neural net by unrolling an iterative algorithm that minimizes multiple realizations of the quadratic graph Laplacian regularizer (GLR), subject to an interpolation constraint. The pivotal insight is that a normalized signal-dependent graph learning module amounts to a variation of the self-attention mechanism in conventional transformers. Unlike "black-box" transformers that require learning of large key, query, and value matrices to compute transformed dot products as affinities and output embeddings, we employ shallow CNNs to learn low-dimensional features per pixel to establish pairwise Mahalanobis distances and construct sparse similarity graphs. At each layer, given a learned graph, the target interpolated signal is simply a low-pass filtered output derived from the minimization of GLRs, resulting in a steep reduction in parameter count. Image interpolation experiments demonstrate competitive restoration performance and notable parameter reduction compared to mainstream transformers.
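The core computation the abstract describes can be illustrated with a minimal sketch: build a sparse similarity graph whose edge weights come from Mahalanobis distances between per-pixel features, then solve the GLR minimization $\min_x x^\top L x$ subject to the observed pixels being fixed. The function name, dense-matrix formulation, and hand-supplied features below are illustrative assumptions; in the paper the features come from shallow CNNs, the graphs are sparse, and the solve is unrolled into network layers.

```python
import numpy as np

def glr_interpolate(feats, M, mask, y, edges):
    """Illustrative sketch of GLR-based interpolation (not the paper's code).
    feats: (N, d) per-pixel features; M: (d, d) PSD Mahalanobis metric;
    mask: (N,) bool, True where a pixel is observed; y: (N,) observed values;
    edges: list of (i, j) pairs defining the sparse similarity graph.
    Minimizes x^T L x subject to x[mask] = y[mask]."""
    N = feats.shape[0]
    W = np.zeros((N, N))
    for i, j in edges:
        d = feats[i] - feats[j]
        W[i, j] = W[j, i] = np.exp(-d @ M @ d)   # Mahalanobis-distance edge weight
    L = np.diag(W.sum(axis=1)) - W               # combinatorial graph Laplacian
    u = ~mask                                    # unknown pixels
    # First-order optimality of the quadratic: L_uu x_u = -L_uo x_o
    x = y.astype(float).copy()
    x[u] = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, mask)] @ y[mask])
    return x
```

On a 3-node path graph with identical features (all weights equal) and endpoints observed at 0 and 2, the solve recovers the linear interpolant, with the unknown middle pixel set to 1. The closed-form linear solve above is what the unrolled layers approximate with low-pass graph filtering.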
