Metric embeddings are a widely used method in algorithm design, where generally a ``complex'' metric is embedded into a simpler, lower-dimensional one. Historically, the theoretical computer science community has focused on bi-Lipschitz embeddings, which guarantee that every pairwise distance is approximately preserved. In contrast, alternative embedding objectives that are commonly used in practice avoid bi-Lipschitz distortion; yet these approaches have received comparatively little theoretical study. In this paper, we focus on Multi-dimensional Scaling (MDS), where we are given a set of non-negative dissimilarities $\{d_{i,j}\}_{i,j\in [n]}$ over $n$ points, and the goal is to find an embedding $\{x_1,\dots,x_n\} \subset \mathbb{R}^k$ that minimizes $$\textrm{OPT}=\min_{x}\mathbb{E}_{i,j\in [n]}\left(1-\frac{\|x_i - x_j\|}{d_{i,j}}\right)^2.$$ Despite its popularity, our theoretical understanding of MDS is extremely limited. Recently, Demaine et al. (arXiv:2109.11505) gave the first approximation algorithm with provable guarantees for this objective, which achieves an embedding in constant-dimensional Euclidean space with cost $\textrm{OPT} +\epsilon$ in $n^2\cdot 2^{\textrm{poly}(\Delta/\epsilon)}$ time, where $\Delta$ is the aspect ratio of the input dissimilarities. For metrics that admit low-cost embeddings, $\Delta$ scales polynomially in $n$. In this work, we give the first approximation algorithm for MDS with quasi-polynomial dependency on $\Delta$: for constant-dimensional Euclidean space, we achieve a solution with cost $O(\log \Delta)\cdot \textrm{OPT}^{\Omega(1)}+\epsilon$ in time $n^{O(1)} \cdot 2^{\textrm{poly}(\log(\Delta)/\epsilon)}$. Our algorithms are based on a novel geometry-aware analysis of a conditional rounding of the Sherali-Adams LP hierarchy, allowing us to avoid the exponential dependency on the aspect ratio that would typically result from this rounding.
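The MDS objective above averages a squared relative error over pairs of points. As a minimal sketch of evaluating this cost for a candidate embedding (not the paper's algorithm), the following Python snippet computes the average over distinct pairs $i \neq j$, since $d_{i,i}=0$ would make the ratio undefined; the function name `mds_cost` and the averaging convention are assumptions for illustration:

```python
import numpy as np

def mds_cost(x, d):
    """Average squared relative error of an embedding x against
    dissimilarities d, taken over distinct pairs i != j.

    x: (n, k) array of embedded points.
    d: (n, n) array of positive dissimilarities (off-diagonal only is used).
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]       # all pairwise differences
    dist = np.linalg.norm(diff, axis=2)        # ||x_i - x_j||
    i, j = np.triu_indices(n, k=1)             # distinct unordered pairs
    err = (1.0 - dist[i, j] / d[i, j]) ** 2
    return err.mean()

# Example: three collinear points whose dissimilarities are exactly
# their line distances admit a perfect 1-D embedding with cost 0.
x = np.array([[0.0], [1.0], [3.0]])
d = np.array([[0.0, 1.0, 3.0],
              [1.0, 0.0, 2.0],
              [3.0, 2.0, 0.0]])
print(mds_cost(x, d))  # 0.0
```

Any perturbation of the embedded points away from the exact distances makes the cost strictly positive, which is the quantity the paper's algorithm approximately minimizes.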
Metric Embeddings Beyond Bi-Lipschitz Distortion via Sherali-Adams
Ainesh Bakshi, Vincent Cohen-Addad, Rajesh Jayaram, Sam Hopkins, Silvio Lattanzi
Published 2023 in the Annual Conference on Computational Learning Theory
PUBLICATION RECORD
- Publication year: 2023
- Publication date: 2023-11-29
- Venue: Annual Conference on Computational Learning Theory
- Fields of study: Mathematics, Computer Science
- Source metadata: Semantic Scholar