# Advances on Metric Embeddings (FOCS 2022 Workshop)
Low-distortion metric embeddings provide a powerful algorithmic toolkit, with applications ranging from approximation/sublinear/online/distributed algorithms to machine learning, biology, and computer vision.
We say that an embedding $f$ from a metric space $(X,d_X)$ to a metric space $(Y,d_Y)$ has multiplicative distortion $t$ if there is a constant $c>0$ such that for every pair of points $u,v\in X$ it holds that $c\cdot d_X(u,v)\le d_Y(f(u),f(v))\le t\cdot c\cdot d_X(u,v)$.
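To make the definition concrete, here is a minimal Python sketch (ours, not part of the workshop material) that computes the multiplicative distortion of a given map between two finite metric spaces; the representation by distance matrices and the function name are illustrative assumptions.

```python
import numpy as np

def multiplicative_distortion(D_X, D_f):
    """Multiplicative distortion of a map f between finite metric spaces.

    D_X[i, j] = d_X(u_i, u_j) and D_f[i, j] = d_Y(f(u_i), f(u_j)).
    Returns the smallest t for which some rescaling c > 0 satisfies
    c * d_X(u, v) <= d_Y(f(u), f(v)) <= t * c * d_X(u, v) for all pairs.
    """
    iu = np.triu_indices(len(D_X), k=1)   # each distinct pair once
    ratios = D_f[iu] / D_X[iu]
    return ratios.max() / ratios.min()    # worst expansion over worst contraction

# Example: the uniform metric on 3 points mapped to the points 0, 1, 2 on the line.
D_X = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
D_f = np.array([[0., 1., 2.], [1., 0., 1.], [2., 1., 0.]])
print(multiplicative_distortion(D_X, D_f))  # 2.0
```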
Typical applications of metric embeddings have the following structure: take some instance of a problem in a "hard" metric space $(X,d_X)$; embed $X$ into a "simple" metric space $(Y,d_Y)$ via a low-distortion metric embedding $f$; solve the problem in $Y$, and "pull back" the solution to $X$. Thus, the objectives are low distortion and a "simple" target space.
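As a toy illustration of this pipeline (again ours, with an embedding chosen purely for simplicity and carrying no distortion guarantee), the sketch below solves a 1-median instance by embedding Euclidean points into the line via a coordinate projection, solving on the line, where a point with the median coordinate is optimal, and pulling the chosen point back to the original space.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.random((50, 5))        # the "hard" space X: 50 points in Euclidean space
f = P[:, 0]                    # embed into the line Y by projecting onto one coordinate

# Solve 1-median in Y: on the line, a point with the median coordinate is optimal.
center = np.argsort(f)[len(f) // 2]

# Pull back: reuse the chosen point as an (approximate) 1-median of X.
pulled_back_cost = np.linalg.norm(P - P[center], axis=1).sum()
opt_cost = min(np.linalg.norm(P - p, axis=1).sum() for p in P)
print(f"pulled-back cost {pulled_back_cost:.2f} vs optimum {opt_cost:.2f}")
```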
Two notable metric embedding workshops, one organized in Haifa, Israel in 2002 and another organized in Princeton in 2003, have had a lasting influence: they introduced metric embeddings to the broader theory community and provided a host of open problems that have served as excellent guidance for advancing the field much further.
Over the last two decades, significant progress has been made on many open problems put forth by the 2002-2003 workshops. At the same time, many new research directions in metric embeddings have emerged, several of which are highlighted in the talks below. These developments have led to new algorithm design techniques as well as a deeper understanding of a variety of metric spaces, while also opening up a host of challenging problems. The goal of our workshop is to bring the forefront of the area of metric embeddings to the broader theory community. By doing so, we hope that researchers in different areas can exploit the new ideas and techniques in metric embeddings developed over the past two decades to solve their own problems.
## Organizers
[Arnold Filtser](https://arnold.filtser.com/) (Bar-Ilan University)
[Hung Le](https://hunglvosu.github.io/) (University of Massachusetts Amherst)
## Location and Registration
The workshop will be at [FOCS 2022](https://focs2022.eecs.berkeley.edu/program.html) and split into two days, Oct 31 and Nov 01. Registration is done through [FOCS](https://focs2022.eecs.berkeley.edu/).
## Open questions and workshop dinner
Instead of holding an open question session, participants are asked to upload their open questions to an [Overleaf file](https://www.overleaf.com/6375432811frhnyyrttqcb). Open questions will be discussed informally throughout the FOCS conference and at an informal dinner (see details below). Feel free to continue updating the open problems file after the conference ends.
We plan to have an informal dinner at the end of the first day of the workshop (18:00, Oct 31) in order to socialize and discuss open problems. We plan to go to [Corinne Restaurant](https://www.corinnedenver.com/) (1455 California St, Denver, CO 80202), about a 6-minute walk from the FOCS venue (Hilton Denver City Center, Denver). If you are interested in joining us, please let us know by putting your name [here](https://docs.google.com/spreadsheets/d/1vgLPb0iTSjKnBemxmMxWDVjbhxenYJdBx8iiulvKMTQ/edit?usp=sharing) so that we can plan the table reservation.
## Talk Schedule
All times are in Mountain Time (local time in Denver).
### Session 1 (11:00-12:30, Oct 31, [Video](https://vimeo.com/user39621409/review/771380648/2f555592ba))
- 11:00-11:15 **[Arnold Filtser](https://arnold.filtser.com/) & [Hung Le](https://hunglvosu.github.io/)**
* Introduction
- 11:15-12:00 **[Arnold Filtser](https://arnold.filtser.com/)** (Bar-Ilan University)
* **Title**: *Metric Embeddings into Trees*
* **Abstract**: In low distortion metric embeddings, the goal is to embed a "hard" host metric space into a "simpler" target space while approximately preserving pairwise distances. A highly desirable target space is that of a tree metric. Unfortunately, such an embedding can result in huge distortion. A celebrated bypass to this problem is stochastic embedding with logarithmic expected distortion. Another bypass is Ramsey-type embedding, where the distortion guarantee applies only to a subset of the points. <br><br> In this talk we will discuss a novel third bypass called clan embedding. Here each point $x$ is mapped to a subset of points, called a clan, with a special chief point. The distortion guarantee is that for every pair of points $x,y$, the distance is approximately preserved between some vertex in the clan of $y$ and the chief of $x$. Note that the guarantee is worst-case, and the object of study is the trade-off between the distortion and the expected number of vertices in each clan. <br><br> The $h$-hop distance between a pair of vertices is the minimum weight of a path between them containing at most $h$ edges. While this distance measure is clearly not a metric space, surprisingly, it is still possible to embed it into trees with Ramsey-type and clan guarantees. We will discuss these embeddings and their applications. (A small sketch computing $h$-hop distances appears after this session's schedule.)
- 12:00-12:30 **[Nikhil Kumar](https://hpi.de/friedrich/people/nikhil-kumar.html)** (Hasso Plattner Institute)
* **Title**: *Planar Embedding Conjecture: A Survey*
* **Abstract**: Metric embedding techniques have proved to be of fundamental importance in designing algorithms for many combinatorial optimization problems. One of the fundamental open questions in this area is whether the shortest-path metric on a planar graph embeds into $\ell_1$ with $O(1)$ distortion. This is equivalent to such graphs having an $O(1)$-approximate multi-flow/min-cut theorem. We will survey the known results and techniques in this line of research and also discuss some open problems.
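The clan-embedding talk above mentions the $h$-hop distance; as a quick aside (our own sketch, not material from the talk), it can be computed with $h$ rounds of Bellman-Ford-style relaxation. The edge-list representation below is an illustrative assumption.

```python
import math

def h_hop_distances(n, edges, source, h):
    """Minimum weight of a path from `source` using at most h edges.
    `edges` lists undirected weighted edges (u, v, w) on vertices 0..n-1."""
    dist = [math.inf] * n
    dist[source] = 0.0
    for _ in range(h):                       # each round allows one more edge
        nxt = dist[:]
        for u, v, w in edges:
            nxt[v] = min(nxt[v], dist[u] + w)
            nxt[u] = min(nxt[u], dist[v] + w)
        dist = nxt
    return dist

# On the unit-weight path 0-1-2, vertex 2 is unreachable within a single hop.
print(h_hop_distances(3, [(0, 1, 1.0), (1, 2, 1.0)], source=0, h=1))  # [0.0, 1.0, inf]
```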
### Session 2 (16:00-17:30, Oct 31, [Video](https://vimeo.com/user39621409/review/771368740/bd762c9307))
- 16:00-16:45 **[Robert Krauthgamer](https://www.wisdom.weizmann.ac.il/~robi/)** (Weizmann Institute of Science)
* **Title**: Data vs Dimension Reduction in High-Dimensional Streams
* **Abstract**: Many streaming algorithms for high-dimensional data rely on metric embeddings to effectively "compress" the input. Assuming for concreteness that the input is a set of $n$ points in $R^d$, one popular approach, called dimension reduction, is to map the input points to low dimension $d'$ and solve the resulting low-dimensional instance. A second approach, called data reduction, replaces the $n$ input points with a small number $n'$ of points (perhaps with multiplicities), still in $R^d$. These methods reduce the overall size of the instance from $O(nd)$ to $O(nd')$ or $O(n'd)$. Yet another approach is to map the input into a tree metric, which distorts the distances significantly, but produces an instance that is easy to solve. (A toy numerical sketch of the dimension-reduction approach appears after this session's schedule.) <br><br> The talk will survey these three approaches and demonstrate their effectiveness (and limitations) for streaming algorithms that solve different clustering problems. The focus will be more on metric embeddings and related algorithmic techniques (and less on streaming machinery), with an eye towards future challenges. <br><br> Based on past and ongoing work with Yu Chen, Artur Czumaj, Shaofeng H.-C. Jiang, Pavel Vesely and Mingwei Yang.
- 16:45-17:30 **[Greg Bodwin](https://bodwin.engin.umich.edu/)** (University of Michigan)
* **Title**: *Reductions between Spanners, Preservers, Hopsets, and Friends*
* **Abstract**: Some of the foundational results in metric embedding and graph sketching can be phrased as an "extremal reduction" from one problem to another. A famous example is the reduction from the extremal size of spanners and distance oracles to the extremal size of high-girth graphs [Althöfer et al '93]. In the last 5-10 years, a research program has emerged in which the goal is to expand our network of extremal reductions, relating important questions in metric embedding to each other, and extracting basic combinatorial questions at the heart of the area. This has touched areas like light spanners, distance and reachability preservers, shortcut sets, hopsets, and more. We will survey recent progress on this program, discuss the implications of these results for the future of the area, and conclude with some open questions.
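To complement the dimension-reduction discussion in the first talk of this session, here is a minimal numerical sketch (ours, not from the talk) of a random Gaussian projection in the spirit of the Johnson-Lindenstrauss lemma; the sizes and target dimension are chosen for illustration rather than taken from the theoretical bounds.

```python
import numpy as np

def pairwise_dists(A):
    """All pairwise Euclidean distances between the rows of A."""
    sq = (A * A).sum(axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * A @ A.T
    return np.sqrt(np.maximum(D2, 0.0))

rng = np.random.default_rng(1)
n, d, d_new = 100, 500, 50                         # illustrative sizes only
X = rng.normal(size=(n, d))                        # n input points in R^d

G = rng.normal(size=(d, d_new)) / np.sqrt(d_new)   # scaled Gaussian projection
Y = X @ G                                          # the reduced instance

iu = np.triu_indices(n, k=1)
ratios = pairwise_dists(Y)[iu] / pairwise_dists(X)[iu]
print(f"pairwise distance ratios lie in [{ratios.min():.2f}, {ratios.max():.2f}]")
```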
### Session 3 (11:00-12:30, Nov 01, [Video](https://vimeo.com/user39621409/review/771347354/3fbf0824c1))
- 11:00-11:45 **[Yeshwanth Cherapanamjeri](https://yeshwanth94.github.io/)** (UC Berkeley)
* **Title**: *Towards Adaptive Metric Embeddings*
* **Abstract**: Dimensionality reduction techniques like the Johnson-Lindenstrauss lemma have been a mainstay of computational geometry with applications ranging from databases to machine learning. However, a fundamental limitation of these methods uncovered by recent work is their susceptibility to adaptively chosen inputs. In this talk, we will overview progress towards alleviating some of these drawbacks: we will cover the construction and an efficient algorithmic implementation of Terminal Embeddings, an adaptive variant of the Johnson-Lindenstrauss lemma, and finally conclude with open questions and directions for future work.
- 11:45-12:30 **[Erik Waingarten](https://sites.google.com/site/erikwaing/home)** (Stanford University)
* **Title**: *The Johnson-Lindenstrauss Lemma for Clustering and Subspace Approximation*
* **Abstract**: The Johnson-Lindenstrauss lemma says that for any set of vectors in a high-dimensional space, there exists an embedding into a much lower dimensional space which approximately preserves all pairwise distances. Here, we explore dimensionality reduction for other geometric optimization problems: given a geometric optimization problem (for example, $k$-means clustering or principal components analysis), is there always an embedding into a lower dimensional space which approximately preserves the cost of the optimization? In this talk, I will overview a few results in this space, and present a new technique for using coreset constructions in order to get improved dimensionality reduction. Joint work with Moses Charikar ([arXiv:2205.00371](https://arxiv.org/abs/2205.00371)). (A toy illustration of cost preservation under random projection appears right below.)
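As a toy illustration of the kind of cost preservation discussed in the last talk (our own sketch, not the paper's construction), the snippet below projects a point set with a scaled Gaussian map and compares the $k$-means cost of one fixed, arbitrary clustering before and after the projection.

```python
import numpy as np

def kmeans_cost(X, labels):
    """Sum of squared distances of points to their cluster centroids."""
    return sum(((X[labels == c] - X[labels == c].mean(axis=0)) ** 2).sum()
               for c in np.unique(labels))

rng = np.random.default_rng(2)
n, d, d_new, k = 300, 200, 20, 4                     # illustrative sizes only
X = rng.normal(size=(n, d)) + rng.integers(k, size=(n, 1)) * 3.0  # loosely clustered points
labels = rng.integers(k, size=n)                     # an arbitrary fixed clustering

G = rng.normal(size=(d, d_new)) / np.sqrt(d_new)     # scaled Gaussian projection
ratio = kmeans_cost(X @ G, labels) / kmeans_cost(X, labels)
print(f"cost after projection / cost before = {ratio:.3f}")
```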
### Session 4 (16:00-17:30, Nov 01, [Video](https://vimeo.com/user39621409/review/771333384/dfe1ed17aa))
- 16:00-16:45 **[Anastasios Sidiropoulos](https://sidiropo.people.uic.edu/)** (University of Illinois at Chicago)
* **Title**: *Quasimetric Embeddings and their Applications in Directed Graph Problems*
* **Abstract**: The talk will review some recent results that extend theorems from bi-Lipschitz metric embedding theory to the quasi-metric case. Quasi-metric spaces are distance spaces that need not satisfy symmetry (a tiny worked example, directed shortest-path distances, appears at the end of this session's schedule). The bi-Lipschitz quasi-metric embedding results give rise to new approximation algorithms for directed graph problems, such as Directed Sparsest-Cut. The talk will also discuss some open problems.
- 16:45-17:30 **[Hung Le](https://hunglvosu.github.io/)** (University of Massachusetts Amherst)
* **Title**: *Embedding Topological Graphs*
* **Abstract**: Traditional metric embedding focuses on finding an embedding with small multiplicative distortion. However, the search for this kind of embedding for topological graphs, such as planar graphs or minor-free graphs, has not been very fruitful. Indeed, most known results for multiplicative embeddings of planar graphs are negative. For example, any embedding into graphs with constant treewidth requires logarithmic (multiplicative) distortion. On the other hand, recent research has shown that this lower bound can be bypassed by relaxing the distortion to be additive. For example, one can embed any planar graph into a constant treewidth graph with a small additive distortion. In this talk, I will focus on finding embeddings of topological graphs with additive distortion and their algorithmic applications. I will also mention various open problems along the way.
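As a tiny worked example of the quasi-metrics mentioned in the first talk of this session (ours, not from the talk): shortest-path distances in a weighted directed graph satisfy the triangle inequality but need not be symmetric. The sketch below computes them with Floyd-Warshall and exhibits the asymmetry.

```python
import math

# Weighted directed graph on vertices 0, 1, 2: going "clockwise" is cheap,
# while going back requires the long way around.
n = 3
INF = math.inf
d = [[0, 1, INF],
     [INF, 0, 1],
     [1, INF, 0]]

# Floyd-Warshall: all-pairs shortest-path distances (a quasi-metric).
for k in range(n):
    for i in range(n):
        for j in range(n):
            d[i][j] = min(d[i][j], d[i][k] + d[k][j])

print(d[0][1], d[1][0])   # 1 vs 2: d(0,1) != d(1,0), so symmetry fails
```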
# Other Resources
- [Lecture notes on metric embeddings](https://kam.mff.cuni.cz/~matousek/ba-a4.pdf) by Jiří Matoušek 2013.
- [Open problems on embeddings of finite metric spaces](https://kam.mff.cuni.cz/~matousek/metrop.ps) from the 2002 and 2003 workshops.
- [Chapter](http://www.csun.edu/~ctoth/Handbook/chap8.pdf) in the Handbook of Discrete and Computational Geometry (2017): Low-distortion embeddings of finite metric spaces by Piotr Indyk, Jiří Matoušek, and Anastasios Sidiropoulos.
- Previous workshops:
- [Haifa workshop 2002](https://cs.haifa.ac.il/~yuri/DMSA02.html) on Discrete Metric Spaces and their Algorithmic Applications
- [Princeton (DIMACS) workshop 2003](http://archive.dimacs.rutgers.edu/Workshops/MetricSpaces/) on Discrete Metric Spaces and their Algorithmic Applications
- [Northwestern (IDEAL) workshop 2018](https://theory.cs.northwestern.edu/events/embeddings/) on Metric Embeddings and Dimensionality Reduction
- Courses:
- [Metric Embeddings and Algorithmic Applications](https://web.stanford.edu/class/cs369m/) by Moses Charikar 2018.
- [Metric Embeddings and Lipschitz Extensions](https://web.math.princeton.edu/~naor/homepage%20files/embeddings_extensions.pdf) notes of Assaf Naor lectures 2015.
- [Theory of Metric Embeddings](https://home.ttic.edu/~harry/teaching/teaching.html) by Harald Räcke 2006.
- [Metric Embeddings and Algorithmic Applications](http://timroughgarden.org/w06b/w06b.html) by Tim Roughgarden 2006.
- [Metric Embeddings](http://www.cs.toronto.edu/~avner/teaching/S6-2414/index.html) by Avner Magen 2006.
- [Algorithmic Applications of Metric Embeddings](https://www.cs.cmu.edu/~anupamg/metrics/) by Anupam Gupta and R. Ravi 2004.