Uncovering Challenges of Solving the Continuous Gromov-Wasserstein Problem

Abstract

Learning conditional distributions is a central problem in machine learning, typically approached via supervised methods with paired data (x,y). However, acquiring paired samples is often challenging, especially in problems such as domain translation. This necessitates semi-supervised models that exploit both limited paired data and additional unpaired i.i.d. samples x and y from the marginal distributions. Using such combined data is non-trivial and often relies on heuristics. To tackle this issue, we propose a new learning paradigm that seamlessly integrates paired and unpaired data through data likelihood maximization. We show that our approach connects intriguingly with inverse entropic optimal transport (OT), which allows us to apply recent advances in computational OT to develop a lightweight algorithm for learning the conditional distributions of y given x. Furthermore, we demonstrate empirically that our method learns conditional distributions effectively from paired and unpaired data simultaneously.
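The abstract's key object is the conditional distribution of y given x obtained from an entropic OT plan. The following toy sketch (not the paper's algorithm; all names and parameters are illustrative assumptions) shows the discrete analogue: computing an entropic OT plan between two empirical marginals with Sinkhorn iterations, then normalizing its rows to read off conditional distributions.

```python
import numpy as np

# Toy illustration only: entropic OT between two 1-D empirical marginals
# via Sinkhorn iterations. The entropic plan pi is dense, so normalizing
# its rows yields discrete conditional distributions pi(y | x_i) -- the
# kind of object the paper aims to learn in the continuous setting.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=50)          # samples from the marginal of x
y = rng.normal(2.0, 1.0, size=60)          # samples from the marginal of y

C = (x[:, None] - y[None, :]) ** 2         # quadratic transport cost matrix
eps = 0.5                                  # entropic regularization strength
K = np.exp(-C / eps)                       # Gibbs kernel

a = np.full(len(x), 1.0 / len(x))          # uniform weights on x-samples
b = np.full(len(y), 1.0 / len(y))          # uniform weights on y-samples
u, v = np.ones_like(a), np.ones_like(b)
for _ in range(500):                       # Sinkhorn fixed-point updates
    u = a / (K @ v)
    v = b / (K.T @ u)

pi = u[:, None] * K * v[None, :]           # entropic OT plan
cond = pi / pi.sum(axis=1, keepdims=True)  # row i approximates pi(y | x_i)
```

Each row of `cond` is a proper probability vector over the y-samples, so sampling y given a particular x reduces to sampling from that row.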

Publication
A pre-print
Alexander Korotin
Assistant professor, senior research scientist