Alexander Korotin
Publications
A Statistical Learning Perspective on Semi-dual Adversarial Neural Optimal Transport Solvers
We provide theoretical generalization error bounds for neural-network-based Optimal Transport (OT) solvers that use minimax approaches, focusing on quadratic OT. These bounds depend on standard properties of neural networks, offering a foundation for extending the analysis to broader OT formulations.
Roman Tarasov
,
Petr Mokrov
,
Milena Gazdieva
,
Evgeny Burnaev
,
Alexander Korotin
PDF
Cite
OpenReview
Diffusion and Adversarial Schrödinger Bridges via Iterative Proportional Markovian Fitting
We discover that the existing heuristic bi-directional implementation of the Iterative Markovian Fitting (IMF) procedure for the Schrödinger Bridge problem implicitly relies on the well-known Iterative Proportional Fitting (IPF) procedure. We analyze this crossover of the two procedures both theoretically and empirically.
Sergei Kholkin
,
Grigory Ksenofontov
,
David Li
,
Nikita Kornilov
,
Nikita Gushchin
,
Alexandra Suvorikova
,
Alexey Kroshnin
,
Evgeny Burnaev
,
Alexander Korotin
PDF
Cite
OpenReview
Entering the Era of Discrete Diffusion Models: A Benchmark for Schrödinger Bridges and Entropic Optimal Transport
This paper introduces a benchmark for the Schrödinger bridge problem on discrete spaces, providing pairs of distributions with known solutions to enable rigorous evaluation of solvers, and also presents new algorithms as a byproduct.
Xavier Aramayo Carrasco
,
Grigory Ksenofontov
,
Aleksei Leonov
,
Iaroslav Koshelev
,
Alexander Korotin
PDF
Cite
OpenReview
InfoBridge: Mutual Information estimation via Bridge Matching
We introduce a novel mutual information (MI) estimator leveraging diffusion bridge models, which provides unbiased estimates for challenging data and demonstrates strong performance on standard MI estimation benchmarks.
Sergei Kholkin
,
Ivan Butakov
,
Evgeny Burnaev
,
Nikita Gushchin
,
Alexander Korotin
PDF
Cite
OpenReview
Interaction Field Matching: Overcoming Limitations of Electrostatic Models
Interaction Field Matching (IFM) generalizes Electrostatic Field Matching (EFM) by employing general interaction fields, including a quark-inspired solution to the modeling challenges of EFM.
Manukhov Stepan
,
Alexander Kolesov
,
Vladimir Palyulin
,
Alexander Korotin
PDF
Cite
OpenReview
Learning of Population Dynamics: Inverse Optimization Meets JKO Scheme
iJKOnet combines the JKO scheme with inverse optimization to learn population dynamics without restrictive architectural constraints, achieving better performance than previous JKO-based methods.
Mikhail Persiianov
,
Jiawei Chen
,
Petr Mokrov
,
Alexander Tyurin
,
Evgeny Burnaev
,
Alexander Korotin
PDF
Cite
OpenReview
Universal Inverse Distillation for Matching Models with Real-Data Supervision (No GANs)
RealUID is a universal distillation framework that accelerates all matching models by incorporating real data into the distillation process without the need for GANs.
Nikita Kornilov
,
David Li
,
Tikhon Mavrin
,
Aleksei Leonov
,
Nikita Gushchin
,
Evgeny Burnaev
,
Iaroslav Koshelev
,
Alexander Korotin
PDF
Cite
OpenReview
Categorical Schrödinger Bridge Matching
The paper develops a theoretical and algorithmic foundation for solving the Schrödinger Bridge problem in discrete spaces using Iterative Markovian Fitting, introducing the CSBM algorithm and validating it with experiments on synthetic data and vector-quantized images.
Grigory Ksenofontov
,
Alexander Korotin
PDF
Cite
OpenReview
Field Matching: an Electrostatic Paradigm to Generate and Transfer Data
We propose Electrostatic Field Matching (EFM), a novel method inspired by the physics of capacitors, which uses a neural network to learn electrostatic fields for generative modeling and distribution transfer.
Alexander Kolesov
,
Manukhov Stepan
,
Vladimir Palyulin
,
Alexander Korotin
PDF
Cite
OpenReview
Inverse Bridge Matching Distillation
We introduce a novel distillation technique for diffusion bridge models (DBMs) that significantly accelerates inference (4x to 100x) and improves generation quality by leveraging inverse bridge matching.
Nikita Gushchin
,
David Li
,
Daniil Selikhanovych
,
Evgeny Burnaev
,
Dmitry Baranchuk
,
Alexander Korotin
PDF
Cite
OpenReview