Overclocking Electrostatic Generative Models

Abstract

Electrostatic generative models such as PFGM have recently emerged as a powerful framework, achieving state-of-the-art performance in image synthesis. PFGM operates in an extended data space with auxiliary dimensionality D, recovering the diffusion model framework as D tends to infinity while yielding superior empirical results for finite D. Like diffusion models, PFGM relies on expensive ODE simulations to generate samples, making it computationally costly. To address this, we propose Inverse Poisson Flow Matching (IPFM), a novel distillation framework that accelerates electrostatic generative models across all values of D. IPFM reformulates distillation as an inverse problem: learning a generator whose induced electrostatic field matches that of the teacher. We derive a tractable training objective for this problem and show that, as D tends to infinity, IPFM closely recovers Score Identity Distillation (SiD), a recent method for distilling diffusion models. Empirically, IPFM produces distilled generators that achieve near-teacher or even superior sample quality using only a few function evaluations. Moreover, we observe that distillation converges faster for finite D than in the infinite (diffusion) limit, which is consistent with prior findings that finite-D PFGM models exhibit more favorable optimization and sampling properties.
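To make the field-matching idea concrete, below is a minimal PyTorch sketch of one distillation step under stated assumptions. The names `teacher_field`, `fake_field`, and `generator` are hypothetical, and the plain L2 field-matching loss stands in for the paper's derived tractable objective; this is an illustration of the general recipe, not the paper's implementation.

```python
# Illustrative sketch of field-matching distillation (not the paper's code).
# Assumptions: `teacher_field(x, r)` is the frozen pre-trained PFGM/PFGM++ field
# network; `fake_field` is an auxiliary network estimating the field induced by the
# generator's own samples (analogous to the fake score network in SiD);
# `generator(z)` maps latent noise to images.
import torch

def perturb(y, r, D):
    """Perturb clean samples y along the PFGM++ augmentation radius r.

    The exact finite-D kernel p_r(x | y) ∝ (||x - y||^2 + r^2)^{-(N+D)/2} has a
    heavier-tailed sampler; here we use the large-D Gaussian approximation
    x = y + (r / sqrt(D)) * eps as a simple stand-in.
    """
    return y + (r / D ** 0.5) * torch.randn_like(y)

def ipfm_step(generator, fake_field, teacher_field, opt_g, opt_f, z, r, D):
    # (1) Fit the auxiliary network to the field induced by current generator
    #     samples, via a denoising-style regression toward the perturbation
    #     direction (a stand-in target, correct up to normalization).
    fake_field.requires_grad_(True)
    with torch.no_grad():
        x_fake = generator(z)
    x_r = perturb(x_fake, r, D)
    target = x_r - x_fake
    loss_f = ((fake_field(x_r, r) - target) ** 2).mean()
    opt_f.zero_grad(); loss_f.backward(); opt_f.step()

    # (2) Update the generator so that its induced field (as estimated by
    #     fake_field) matches the teacher's field at its perturbed samples.
    fake_field.requires_grad_(False)  # gradients flow to the generator only
    x_r = perturb(generator(z), r, D)
    loss_g = ((teacher_field(x_r, r) - fake_field(x_r, r)) ** 2).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_f.item(), loss_g.item()
```

In this reading, the abstract's D-tends-to-infinity claim corresponds to the perturbation kernel becoming Gaussian, at which point the field regression reduces to score estimation and the step above resembles a SiD-style update.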

Publication
A pre-print
Alexander Korotin
Researcher

My research interests include generative modeling, diffusion models, unpaired learning, optimal transport and Schrödinger bridges.