This document discusses the connections between generative adversarial networks (GANs) and energy-based models (EBMs). It shows that GAN training can be interpreted as approximate maximum-likelihood training of an EBM, where intractable sampling from the model distribution is replaced by samples from a generator. Specifically:
1. GANs train a discriminator that estimates the energy function of an EBM, while the generator minimizes the energy of its own samples.
2. EBM training can be seen as alternately updating the energy function and the generator that supplies its negative samples, in a manner similar to contrastive divergence.
3. This perspective unifies GANs and EBMs, and suggests ways to combine their training procedures to leverage their respective advantages.
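The interplay in points 1 and 2 can be sketched with a toy 1-D example. This is a hypothetical NumPy illustration, not code from the papers: a quadratic energy stands in for the discriminator, a shift parameter stands in for the generator, and generator samples serve as the negative phase of the energy update.

```python
import numpy as np

# Toy sketch (hypothetical) of the EBM view of GAN training:
# the discriminator is an energy function E_theta(x), trained with data as
# the positive phase and generator samples as the negative phase (a
# contrastive-divergence-like surrogate); the generator then minimizes the
# energy of its own samples.

rng = np.random.default_rng(0)

theta = 0.0   # energy parameter: E_theta(x) = 0.5 * (x - theta)**2
phi = -1.0    # generator parameter: g_phi(z) = phi + 0.1 * z
lr = 0.1

for _ in range(300):
    data = rng.normal(2.0, 0.5, size=256)   # "real" samples
    gen = phi + 0.1 * rng.normal(size=256)  # generator samples

    # Energy (discriminator) step: descend mean E(data) - mean E(gen).
    # Since dE/dtheta = -(x - theta), the gradient reduces to
    # gen.mean() - data.mean().
    theta -= lr * (gen.mean() - data.mean())

    # Generator step: descend mean E(gen); dE/dphi = (gen - theta).
    phi -= lr * (gen - theta).mean()

# phi drifts toward the data mean (around 2.0), since the generator chases
# the low-energy region while the energy's minimum tracks the data.
```

The two updates mirror the alternation in point 2: the energy function is pulled down on data and up on generator samples, and the generator follows the energy downhill.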
2. Overview
This note summarizes the relationship between GANs and EBMs, based on the following three papers:
Deep Directed Generative Models with Energy-Based Probability Estimation
https://arxiv.org/abs/1606.03439
Maximum Entropy Generators for Energy-Based Models
https://arxiv.org/abs/1901.08508
Your GAN is Secretly an Energy-based Model and You Should Use Discriminator Driven Latent Sampling
https://arxiv.org/abs/2003.06060
21. Taesup Kim, Yoshua Bengio (Université de Montréal)
Deep Directed Generative Models with Energy-Based Probability Estimation
https://arxiv.org/abs/1606.03439
37. Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio (Université de Montréal, Google Brain)
Your GAN is Secretly an Energy-based Model and You Should Use Discriminator Driven Latent Sampling
https://arxiv.org/abs/2003.06060