2. Good Semi-supervised Learning That Requires a Bad GAN
Feature Matching (FM) GAN
Z. Dai, Z. Yang, F. Yang, W. W. Cohen, and R. Salakhutdinov. Good semi-supervised learning that requires a bad GAN. In NIPS, 2017.
T. Salimans, I. Goodfellow, W. Zaremba, V. Cheung, A. Radford, and X. Chen. Improved techniques for training GANs. In NIPS, 2016.
Match the first-order feature statistics between the generator distribution and the true data distribution, where f(x) denotes the activations of an intermediate layer of the discriminator.
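The feature-matching objective above can be sketched as follows; the function name and the arrays `f_real` and `f_fake` (standing in for discriminator features of a real and a generated batch) are illustrative, not the authors' code:

```python
import numpy as np

def feature_matching_loss(f_real, f_fake):
    """Feature matching: squared L2 distance between the mean intermediate
    discriminator features of a real batch and a generated batch,
    || E[f(x)] - E[f(G(z))] ||^2."""
    return float(np.sum((f_real.mean(axis=0) - f_fake.mean(axis=0)) ** 2))

# Identical batches give zero loss; a constant feature shift gives a positive loss.
f_real = np.ones((8, 4))
print(feature_matching_loss(f_real, f_real))        # 0.0
print(feature_matching_loss(f_real, f_real + 1.0))  # 4.0
```

Only the batch means enter the loss, which is what makes this a first-order (moment-matching) criterion rather than a full distribution match.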
3. Good Semi-supervised Learning That Requires a Bad GAN
Semi-supervised Learning
Instead of binary classification, the discriminator employs a (K + 1)-class objective: true samples are classified into the first K classes, and generated samples are assigned to the (K + 1)-th class.
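The (K + 1)-class objective can be sketched as follows; the helper names are my own, and the last logit column plays the role of the generated class:

```python
import numpy as np

def log_softmax(logits):
    """Numerically stable log-softmax over the last axis."""
    z = logits - logits.max(axis=1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=1, keepdims=True))

def kplus1_losses(logits_lab, y_lab, logits_unl, logits_gen):
    """Sketch of the (K + 1)-class objective: each logits array has K + 1
    columns, and column K (the last) stands for the 'generated' class."""
    K = logits_lab.shape[1] - 1
    # Supervised term: cross-entropy over the K real classes only.
    ls = log_softmax(logits_lab[:, :K])
    sup = -ls[np.arange(len(y_lab)), y_lab].mean()
    # Unsupervised term: real samples should fall in the first K classes,
    # generated samples in class K + 1.
    p_real = np.exp(log_softmax(logits_unl))[:, :K].sum(axis=1)  # P(y <= K | x)
    p_fake = np.exp(log_softmax(logits_gen))[:, K]               # P(y = K+1 | x)
    unsup = -np.log(p_real).mean() - np.log(p_fake).mean()
    return sup, unsup

sup, unsup = kplus1_losses(
    np.array([[5.0, 0.0, 0.0]]), np.array([0]),  # confident, correct labeled sample
    np.array([[5.0, 5.0, 0.0]]),                 # unlabeled sample judged "real"
    np.array([[0.0, 0.0, 5.0]]))                 # generated sample judged "fake"
```

With confident, correct predictions, as in the toy batch above, both loss terms are close to zero; mislabeling or misjudging a sample makes the corresponding term large.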
4. Motivation
(1) It is unclear why this discriminator formulation improves performance when combined with a generator.
(2) Empirically, good semi-supervised learning and a good generator seem impossible to obtain at the same time.
5. Motivation
The model generated better images but failed to improve semi-supervised performance.
6. Motivation
Good semi-supervised learning requires a “bad” generator.
“Bad” here means that the generator distribution should not match the true data distribution.
7. Theoretical Analysis — Perfect Generator
[Figure: decision boundary with labeled data, unlabeled data, and generated data]
9. Theoretical Analysis — Perfect Generator
When the generator is perfect, the (K + 1)-class objective by itself is not able to improve the generalization performance.
10. Theoretical Analysis — Complement Generator
M. Belkin, P. Niyogi, and V. Sindhwani. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research, 7:2399–2434, 2006.
Under the low-density boundary assumption, the discriminator places class boundaries in low-density areas.
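A toy illustration of a complement distribution (my own simplification, not the paper's generator): rejection-sample points whose kernel density estimate under the data falls below a threshold, so samples land only in low-density regions. All names here (`kde_density`, `complement_sample`, `eps`) are illustrative assumptions:

```python
import numpy as np

def kde_density(x, data, bw=0.5):
    """Simple 1-D Gaussian kernel density estimate of the data at points x."""
    d = (x[:, None] - data[None, :]) / bw
    return np.exp(-0.5 * d ** 2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))

def complement_sample(data, n, eps, lo=-6.0, hi=6.0, seed=0):
    """Rejection-sample n points whose estimated density is below eps,
    i.e. points from the low-density 'complement' of the data distribution."""
    rng = np.random.default_rng(seed)
    out = np.empty(0)
    while out.size < n:
        cand = rng.uniform(lo, hi, size=256)
        out = np.concatenate([out, cand[kde_density(cand, data) < eps]])
    return out[:n]

# Two well-separated clusters: complement samples avoid both clusters,
# so a discriminator pushed away from them ends up with its boundary
# in the low-density gap.
data = np.concatenate([np.random.default_rng(1).normal(-3, 0.3, 200),
                       np.random.default_rng(2).normal(3, 0.3, 200)])
samples = complement_sample(data, 100, eps=0.05)
```

The point of the toy is the support, not the realism: every sample sits where the data density is low, which is exactly where generated samples are useful for shaping the boundary.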
11. Theoretical Analysis
Perfect Generator: a perfect generator is not able to improve generalization performance, so good semi-supervised learning indeed requires a bad generator.
Complement Generator: the generated complement samples encourage the discriminator to place the class boundaries in low-density areas.
13. Meta-training the hallucinator
Train classifier
Train generator
Low-shot Learning from Imaginary Data
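The train-classifier / train-generator idea above can be sketched with a toy (my own drastic simplification, not the paper's architecture): the hallucinator is a single noise-scale parameter, the classifier is nearest-centroid, and the hallucinator is "meta-trained" by picking the parameter that minimizes the classifier's held-out error:

```python
import numpy as np

rng = np.random.default_rng(0)

# One labeled seed example per class (low-shot setting).
seeds = np.array([-1.0, 1.0])              # class 0 near -1, class 1 near +1
val_x = rng.normal(0.0, 1.5, 200)          # held-out meta-training set
val_y = (val_x > 0).astype(int)

def hallucinate(seed, sigma, n=20):
    """Toy hallucinator: jitter the seed with learned-scale noise
    (stand-in for a deep generator G(x, z))."""
    return seed + sigma * rng.normal(size=n)

def classifier_error(sigma):
    """'Train classifier' step: nearest-centroid on seeds + hallucinations,
    scored on the held-out meta set."""
    cents = [np.mean(np.append(hallucinate(s, sigma), s)) for s in seeds]
    pred = (np.abs(val_x - cents[1]) < np.abs(val_x - cents[0])).astype(int)
    return float((pred != val_y).mean())

# 'Train generator' step: choose the hallucinator parameter by the
# classifier's error, not by how realistic the hallucinations look.
grid = [0.1, 0.5, 1.0, 2.0]
best_sigma = min(grid, key=classifier_error)
```

The grid search stands in for backpropagating the classifier's loss through the hallucinator; the essential point is that the generator's training signal is classification performance.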
14. Low-shot Learning from Imaginary Data
This goal differs from realism: realistic examples might still fail to capture the many modes of variation of visual concepts, while unrealistic hallucinations can still lead to a good decision boundary.