
kam1107 / RealnessGAN

Code for ICLR2020 paper 'Real or Not Real, that is the Question'




This repository contains the code of the following paper:

Real or Not Real, that is the Question
Yuanbo Xiangli, Yubin Deng, Bo Dai*, Chen Change Loy, Dahua Lin
paper & talk

Abstract: While generative adversarial networks (GAN) have been widely adopted in various topics, in this paper we generalize the standard GAN to a new perspective by treating realness as a random variable that can be estimated from multiple angles. In this generalized framework, referred to as RealnessGAN, the discriminator outputs a distribution as the measure of realness. While RealnessGAN shares similar theoretical guarantees with the standard GAN, it provides more insights on adversarial learning. More importantly, compared to multiple baselines, RealnessGAN provides stronger guidance for the generator, achieving improvements on both synthetic and real-world datasets. Moreover, it enables the basic DCGAN architecture to generate realistic images at 1024×1024 resolution when trained from scratch.
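To make the core idea concrete, here is a minimal, hypothetical sketch of a "realness as a distribution" discriminator head: instead of one scalar, the discriminator emits logits over N realness outcomes, and is trained to match fixed anchor distributions for real and fake samples via a KL term. The number of outcomes, the anchor shapes, and the function names below are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn.functional as F

N = 51                                    # number of realness outcomes (an assumed choice)
outcomes = torch.linspace(-1.0, 1.0, N)   # realness values the outcomes represent

def make_anchor(skew):
    """Build a simple anchor distribution skewed toward +1 (real) or -1 (fake).
    The skewed-softmax construction here is a hypothetical example."""
    return F.softmax(skew * outcomes, dim=0)

A_real = make_anchor(4.0)    # anchor the discriminator should match on real data
A_fake = make_anchor(-4.0)   # anchor it should match on generated data

def realness_loss(d_logits, anchor_dist):
    """KL(anchor || D(x)) averaged over a batch of discriminator logits (B, N)."""
    log_q = F.log_softmax(d_logits, dim=1)
    return F.kl_div(log_q, anchor_dist.expand_as(log_q), reduction="batchmean")

# Stand-in discriminator outputs for illustration (a real D would produce these):
d_out = torch.randn(8, N)
loss_real = realness_loss(d_out, A_real)  # pushes D(x_real) toward the real anchor
loss_fake = realness_loss(d_out, A_fake)  # pushes D(G(z)) toward the fake anchor
```

The generator side would then be trained to move D(G(z)) away from the fake anchor (and/or toward the real one), mirroring the standard adversarial setup but over distributions rather than scalars.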


  • Watch the full demo video on YouTube


Experiments were conducted on two real-world datasets (CelebA and FFHQ) and a toy dataset (Mixture of Gaussians).


Requirements

  • Python 3.6
  • PyTorch 1.1.0
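A minimal environment setup might look like the following; the virtual-environment name is arbitrary, and the pinned version follows the list above.

```shell
# Hypothetical setup sketch; adapt the environment name and install method as needed.
python3 -m venv realnessgan-env
. realnessgan-env/bin/activate
pip install torch==1.1.0
```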

Pretrained Models


  • Either use the aforementioned datasets or prepare your own dataset.
  • Scripts to run experiments are stored in /scripts/*.sh.
  • Edit the folder locations in your script, and make sure the folders to store LOG and OUTPUT are created.
  • Run the script.
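The steps above might look like this in practice; the script name is hypothetical (check scripts/ for the actual file names), and the log/output folder names are whatever you configured in the script.

```shell
# Hypothetical invocation sketch; substitute the real script and folder names.
mkdir -p log output          # folders for LOG and OUTPUT, assuming these names
bash scripts/train_celeba.sh # hypothetical script name; pick the one for your dataset
```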


CelebA 256x256 (FID = 23.51)

FFHQ 1024x1024 (FID = 17.18)


Citation

@inproceedings{xiangli2020real,
  title={Real or Not Real, that is the Question},
  author={Xiangli, Yuanbo and Deng, Yubin and Dai, Bo and Loy, Chen Change and Lin, Dahua},
  booktitle={International Conference on Learning Representations},
  year={2020}
}
