
gan-gaussian

An implementation of a vanilla GAN that learns to generate 5 different Gaussian distributions.

Only four of the five distributions are learned by the GAN, not all of them. This failure is called mode collapse. WGAN, in contrast, does not collapse when the training data contains many distributions; see the WGAN repository.

Content:

  • Model Overview
  • Generator Architecture
  • Discriminator Architecture
  • Generator loss function
  • Discriminator loss function
  • How to use the code

Model Overview

A GAN consists of a generative model, called the Generator, and a discriminative model, called the Discriminator.


The Generator takes a random vector z as input and outputs new data. The Discriminator tries to distinguish the data generated by the Generator from the real data. Our goal is to train the Generator to generate fake data that looks like the real data, until the Discriminator can no longer distinguish the real data from the fake data. Each network tries to fool the other, as in a minimax game.


G tries to maximize the probability the Discriminator assigns to the fake data, while D tries to minimize the probability assigned to the fake data and maximize the probability assigned to the real data.
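
Written out, this is the standard GAN minimax objective:
\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}[\log D(x)] + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]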

Generator Architecture

It consists of an input layer of 2 neurons for the z vector, 3 hidden layers of 512 neurons each, and an output layer of 2 neurons. The activation functions of the 3 hidden layers are ReLUs, and the output layer is linear.

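A minimal TensorFlow 1.x sketch of a generator with this shape (the function name, variable scope, and use of tf.layers.dense are assumptions for illustration; the repository's own code may differ):
import tensorflow as tf

def generator(z, reuse=False):
    # z: [batch_size, 2] random input vector
    with tf.variable_scope("generator", reuse=reuse):
        h1 = tf.layers.dense(z, 512, activation=tf.nn.relu)   # hidden layer 1
        h2 = tf.layers.dense(h1, 512, activation=tf.nn.relu)  # hidden layer 2
        h3 = tf.layers.dense(h2, 512, activation=tf.nn.relu)  # hidden layer 3
        out = tf.layers.dense(h3, 2, activation=None)         # linear output: a 2-D sample
    return out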

Discriminator Architecture

It consists of an input layer of 2 neurons for the training data, 3 hidden layers of 512 neurons each with ReLU activation, and an output layer of 1 neuron with sigmoid activation.

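A matching TensorFlow 1.x sketch of the discriminator (same caveats as above). It also returns the pre-sigmoid logit, since the loss functions below expect logits:
def discriminator(x, reuse=False):
    # x: [batch_size, 2] real or generated samples
    with tf.variable_scope("discriminator", reuse=reuse):
        h1 = tf.layers.dense(x, 512, activation=tf.nn.relu)   # hidden layer 1
        h2 = tf.layers.dense(h1, 512, activation=tf.nn.relu)  # hidden layer 2
        h3 = tf.layers.dense(h2, 512, activation=tf.nn.relu)  # hidden layer 3
        logit = tf.layers.dense(h3, 1, activation=None)       # 1 output neuron (logit)
        prob = tf.nn.sigmoid(logit)                           # sigmoid probability
    return prob, logit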

Generator loss function

The Generator tries to maximize the probability that the Discriminator assigns to the generated data.

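In equation form, the generator loss minimized by the code below is the non-saturating form:
L_G = -\mathbb{E}_{z \sim p_z(z)}[\log D(G(z))]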

TensorFlow code of the generator loss function:
g_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=D_logit_fake, labels=tf.ones_like(D_logit_fake)))

Discriminator loss function

The Discriminator tries to minimize the probability assigned to the generated data and to maximize the probability assigned to the real data.

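In equation form, the discriminator loss minimized by the code below is:
L_D = -\mathbb{E}_{x \sim p_{\text{data}}(x)}[\log D(x)] - \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]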

TensorFlow code of the discriminator loss function:
D_loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=D_logit_real, labels=tf.ones_like(D_logit_real)))
D_loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=D_logit_fake, labels=tf.zeros_like(D_logit_fake)))
d_loss = D_loss_real + D_loss_fake
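
For context, a minimal sketch of how the two losses could be optimized alternately in TensorFlow 1.x (the optimizer, learning rate, scope names, batch size, and the sample_real_data helper are assumptions, not taken from the repository; X and Z are assumed placeholders for the real data and the noise, and d_loss / g_loss come from the snippets above):
import numpy as np

# variables of each network, assuming the scope names used in the sketches above
d_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="discriminator")
g_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="generator")

d_train = tf.train.AdamOptimizer(1e-4).minimize(d_loss, var_list=d_vars)
g_train = tf.train.AdamOptimizer(1e-4).minimize(g_loss, var_list=g_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(20000):
        x_batch = sample_real_data(64)                         # hypothetical helper sampling the training data
        z_batch = np.random.uniform(-1., 1., size=[64, 2])     # random z vectors
        sess.run(d_train, feed_dict={X: x_batch, Z: z_batch})  # one discriminator update
        sess.run(g_train, feed_dict={Z: z_batch})              # one generator update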

How to use the code:

Run python vanilla_gan.py in the console to train the model to generate the 5 Gaussian distributions. The results for each epoch are saved in the tf_vanilla_gan_results folder.
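
For reference, training data for this task is typically drawn from a mixture of 5 Gaussians. A small NumPy sketch of such a sampler (the circular arrangement, radius, and standard deviation are assumptions; the repository may use different values):
import numpy as np

def sample_real_data(batch_size, n_modes=5, radius=2.0, std=0.05):
    # place the 5 Gaussian means evenly on a circle (assumed layout)
    angles = 2 * np.pi * np.arange(n_modes) / n_modes
    means = radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    # pick one mode per sample and add Gaussian noise around its mean
    idx = np.random.randint(0, n_modes, size=batch_size)
    return means[idx] + std * np.random.randn(batch_size, 2)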
