From the course: AI Workshop: Hands-on with GANs with Deep Convolutional Networks

Setting up the GAN training loop

We are now ready to train our deep convolutional generative adversarial network. While training a GAN, the generator network and the discriminator network have to be trained together, and we'll train them alternately within the same iteration. Let's set up some parameters for training.

The loss function we'll use is binary cross-entropy loss, which essentially mimics the minimax loss used to train a generative adversarial network. We'll set up the loss functions so that the discriminator maximizes the probability of classifying real images as real and fake images as fake, whereas the generator maximizes the probability of having the discriminator classify its fake images as real.

I've also initialized a batch of 64 fixed noise latent vectors. Now, every so often during training, we'll look at some sample images to see how the generator is performing at that point. We'll use this fixed noise that I've set up here to…
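To make this setup concrete, here is a minimal sketch of the pieces described above, assuming PyTorch and DCGAN-style networks named netG and netD defined elsewhere; the latent size, learning rate, betas, and those names are illustrative assumptions, not values confirmed in the course.

    import torch
    import torch.nn as nn

    latent_size = 100  # assumed dimensionality of the generator's noise input
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Binary cross-entropy stands in for the minimax GAN objective.
    criterion = nn.BCELoss()

    # Label convention: the discriminator is pushed toward 1 on real images and
    # 0 on fakes; the generator is trained so the discriminator outputs 1 on its fakes.
    real_label = 1.0
    fake_label = 0.0

    # Separate optimizers let the two networks be updated alternately within the
    # same iteration (hyperparameters are common DCGAN defaults, assumed here;
    # netD and netG are hypothetical names for the networks built earlier).
    # optimizerD = torch.optim.Adam(netD.parameters(), lr=2e-4, betas=(0.5, 0.999))
    # optimizerG = torch.optim.Adam(netG.parameters(), lr=2e-4, betas=(0.5, 0.999))

    # A fixed batch of 64 latent noise vectors, sampled once; feeding the same
    # noise through the generator at intervals shows how its samples evolve.
    fixed_noise = torch.randn(64, latent_size, 1, 1, device=device)

Because fixed_noise is sampled only once, generating images from it at regular checkpoints gives a like-for-like view of the generator's progress over training.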
