Technical Development - Triple Distilled

At the heart of a functional GAN are two artificial neural networks able to compete on roughly equal terms. If you are not careful, you can construct a very unfair game: the generator will often find it impossible to keep up with an advanced discriminator, especially on high-resolution images. The signs of this are clear in the logged metrics and in the art snapshots generated along the way.
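
The tell-tale pattern in the logs is the discriminator's loss collapsing towards zero while the generator's loss keeps climbing. Below is a minimal sketch of how that check might look; the loss lists, window size and thresholds are illustrative assumptions rather than values from any particular framework or run.

```python
# Minimal sketch (assumed names and thresholds) for spotting an overpowered
# discriminator in logged metrics: its loss collapses towards zero while the
# generator's loss keeps climbing.
def discriminator_is_winning(d_losses, g_losses, window=100,
                             d_floor=0.05, g_ceiling=5.0):
    """Return True if the recent losses suggest the game has become unfair."""
    recent_d = sum(d_losses[-window:]) / min(window, len(d_losses))
    recent_g = sum(g_losses[-window:]) / min(window, len(g_losses))
    return recent_d < d_floor and recent_g > g_ceiling
```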

To overcome this common problem, a staged or progressive approach to training is recommended. I allow the generator to develop and harden before it is exposed to a Grand Master discriminator, interrupting training between stages to tweak hyper-parameters and the training sets. I found three stages worked well enough, giving the generator some assistance without too much disruption. I also like the number three, and it lets me say that my models are triple distilled, like fine whiskey.
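
As a rough sketch of how such a three-stage schedule could be wired up: the stage names, resolutions, learning rates and epoch counts below are illustrative assumptions, and train_stage is a stand-in for a real adversarial training loop rather than my actual code.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    image_size: int   # resolution the networks see in this stage
    d_lr: float       # discriminator learning rate (held back early on)
    g_lr: float       # generator learning rate
    epochs: int

# Assumed settings: start small and gentle, finish large and fully adversarial.
STAGES = [
    Stage("warm-up",      image_size=64,  d_lr=1e-4, g_lr=2e-4, epochs=20),
    Stage("development",  image_size=128, d_lr=2e-4, g_lr=2e-4, epochs=40),
    Stage("grand master", image_size=256, d_lr=2e-4, g_lr=2e-4, epochs=60),
]

def train_stage(stage: Stage) -> None:
    # Placeholder: rebuild the data loaders at stage.image_size, reload the
    # previous checkpoint, then run the usual generator/discriminator loop.
    print(f"stage '{stage.name}': {stage.epochs} epochs at {stage.image_size}px "
          f"(d_lr={stage.d_lr}, g_lr={stage.g_lr})")

for stage in STAGES:
    # Training is paused between stages to tweak hyper-parameters and the
    # training set before the generator faces a tougher discriminator.
    train_stage(stage)
```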

Artwork For Sale