How does the Wasserstein distance improve the stability and quality of GAN training compared to traditional divergence measures like Kullback-Leibler (KL) divergence and Jensen-Shannon (JS) divergence?
Tuesday, 11 June 2024
by EITCA Academy
Generative Adversarial Networks (GANs) have revolutionized generative modeling by enabling the creation of highly realistic synthetic data. However, training GANs is notoriously difficult, primarily because of instability and convergence problems. Traditional divergence measures such as the Kullback-Leibler (KL) divergence and the Jensen-Shannon (JS) divergence have commonly been used to guide the training process.
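For reference, the quantities named above can be written in their standard forms. The sketch below uses the usual WGAN-paper notation, with P_r for the data distribution and P_g for the generator's distribution; these symbols are introduced here for illustration and do not come from the article itself.

```latex
% Standard definitions, with P_r the data distribution and P_g the
% generator's distribution (notation introduced here for illustration).
\[
\mathrm{KL}(P_r \,\|\, P_g) \;=\; \int p_r(x)\,\log\frac{p_r(x)}{p_g(x)}\,dx
\]
\[
\mathrm{JS}(P_r, P_g) \;=\; \tfrac{1}{2}\,\mathrm{KL}\!\big(P_r \,\big\|\, M\big)
  \;+\; \tfrac{1}{2}\,\mathrm{KL}\!\big(P_g \,\big\|\, M\big),
  \qquad M = \tfrac{1}{2}\,(P_r + P_g)
\]
\[
W_1(P_r, P_g) \;=\; \inf_{\gamma \in \Pi(P_r, P_g)} \mathbb{E}_{(x,y)\sim\gamma}\big[\,\|x - y\|\,\big]
  \;=\; \sup_{\|f\|_{L}\le 1} \Big( \mathbb{E}_{x\sim P_r}[f(x)] \;-\; \mathbb{E}_{x\sim P_g}[f(x)] \Big)
\]
```

The second equality for W_1 is the Kantorovich-Rubinstein duality, which WGAN approximates with a 1-Lipschitz critic network. When P_r and P_g have (near-)disjoint supports, KL blows up and JS saturates at log 2, so their gradients carry little information; W_1, in contrast, remains finite and varies smoothly with the generator's parameters, which is the property usually credited with WGAN's more stable training.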

