‘Equilibrium and optimization in Generative Adversarial Networks’ with Sixin Zhang


ART-AI Seminar

We are pleased to have Sixin Zhang, who is currently Maître de Conférences at INP-Toulouse and the Toulouse Research Institute in Computer Science (IRIT), join us for this ART-AI seminar entitled ‘Equilibrium and optimization in Generative Adversarial Networks’ at the University of Bath, 1W 2.01, on Tuesday 2nd July 2024, 12.15pm-1.15pm (BST). Georgios Exarchakis will chair. If you are unable to make this event in person, there is an option to dial in via Microsoft Teams. For more information, please e-mail [email protected].

Title:

Equilibrium and optimization in Generative Adversarial Networks

Abstract:

Training Generative Adversarial Networks (GANs) is challenging because the nature of the equilibria in the solution sets of existing min-max algorithms remains unclear. On the other hand, existing optimization algorithms work well in practice with a careful choice of hyper-parameters. This talk gives an overview of the topic and presents some recent results on the existence of Nash equilibria in GANs, together with some insights into how practical algorithms work.
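For context, GANs are usually posed as a two-player min-max game between a generator G and a discriminator D. The classical formulation below (Goodfellow et al., 2014) is only the generic template; the moment-matching setting discussed in the talk replaces the discriminator objective.

\min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]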

The first part of the talk introduces GANs from the perspective of likelihood-free density estimation in statistics, gives an overview of the associated two-player game, and reviews some results on the non-existence of Nash equilibria in practice. We then introduce the notion of consistent Nash equilibrium. We study the existence and uniqueness of consistent Nash and consistent non-Nash equilibria in moment-matching GANs by varying the discriminator family in the GAN game for stationary Gaussian processes. Three discriminator families are considered, based on a real-valued linear transform, a complex-valued linear transform, and a convolutional transform. We shall present theoretical and numerical results showing the rich structure of Nash equilibria in these cases.
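For reference, the standard notion of Nash equilibrium in a two-player zero-sum game is as follows (the consistent variants mentioned above are the speaker's refinement and are defined in the talk): for a payoff L over generator parameters \theta and discriminator parameters \phi, a pair (\theta^*, \phi^*) is a Nash equilibrium if neither player can improve by deviating unilaterally, i.e.

L(\theta^*, \phi) \le L(\theta^*, \phi^*) \le L(\theta, \phi^*) \quad \text{for all admissible } \theta, \phi.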

The second part of the talk is about optimization algorithms for GAN training. We study numerically the global convergence behaviour of the simultaneous gradient descent ascent algorithm in moment-matching GANs for stationary Gaussian processes. We then discuss the training of Wasserstein GANs in connection with moment-matching GANs.
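As a minimal sketch of what simultaneous gradient descent ascent does (on an illustrative bilinear toy game, not the moment-matching setting studied in the talk; the objective, step size, and iteration count below are assumptions for illustration only):

# Toy min-max problem f(x, y) = x * y: x is minimized, y is maximized.
def grad_x(x, y):
    return y   # df/dx

def grad_y(x, y):
    return x   # df/dy

x, y = 1.0, 1.0   # illustrative starting point
eta = 0.1         # illustrative step size

for t in range(100):
    # Simultaneous updates: both players use gradients at the *current* iterate.
    gx, gy = grad_x(x, y), grad_y(x, y)
    x, y = x - eta * gx, y + eta * gy   # descent in x, ascent in y

# On this bilinear game the iterates spiral away from the equilibrium (0, 0),
# one simple illustration of why the convergence of such algorithms is delicate.
print(x, y)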

Bio:

Sixin Zhang is currently Maître de Conférences at INP-Toulouse and the Toulouse Research Institute in Computer Science (IRIT) in France. He received a Ph.D. from the Courant Institute of Mathematical Sciences, New York University, NY, USA. From 2016 to 2021, he was a postdoctoral researcher at ENS Paris, a research associate at Peking University, Beijing, China, and then a postdoctoral researcher at CNRS, IRIT, Université de Toulouse.


Event Info

Date 02.07.2024
Start Time 12:15pm
End Time 1:15pm
