Sparse Training from Random Initialization: Aligning Lottery Ticket Masks using Weight Symmetry with Yani Ioannou

We are pleased to have Yani Ioannou, an Assistant Professor and Schulich Research Chair in the Department of Electrical and Software Engineering at the Schulich School of Engineering, University of Calgary, Alberta, Canada, join us for this ART-AI seminar on 30th September 2025.

ART-AI Seminar

The seminar is entitled ‘Sparse Training from Random Initialization: Aligning Lottery Ticket Masks using Weight Symmetry’.

This seminar will take place in person in 1 W 2.01 on Tuesday 30th September 2025, 12:15pm to 1:15pm (GMT), with an option to join online. Rohit Babbar will chair the seminar. For more information, please e-mail [email protected].

Title

Sparse Training from Random Initialization: Aligning Lottery Ticket Masks using Weight Symmetry

Abstract

The Lottery Ticket Hypothesis (LTH) suggests that there exists a sparse mask and set of weights that achieve the same generalization performance as the dense model while using significantly fewer parameters. However, finding an LTH solution is computationally expensive, and the LTH sparsity mask does not generalize to other random weight initializations. Recent work has suggested that neural networks trained from random initialization find solutions within the same basin modulo permutation, and proposes a method to align trained models within the same loss basin. We hypothesize that this misalignment of basins is why sparse masks fail to generalize to new random initializations, and propose permuting the LTH mask to align with the new optimization basin when performing sparse training from a different random initialization. We empirically show a significant increase in generalization when sparse training from random initialization with the permuted mask, compared to the non-permuted LTH mask, across multiple datasets and models.
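To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of what "permuting the mask" means for a toy two-layer MLP. It assumes a hidden-unit permutation `pi` has already been found by some basin-alignment procedure (e.g. weight matching); that step is omitted here, and the masks are random placeholder data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer MLP: W1 maps d -> h hidden units, W2 maps h -> o outputs.
d, h, o = 4, 5, 3

# A sparse LTH-style binary mask found for some original initialization
# (hypothetical placeholder data, not from the paper).
m1 = (rng.random((h, d)) < 0.3).astype(np.float32)  # mask over W1
m2 = (rng.random((o, h)) < 0.3).astype(np.float32)  # mask over W2

# Suppose aligning the original basin with the new initialization's basin
# yields a permutation `pi` of the hidden units (here drawn at random;
# in practice it would come from a weight-matching method).
pi = rng.permutation(h)

# Permute the mask consistently with the weight symmetry:
# rows of the incoming layer, columns of the outgoing layer,
# so each hidden unit's connectivity pattern moves as a whole.
m1_perm = m1[pi, :]
m2_perm = m2[:, pi]

# Sparsity level is unchanged; only which hidden unit owns each
# connectivity pattern is relabelled.
assert m1_perm.sum() == m1.sum()
assert m2_perm.sum() == m2.sum()

# Each hidden unit keeps its (in-degree, out-degree) pair,
# just under a new index.
degrees = sorted(zip(m1.sum(1), m2.sum(0)))
degrees_perm = sorted(zip(m1_perm.sum(1), m2_perm.sum(0)))
assert degrees == degrees_perm
```

The design point is that a hidden-unit permutation must be applied to both the layer feeding into those units and the layer reading from them; applying it to only one side would change the network's connectivity rather than merely relabel it.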

Bio

Yani Ioannou is an Assistant Professor and Schulich Research Chair in the Department of Electrical and Software Engineering at the Schulich School of Engineering, University of Calgary, Alberta, Canada. He was previously a Visiting Researcher at Google Brain Toronto (DeepMind) with Geoffrey Hinton, and a Postdoctoral Fellow at the Vector Institute with Dr. Graham Taylor. He completed his PhD at the University of Cambridge in 2018, supported by a Microsoft Research PhD Scholarship, supervised by Professor Roberto Cipolla and Dr. Antonio Criminisi.


Event Info

Date 30.09.2025
Start Time 12:15pm
End Time 1:15pm
