3rd LoG NYC Workshop

Andrew Gordon Wilson

Affiliation

New York University

Talk Title

It's time to say goodbye to hard (equivariance) constraints

Abstract

Often we think of inductive biases as hard constraints that restrict the hypothesis space to align with a problem of interest. Popular examples include equivariance constraints, such as rotation and translation equivariance, simple parameter sharing, or even limits on model size to avoid overfitting. In this talk, we will argue against this approach to model construction and instead in favour of soft inductive biases. This perspective on inductive biases helps resolve several generalization phenomena in deep learning, and can be rigorously formalized using countable hypothesis bounds. We find that soft biases are almost always preferable to hard constraints, even when we know the constraint perfectly describes our problem! We will demonstrate these ideas on scientific applications, including materials generation, molecular property prediction, dynamical systems, and reinforcement learning.
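To make the hard-versus-soft distinction concrete, here is a minimal sketch (not from the talk itself) of one common way a soft equivariance bias can be expressed: instead of constraining the architecture to be exactly equivariant, an unconstrained model is trained with a penalty on equivariance violations. The group action (a 90-degree rotation), the model, and the penalty weight `lam` are all illustrative assumptions.

```python
import torch
import torch.nn as nn

def rotate90(x):
    # Assumed group action: rotate image tensors (N, C, H, W) by 90 degrees.
    return torch.rot90(x, k=1, dims=(-2, -1))

# An unconstrained model: nothing in the architecture enforces equivariance.
model = nn.Conv2d(3, 3, kernel_size=3, padding=1)

def soft_equivariance_loss(model, x, task_loss, lam=0.1):
    # Penalize deviation from f(g.x) = g.f(x) rather than enforcing it.
    # lam controls how strongly the equivariant solutions are preferred,
    # without shrinking the hypothesis space.
    violation = (model(rotate90(x)) - rotate90(model(x))).pow(2).mean()
    return task_loss + lam * violation
```

Unlike a hard-constrained architecture (e.g., a group-equivariant CNN), such a model can trade exact equivariance against the data when the symmetry only approximately holds.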

Bio

Andrew Gordon Wilson is a Professor at the Courant Institute of Mathematical Sciences and Center for Data Science at New York University. He is interested in developing a prescriptive foundation for building intelligent systems. His work spans loss landscapes, optimization, Bayesian model selection, equivariance, generalization bounds, and scientific applications.

Website

https://cims.nyu.edu/~andrewgw/