
Statistics and Data Science Seminar - Neural Networks Generalize on Low Complexity Data

Hosted by the Department of Statistics
COL 1.06
Friday 27 March 2026 2pm - 3pm

Abstract: We show that feedforward neural networks generalize on low complexity data, suitably defined.

Given data generated from a simple programming language, the minimum description length (MDL) feedforward neural network that interpolates the data generalizes with high probability. We define this simple programming language, along with a notion of description length for such networks. We provide several examples on basic computational tasks, such as checking the primality of a natural number. Extensions to noisy data are also discussed, suggesting that MDL neural network interpolators can demonstrate tempered overfitting. This is joint work with Sourav Chatterjee.
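As an illustrative sketch (not part of the talk), data of the kind the abstract describes can be generated by a very short program; for example, labeling natural numbers by primality. The function below is a hypothetical stand-in for such a low-description-length generator:

```python
def is_prime(n: int) -> bool:
    """A short program: its description length is small."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# A labeled dataset produced by this tiny program; an interpolating
# network of small description length would be expected to generalize
# on data of this kind, per the abstract's claim.
data = [(n, is_prime(n)) for n in range(1, 21)]
```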

Bio: I'm a 5th-year PhD student advised by Sourav Chatterjee, interested in neural networks, empirical Bayes, and causal inference.

LSE holds a wide range of events, covering many of the most controversial issues of the day, and speakers at our events may express views that cause offence. The views expressed by speakers at LSE events do not reflect the position or views of the London School of Economics and Political Science.