Neural Network Scaling Limits

Boris Hanin
Princeton University

Neural networks are often studied analytically through scaling limits: regimes in which taking structural network parameters such as depth, width, and number of training datapoints to infinity yields simplified models of learning. I will survey several such approaches, with the goal of illustrating the rich and still not fully understood space of possible behaviors when some or all of a network's structural parameters are large.
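As one concrete illustration of a scaling limit (a standard example, not necessarily the talk's focus): for a randomly initialized one-hidden-layer network with 1/sqrt(width) output scaling, the output at a fixed input converges to a Gaussian as the width grows, so the empirical output variance stabilizes. A minimal NumPy sketch, with all function names chosen here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_output(width, x, n_samples=2000):
    """Sample outputs f(x) of random one-hidden-layer networks
    f(x) = (1/sqrt(width)) * sum_j v_j * tanh(w_j * x),
    with i.i.d. standard normal weights w_j, v_j."""
    w = rng.standard_normal((n_samples, width))
    v = rng.standard_normal((n_samples, width))
    return (v * np.tanh(w * x)).sum(axis=1) / np.sqrt(width)

# As width -> infinity, f(x) becomes Gaussian with mean 0 and
# variance E[tanh(w * x)^2]; the empirical variance stabilizes.
for width in [1, 10, 100, 1000]:
    out = random_net_output(width, x=1.0)
    print(width, round(out.var(), 3))
```

Sweeping the width shows the finite-width fluctuations shrinking toward the infinite-width limit.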


Back to Workshop II: Theory and Practice of Deep Learning