Earth system science is a lucky discipline. The prevailing paradigm of autoregressive modeling is hugely successful. Autoregressive models come in two flavors: 1) traditional time stepping of physical equations and, more recently, 2) large deep learning models trained on real-world and artificial atmospheric trajectories. Such models can solve a slew of tasks, such as weather forecasting, climate projection, climate downscaling, extreme event quantification, and many more. Recently, other modeling approaches such as denoising diffusion have emerged, since they can more readily handle limitations in the available datasets, such as inadequate spatial or temporal sampling. Nonetheless, in this talk, I will argue that there is no free lunch and that the utility of all these models flows from a few key resources: 1) data and 2) compute. I will outline where scarcities in these resources are preventing forward progress and suggest paths forward.