The Training Bottleneck and Model Collapse

Paper 28 (draft) · Introductory
Jon Smirl — March 2026
Abstract. No matter how fast AI inference scales, capability growth is limited by the speed of frontier training; and when models train on their own output, the diversity that makes AI valuable quietly erodes.