The Scaling Paradox:
AI capabilities have improved remarkably quickly, fuelled by an explosive scale-up of the resources used to train the leading models. But the scaling laws that inspired this rush actually show very poor returns to scale. What's going on?
1/
https://www.tobyord.com/writing/the-scaling-paradox
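The "poor returns" point can be illustrated with a toy power-law scaling law. The constants below are hypothetical (chosen only to make the shape visible); real scaling-law papers fit exponents of broadly this small magnitude, which is what makes the returns so weak:

```python
def loss(compute, a=1.0, alpha=0.05):
    """Toy scaling law: loss falls as a power of compute.

    With a small exponent like alpha = 0.05 (illustrative, not a
    fitted value), each 10x increase in compute shrinks loss by
    only a factor of 10**0.05, i.e. about 11%.
    """
    return a * compute ** (-alpha)

# Four orders of magnitude more compute buys surprisingly little:
for c in [1e21, 1e22, 1e23, 1e24]:
    print(f"compute {c:.0e}: loss {loss(c):.3f}")
```

Under these assumptions, multiplying compute 1000-fold cuts loss by well under half, which is the sense in which the returns to scale are poor even though capabilities kept improving.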
