The Scaling Paradox:
AI capabilities have improved remarkably quickly, fuelled by the explosive scale-up of resources being used to train the leading models. But the scaling laws that inspired this rush actually show very poor returns to scale. What’s going on?
1/
https://www.tobyord.com/writing/the-scaling-paradox
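To see what "very poor returns to scale" means concretely, here is a minimal sketch of a generic power-law scaling curve, L(C) = a·C^(−α). The constants are made-up placeholders for illustration, not fitted values from any published scaling-law paper:

```python
# Illustrative only: a generic power-law scaling curve L(C) = a * C**(-alpha).
# The constants are hypothetical placeholders, not values from any real model.
a, alpha = 10.0, 0.05

def loss(compute: float) -> float:
    """Hypothetical pretraining loss as a function of training compute."""
    return a * compute ** (-alpha)

# Scaling compute up by 1000x shrinks loss only modestly:
for c in (1e21, 1e22, 1e23, 1e24):
    print(f"compute={c:.0e}  loss={loss(c):.3f}")

# For any power law, a 1000x compute increase multiplies the loss
# by 1000**(-alpha) -- here about 0.71, i.e. a ~29% reduction in loss
# for three orders of magnitude more compute.
print(loss(1e24) / loss(1e21))
```

With a small exponent like this, each order of magnitude of extra compute buys only a sliver of improvement, which is the sense in which the returns to scale are poor.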
AI scientists such as Ilya Sutskever and Dario Amodei saw the scaling laws and doubled down on them, rapidly improving cutting-edge AI capabilities simply through ever more data and compute.
2/