Here's why @VentureBeat is wrong. Grok 3 was trained on 15x the hardware of Grok 2, yet achieved only modest gains. That points to the poor scalability of current AI model technologies. Full stop. Even Ilya Sutskever, co-founder of OpenAI, thinks existing approaches to scaling have plateaued.
1/2