Much of the field obsesses over end-to-end learning. But strong generalization requires compositionality: building modular, reusable abstractions and reassembling them on the fly when faced with novelty.
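A toy sketch of what this could mean in code (all names here are hypothetical, for illustration only): small "skills" are defined once as reusable units, and a novel task is handled by composing them into a new pipeline rather than by learning a monolithic end-to-end mapping.

```python
from typing import Callable

# A "skill" is any reusable string-to-string transformation.
Skill = Callable[[str], str]

def reverse(s: str) -> str:
    return s[::-1]

def upper(s: str) -> str:
    return s.upper()

def strip_vowels(s: str) -> str:
    return "".join(c for c in s if c.lower() not in "aeiou")

def compose(*skills: Skill) -> Skill:
    """Chain skills left to right into a new, reusable skill."""
    def pipeline(x: str) -> str:
        for skill in skills:
            x = skill(x)
        return x
    return pipeline

# A novel task is solved by reassembling existing abstractions on the fly.
shout_backwards = compose(strip_vowels, reverse, upper)
print(shout_backwards("compositionality"))
```

The point of the sketch is the `compose` step: the individual pieces stay fixed and reusable, and novelty is handled at the level of how they are wired together.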
The models of the future won't be just pipes, they will be Lego castles.
Comments
Look into subject-oriented programming.
https://www.greenberg.pro/thoughts/ai-architecture/turings-and-shannons
makes (and has brought to fruition with dspy): that the development of software has always involved adding further layers of abstraction.
Right now it is either training or inference. I think future models will employ some "asynchrony": neurons firing whenever they want, e.g., when they reach critical mass and decide to fire.
Some spontaneous synchronization will still emerge.