There's no intelligence to train, for one, and "training" specifically conjures up the idea of synthesis. But it isn't synthesis: it's using statistics to guess what should come next in a sequence, and it can only copy those guesses from existing data.
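To make that concrete, here's a minimal toy sketch (in Python, with a made-up corpus; none of this comes from any real model) of what "using statistics to guess what should come next" looks like at its simplest: tally what followed each word in existing text, then echo back the most frequent follower. Real systems are vastly more elaborate, but the guess is still drawn from patterns in the data.

```python
from collections import Counter, defaultdict

# Made-up example text; the "training data" in this toy sketch.
corpus = "the cat sat on the mat and the cat ate".split()

# Count which word followed which in the existing data.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def guess_next(word):
    """Return the most frequent next word seen in the data, or None."""
    options = followers.get(word)
    return options.most_common(1)[0][0] if options else None

print(guess_next("the"))  # "cat" - the guess is copied from the existing data
print(guess_next("sat"))  # "on"
```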
Comments
I get that the co-opting of the term is distasteful, but I don't think that negates the training that is occurring. The ability to build a statistical model and apply those statistical likelihoods to ever-changing criteria is reflective of training and intelligence - just not sentience.
I find language like "hallucinations" much more problematic, because that word anthropomorphizes unnecessarily, and that framing is purely marketing and PR.
Yeah, I just get that feeling from all of it now, largely, I think, because of who's saying it and what they're selling. Like the letter about pausing AI development so humanity doesn't create Skynet: after a while it all feels like smoke when it comes from certain people.
I think it's definitely fair to feel that way. I feel afraid that this is going to permanently entrench a small number of people and companies, and they have every incentive to appear thoughtful while amassing power.