By his replies, it seems like he's just used it to chew through dense information he struggles with because of his dyslexia. And I guess from a tech standpoint there COULD probably be a good use case somewhere in there, if the thing wasn't eating the entire internet and planet, and also wasn't wrong all the time.
Thing is, he could pay an assistant to do this, and essentially does pay at least some staff to do this for his content creation. That's the point of staff. He's essentially admitting he's paying OpenAI instead of a human in some capacity; I guess the specific moral weight is contextual.
Think I'd normally be with you on this take but this isn't like he's replaced a team of researchers or whatever. We're talking about something he did by himself with some difficulty, and is now trying to ease that difficulty with tech.
And I'm firmly against anyone using the plagiarism machine to do this, for the record. What I'm saying is it seems super plausible that there could be a world in which the tech, removed from its corporate masters and restrained from its myriad of evil bullshit uses, could be an accessibility tool.
- And also: dealing with a learning disability sucks and is frustrating, so I understand the excitability and overreaching in the search for accessibility tools.
But he's specifically talking about LLMs, the actual plagiarism machine, not AI broadly or neural networks broadly. We already use the backbone technologies extensively, e.g. in science, all the time. LLMs specifically have been shown not to work well without basically all the data, so they steal!
Comments
1. It isn't useful and it sucks
2. It told me to put superglue in my spaghetti sauce