The amount of push I'm suddenly seeing from AI companies to "write my notes for me" is frankly terrifying, because I know too many of my colleagues will do it, and it's going to destroy the therapy industry.
Because I don't believe for a moment that those AI systems aren't going to use the transcripts they're recording to make a bunch of AI therapists and then undercut us and cut us all out, and I have zero confidence that those AI will actually be competent at the job.
And they'll get around regulations by putting some bullshit disclaimer at the bottom that they aren't actually therapists, but the public is already trying to use ChatGPT for therapy; people will gobble that shit up.
And you may ask what the problem is, why not have cheaper, more accessible therapy? Because disconnection from actual humans is one of the big problems facing us all right now, and replacing the professional connection people have with AI will make that worse.
This is of course on top of my worry about becoming unemployed because some f#ing tech bros thought it would be a good idea to replace any mental job ever with a nuclear power plant connected to a data center, so that the only work left for most of us will be waiting tables and harvesting crops.