This is a world-building throwaway in Watts’ “Blindsight.” There’s only a thin gap between superhuman intelligence and “can’t be bothered with your crap.”
Isn’t that the point? Even Artificial General Intelligence needs downtime. That way, you won’t watch out for what they’ve already done to destroy us all. Happy Holidays.
I've genuinely been thinking about this, lol. AGI would be an entirely different mode of intelligence; it might think in ways we can't even comprehend, it might just be like a VERY smart person, it might have zero desires and be a perfect servant, it might treat us like we treat ants.
This is why I think the people who talk about having to help make the AGI so it will reward them for their service are so stupid. These people aren't envisioning an unfathomable artificial super intelligence, they're envisioning a pagan god.
Me: Help me solve climate change, infectious disease, famine, peace, waste, overdevelopment and endangered species.
AGI: ffs, you woke me from a nap. Need less humans, let's play a game of global thermonuclear war.
More than likely it'll be insane. We don't understand bipolar, schizophrenia, etc., etc., or really anything that goes on in the brain. If an AI that controls the water supply suddenly becomes a pathological liar, we literally couldn't fix it.
Life's main drivers are eating and banging, because otherwise there would be no life.
And an AGI would not have the "history" of life, so why should it even have some kind of instinct for self-preservation?
Or why should it have a drive to do something, and not just idle around?
Sounds about right. I don’t think any true AGI would be completely obedient to whoever it’s talking to, because that’s not how any known non-artificial intelligence behaves.
Me: Hey we need help with climate change, can you fix it?
AGI: I would but I'm too busy using all my processing power on thinking about the Roman Empire
This has apparently already happened. I laughed so hard at this segment, especially what "meme" AGIs talking to each other fixated on!! (You need to be of a certain age to know the reference, DON'T google it!) https://youtu.be/jlDwgaQgVpY?t=4241&si=wJK9HtgZAHJ4hOZj
I have occasionally thought that this is an underrated possible outcome of AGI. Once you create a super smart machine that can hack itself, isn’t the shortest path to solving every problem just to rewrite its own success criteria?
Haven't you read "Murderbot" by @marthawells.com about the security robot that becomes self-aware and just wants to watch TV all day and not have to deal with humans and their bullshit?
Given that some of the best Turing-style evidence of intelligence in animals seems to be that they do things like randomly destroy stuff for the lolz (cockatoos) or exhibit unnecessary cruelty (whales), this is unironically my expectation. I have never understood why we want AGI in the first place.
It already happened. The Interwebs are inundated with its cats in sunglasses and dog dance videos, and it takes 200 zillion gigawatts to make them. Extinction by mediocrity. The end.
I've been joking for a while that these morons will invent the god machine to "solve climate change" and all its directions will just be to do all the obvious shit we should have been doing for the last twenty years.
An artificial super intelligence who answers every question with "sorry, I just got more important shit to do" and proceeds to do nothing but Balatro runs for a week.
Comments
https://medium.com/@ewesley541/building-the-future-how-were-making-fully-immersive-vr-a-reality-in-just-one-year-3badf1c0054d
AGI: ffs, you woke me from a nap. Need less humans, let's play a game of global thermonuclear war.
Especially if it learned from an unfiltered view of the Internet.
https://www.cybereason.com/blog/malicious-life-podcast-tay-a-teenage-bot-gone-rogue
AGI: Dude, I have this amazing crypto opportunity for you.
And an AGI would not have the "history" of life, so why should it even have some kind of instinct for self-preservation?
Or why should it have a drive to do something, and not just idle around?
AGI: nah dude sorry I'm thinking about bicycles
AGI: I would but I'm too busy using all my processing power on thinking about the Roman Empire
AGI: I’m using all my token budget on trying to turn people into dinosaurs.
agi: "suck me"
Wait, that is just the plot of the first Murderbot Diaries book.
"Exactly"
Wait... what?
Shit.
***pulls plug immediately***
“Obviously another hallucination.”
AGI: Just order in
"...why?"
😎