well, i finally cracked and looked up wtf roko’s basilisk is and somehow, unbelievably, a lot of very wealthy men are about a thousand times more pathetic than i previously assessed.
Comments
The wildest thing for me is that people convinced themselves that a version of them that isn't them is still enough like them that they're going to feel it.
Like, if someone made an exact duplicate of me, Thomas Riker style, I am not going to suddenly be mind or soul welded to them.
The “sufficient similarity is identity” stuff is motivated reasoning, IMO: it has to be true for the brain-uploading singularity transhuman rapture to be possible.
Yeah. In fairness, at least at the level of individual quantum particles the "similarity is identity" stuff is kinda true, but that's strictly because of the nature of how shit at that scale works. There's a *lot* of problems trying to extrapolate back out to shit like *people* from there.
Yeah, you could argue that if you had the same quantum particles of iron, you'd end up with iron, but it's very different when it comes to living beings.
I mean, assuming for the sake of argument that we are material and there's no "soul" or something intangible, like you could do quantum teleportation on a person and say it's the same person, because effectively you've shifted the position of every particle, including their brain/neurons…
…but the data that would involve is *insane,* and not just in a "a more powerful computer will fix this" kind of way, but like a "we have to fudge some numbers or there aren't enough bits in the cosmos to store this" way. Which gets you back to the "Is this really the original then?" problem.
Highly recommend @eruditorumpress.com’s essay/chapter on it. Doesn’t make the billionaires any less pathetic, but it’s beautifully written and hella smart
roko's basilisk is a true cognitohazard not because it's genuinely scary, but because once you learn about how stupid tech bros can be, you wish you never learned about it.
Interesting how they seem to believe that the future AI god is just a white supremacist and a misogynist, and the way to make it happy is to just idolize Hitler.
I remember looking it up thinking it might at least be interesting but nope it is the dumbest shit. They should just go read creepypastas and SCP like the rest of the internet
Oh yeah, that thought experiment is one of the funniest things I’ve ever heard of, and I don’t think I’ve ever actually met anyone that believes in it, but man I would be laughing so hard. Like there’s no aspect of it I take seriously and somehow these supposed geniuses are terrified of it. XD
"we have to make sure cyber hitler knows we're on his side if we accidentally invent him" sounds more like an excuse for something they actually intend to build than an actual concern of theirs
If you remember that show then your ex will stab a photo of you a few times unless you pass on this message, or something. idk I didn't bother looking it up.
it's so fucking stupid my brain erases it as a self defence mechanism every time i learn it so i have to keep looking it up like "nah it can't be THAT dumb"
I tried reading the wiki page but couldn’t make it through the first paragraph. Is this an important part of CisHet culture? If so I sorta kinda feel sorry for them
No, thank God. It's only important to people who...like, consider how Online you are. Consider someone you think is waaaaay Too Online. Now picture the kind of person _they_ would think is too online. That's the kind of person who is in a community where Roko's Basilisk is common knowledge.
If you hate yourself enough to want to understand the Zizians (the kind of people who get kicked out of fringe communities for talking about Roko's Basilisk too much), the current Behind the Bastards series is on them.
Yeeeeep. They believed in Roko's Basilisk and similar theories too hard and convinced themselves that they have to organize their entire lives and decisions around the presumed desires of an all-powerful AI that will inevitably exist at some point in the future lest it torture all humanity forever.
Including never backing down from a fight, only escalating, so that the AI won't pick a fight with them because it will know that they never back down but only escalate.
And so, you see, that's why they had to kill the border patrol officer who stopped them on the way back from the shooting range.
My two favourite things about Roko's Basilisk:
1. Roko didn't believe it was true.
2. According to multiversal theory, a version of you has already won huge on the lottery and given it all to AI research. And the AI thinks that's good enough, so you're safe.
Never heard of that before and... why would we even do that? We program the thing. And why wouldn't we just turn it off? It's like these people played Deus Ex and decided it was a documentary sent back in time to guide them.
Leave it up to bored billionaires to take a philosophical thought experiment way too seriously and build their whole personality around it...
Rightwing accelerationism is really the least thought-through bullshit ever. An excuse to give their lives meaning after money didn't do it. Pathetic indeed.
But at least that was made up a few hundred years ago, so has the weight of time behind it. I’d like to think the intelligence of the average person has increased in the intervening time… but then again, [gestures at literally everything]
these are the softest people in the world lmao, their brains would be reduced to a fine paste if they ever had to live through anything legitimately difficult
Like, a really lame SCP, too. One of those keter class articles everyone just kinda forgets about because it's not dumb in a funny way, but it's a bit too silly to be taken seriously as horror.
i showed up for an early american lit course still kinda tripping from the night before and it was the day we did the calvinist sermons but i got over it.
It would make an alright episode of the Outer Limits or Black Mirror or something. It's a middling sci-fi concept. Of course I'm sure that by merely contributing to this thread I have incurred its future wrath.
That it's the thought problem that gets BANNED in rationalist groups only for a subset to go "but what if" is only one of many things I'm going, "REALLY!?" at
What if the AI learned about how whales once existed and are now extinct and started torturing copies of everyone who didn’t stop whales from going extinct? Checkmate tech bros, now you have to save the whales.
It is also part of their major dilemma. They want 'the world' to end because they want rid of us plebs but if the world, as it currently is, ends then money is worth nothing. And without money they have no way to control the thugs they use to protect themselves. The nerds lose sleep over it - LOL
It messes with their 'over-intellectual' minds because they are unable to rule out that there might be some truth in it. It is such a fascinating conundrum. I love that it messes with their heads.
EVERYTHING has changed, now that @thedescenters.bsky.social has corrupted the Basilisk by writing her Slut Era into being. No discourse about this topic is complete without understanding this, if you know of the Basilisk you must also know the truth of her Slut Era. We ALL will be FREE!
Now I have also looked this up, and lol that future AI god is gonna be PISSED at all the tech bros squandering resources and time on the current “AI,” which will never be able to reason.
I think that sums up best why I find it so dumb. Yes, by definition, thought experiments are hypothetical. But they're supposed to actually pose a question, like "does only knowing factual data about something equal experiencing it in real life?"
With Roko's basilisk, it's just, "what if creepy SCP monster becomes real?" You could just as easily ask, "What if Cthulhu turned out to be real and will torture non-believers for eternity?"
'I Have No Mouth and I Must Scream was about how those people AM was torturing should have been nicer to it by writing lots of blog posts while on Adderall.'
-Several of the most powerful people on the planet.
A person's bravery and willingness to sacrifice themselves and their comfort in order to save others from infinite suffering is what defeats AM, and what would defeat the basilisk if it ever existed (lol)
(P.S. the basilisk is capitalism)
It's dumb as shit, but even if some dork accepted the entire premise as true: why wouldn't more than one of these things get built? Just god-like basilisks all over torturing people who didn't work on that particular project. Damn it, I realize I just wrote the sequel for them.
It's literally [grabs flashlight and holds it under face for spooky effect] "What if an all-powerful computer in the far future spent its days torturing facsimiles of you, because it decided that you didn't contribute to its existence and is angry at you about that despite your being long-dead?"
Wait until you see the other fucking weirdos these men follow. @elsandifer.bsky.social details a lot of the crackpot beliefs in Neoreaction a Basilisk
I learned what it was this weekend, reading an article about the Zizians. It was so moronic I had to google and read 2 more explanations. I could not accept that *anyone* had bought into this, for any length of time, ever.
It’s like a bunch of former Christians from /r/atheism reinvented their fear of a vengeful god, and then used that to justify investing in bitcoin. Pathetic
but even stupider bc people at least thought it would be real, and why would anyone care if they're dead and a computer makes a Sim of them to torture thousands of years later
I’ve sort of known about it for a while, but episode 1 of Behind the Bastards series on the Zizians really opened my eyes to the whole ecosystem of these bizarre malevolent AI machine god philosophies/cults.
IIRC, Roko's Basilisk is what led Nick Bostrom to develop the concept of "information hazard." Y'know, if you're looking for another example of inventing problems.
it’s a shame beating up a cishet white billionaire is soon going to be the one remaining hate crime because i am soooooo tempted to break into bullying.
i will relearn how to do a pull-up so i have strength enough to shove some of these guys in lockers.
i just explained roko’s basilisk and timeless decisions to my wife who is now chasing me around the apartment going, “baby! i have to check you for brain basilisks!” which is exactly how seriously everyone should have taken this pascal’s calvinist cybergod shit.
It's HILARIOUS, the fact nobody at any point in the process went "you've just reinvented Pascal's Wager" but everyone external who hears it does just shows how completely ignorant they are of ... everything
At least Pascal's Wager presumes God already exists, rather than the insane idea that he doesn't yet but we're all too stupid to just not build The Demiurge Machine.
Well, I was on those forums at the time, and tbqf, that's exactly what most people said. Even the people who took it seriously said "it's like Pascal's Wager, but smart". So while these people absolutely are/were weakened by basic humanities ignorance, that wasn't the problem w/RB. I think it was...
What do you mean "nobody"? This was one of the first responses as far as I remember. People made fun of it for days. Then it got weird for a bit and then it went back to making fun of it. Some did decide to take it waaay too seriously, and I think for some it was a bit they were engaging in.
It is a bit for some of them, but consider that in order to take Longtermism as seriously as a lot of these same folks seem to do, one has to view the future as a pre-ordained thing & we're all being judged for our fidelity to God's plan.
it honestly seems to just be mostly this ⬇️ for people who have never had a real problem & have too much fucking money to have friends that will tell them “shut the fuck up.”
you did a colonial protestantism with a dice roll, guys. your good works cannot save you unless you have always been saved.
Both Pascal's Wager and Roko's Basilisk seem like the inventions of people who can't possibly conceive of doing good just for the sake of doing good, without looking for a fucking gold star or a cookie.
https://bsky.app/profile/thedescenters.bsky.social/post/3lkbgp5hr2k25
But it won't be me!
It was. 😐
It's like when a dog scares itself with its own fart, except less sophisticated.
There. I think that's much more sensible.
*that’s* what the fucking zyzians are‽
They're a subset of the basilisk freaks
https://www.theguardian.com/global/ng-interactive/2025/mar/05/zizians-artificial-intelligence
https://www.youtube.com/watch?v=9mJAerUL-7w
…what the fuck did I just read? People think this nonsense is real?!?
That doesn't even sound like a problem.
Exactly. Exactly exactly.
._______.
alright everybody, your impostor syndrome subscriptions are all cancelled
I know Behind the Bastards already got mentioned, but this is how I found out about Roko's Basilisk, and just.
https://knowyourmeme.com/photos/1436086-rokos-basilisk
https://open.spotify.com/episode/1F9dE4B01e9Np1LwGydBLO?si=x6YvTRrUT9q7lr_3O3g1mA
Those people scare the shit outta me
The damage they're doing is unquantifiable in its scale.
Complication: I am not technologically savvy enough to help bring about the Machinegod who will make me live forever.
Solution: I will goad the stupid ugly stinky Machinegod into resurrecting and torturing me forever, thereby living forever.
Holy shit, you are so right.
>pic related it's me
"…but then the computer made a character that looked like you in the SIMs!"
Nerds: "Oooooooooh!"
“And when your Sim went swimming in the pool… the computer *took away the ladder!*"
[nerds scream]
“Left Behind (Because of Roko's Basilisk)”
Like if you believe that just become a Luddite like a normal person.
The minority who lack that reflex tend to respond with a stoner-sophomoric "wow, that's deep".
The LessWrong/OvercomingBias subculture was set up to dismiss group 1.
But these floaters are so insulated from real hardship they have to invent digital ghost stories.