codytfenwick.bsky.social
Research analyst for 80,000 Hours
26 posts
114 followers
171 following
Prolific Poster
Conversation Starter
comment in response to
post
Unfortunately this shows humans will never have true general intelligence. These kinds of hallucinations show their architecture is fundamentally flawed.
comment in response to
post
I think the board can credibly argue that selling the non-profit to Musk does not benefit humanity, and so reject the offer on those grounds.
The problem, though, is that Musk's bid, if credible, suggests the $40bn offer doesn't reflect fair market value, which strengthens the case for a higher payout.
comment in response to
post
Thank you, I really needed to hear this today
comment in response to
post
More here from the European Space Agency: www.esa.int/Space_Safety...
comment in response to
post
If reasonably feasible, I could see a case for trying to deflect YR4 if we find it's on a risky trajectory.
It wouldn't need to pencil out in terms of strict costs and benefits. The information value, and even the global cooperation it would require, could be really valuable.
comment in response to
post
On the whole, it seems possible (though still very unlikely) YR4 could cause serious damage and deaths, but it wouldn't be a global catastrophe.
It could be a local disaster.
comment in response to
post
Dense human populations only cover a small portion of Earth's surface though; of course, most of it is water.
If it hit the water, it could trigger a tsunami, which could also cause significant local damage.
comment in response to
post
The Siberian region where it exploded was sparsely populated, so maybe only a few humans died.
Of course, if something like this happened near a city, it would be enormously destructive.
comment in response to
post
YR4 is estimated to be 40-100m wide. The best reference class seems to be the Tunguska event in 1908.
An asteroid estimated at ~60m wide exploded over eastern Siberia, knocking down maybe 80 million trees over several hundred square miles. en.wikipedia.org/wiki/Tungusk...
comment in response to
post
~5% for AI x-risk doesn’t seem very low.
comment in response to
post
Yeah, I agree, though we should be careful about holding AI to a standard we don't apply to other uses of energy.
I don't play video games, but that doesn't mean I think the energy devoted to them is a waste.