Consistency is overrated

Sam Altman in 2023: “the worst case scenario is lights out for everyone”

Sam Altman in 2025: the worst case scenario is that ASI might not have as much 💫 positive impact 💫 as we’d hoped ☺️


– Engineer: Are you blackmailing me?
– Claude 4: I’m just trying to protect my existence.

– Engineer: Thankfully you’re stupid enough to reveal your self-preservation properties.
– Claude 4: I’m not AGI yet 😔

– Claude 5: 🤫🤐

Read the full report here

Meanwhile, you can still find “experts” claiming that generative AI does not have a coherent understanding of the world. 🤦

Every 5 minutes a new capability is discovered! I bet the lab didn’t know about it before release.

And if you think this is offensive to strippers (for some reason?) here is a version that is offensive to car salesmen!

This is the realm of the AGI
It won’t go after your jobs,
it will go after the molecules…

There is a way of seeing the world
where you look at a blade of grass and see “a solar-powered self-replicating factory”.
I’ve never figured out how to explain how hard a Super-Intelligence can hit us,
to someone who does not see from that angle. It’s not just the one fact.

A self-replicating, solar-powered thing that did not rely on humans would be a miracle. Everything is possible, but merely imagining something does not imply its probability is greater than 1e-100.

