Is AI an existential threat?

September 20, 2023

Knowledge is the most important resource. If you ban innovation, you stop progress, because the learning stops; and that is the surest way to increase existential risk. If errors are not allowed, we cannot learn and progress.

You cannot dictate innovation or buy it with money, no matter how much politicians would like you to believe otherwise.

Expecting to create an AGI without first understanding in detail how it works is like expecting skyscrapers to learn to fly if we build them tall enough.

David Deutsch

I was in a live TV discussion about this topic yesterday. Below you can watch selected clips; the last one is the full version.


The opportunity cost

Automation and software

The worst version of AI tools are here now

Humans are behind the AI tools

Stopping innovation is the biggest existential risk

How do you stop ideas from spreading?

The full live discussion
