If Anyone Builds It, Everyone Dies

Why Superhuman AI Would Kill Us All

2025

Description

Book by Eliezer Yudkowsky and Nate Soares · 2025 (United States)

Genres: Science, Original-language edition

AI is the greatest threat to our existence that we have ever faced. The scramble to create superhuman AI has put us on the path to extinction – but it's not too late to change course. Two pioneering researchers in the field, Eliezer Yudkowsky and Nate Soares, explain why artificial superintelligence would be a global suicide bomb and call for an immediate halt to its development. The technology may be complex, but the facts are simple: companies and countries are in a race to build...