The authors, Eliezer Yudkowsky and Nate Soares, mean that very literally: a superintelligence — an AI that’s smarter than any human, and smarter than humanity collectively — would kill us all. Not ...