006: What is an Intelligence Explosion, and Will It Kill Us All?

An Intelligence Explosion is the idea that a machine more intelligent than humans could quickly design a machine more intelligent than itself, and so on, until the intelligence of artificial systems rapidly and greatly outstrips that of humanity. Is this hard-takeoff scenario realistic? Is it even possible? And is there any way to encourage future superintelligent machines to be friendly?


Relevant Links