Eliezer Yudkowsky Quote

By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.

Eliezer Yudkowsky

About Eliezer Yudkowsky

Eliezer S. Yudkowsky (EH-lee-EH-zər YUD-KOW-skee; born September 11, 1979) is an American artificial intelligence researcher and writer on decision theory and ethics, best known for popularizing ideas related to friendly artificial intelligence, including the idea that there might not be a "fire alarm" for AI. He is the founder of and a research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion influenced philosopher Nick Bostrom's 2014 book Superintelligence: Paths, Dangers, Strategies.