Eliezer Yudkowsky Quote

By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.

Eliezer Yudkowsky
About Eliezer Yudkowsky

Eliezer Shlomo Yudkowsky (born September 11, 1979) is an American decision theory and artificial intelligence (AI) researcher and writer, best known for popularizing the idea of friendly artificial intelligence. He is a co-founder of and research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion influenced Nick Bostrom's Superintelligence: Paths, Dangers, Strategies.