Eliezer Yudkowsky Quote
The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.
Eliezer Yudkowsky
About Eliezer Yudkowsky
Eliezer S. Yudkowsky ( EH-lee-EH-zər YUD-KOW-skee; born September 11, 1979) is an American artificial intelligence researcher and writer on decision theory and ethics, best known for popularizing ideas related to friendly artificial intelligence. He is the founder of and a research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion influenced philosopher Nick Bostrom's 2014 book Superintelligence: Paths, Dangers, Strategies.