June 11, 2007

Yudkowsky

Staring into the Singularity is still a strong candidate for the best philosophical introduction to the topic.

Some snippets:

As far as technology is concerned, the Singularity could happen tomorrow. One breakthrough - just one major insight - in the science of protein engineering or atomic manipulation or Artificial Intelligence, one really good day at Webmind or Zyvex, and the door to the Singularity sweeps open.

Maybe you don't want to see humanity replaced by a bunch of "machines" or "mutants", even superintelligent ones? You love humanity and you don't want to see it obsoleted? You're afraid of disturbing the natural course of existence?
Well, tough luck. The Singularity is the natural course of existence.

I declare reaching the Singularity as fast as possible to be the Interim Meaning of Life, the temporary definition of Good, and the foundation until further notice of my ethical system.

I think it's safe to say that I can now visualize a complete path leading up to the Singularity, I have some idea of what it would take to get there and how much it will cost, and I think we could probably do it by 2010. Substantially earlier, given a lot of funding and research problems that turn out to be tractable.
So the heck with Moore's Law. The Singularity will happen when we go out and make it happen.

Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve.