Deus Est Machina
[originally published July 2004]

What happens if the artificial intelligence community, in its quest to build intelligent systems, succeeds too well and creates an AI whose intelligence exceeds the threshold marked out by our own? Up to now, it is humans who develop the software and hardware and who drive all progress in capability. After crossing that threshold, however, the AI itself would rapidly augment its own capabilities.

What's the intuition here? Although we use technology to help us conceptualize, design, and build today's computers and software (and other technological artifacts such as airliners and skyscrapers), there's no doubt that we remain in the driver's seat. But imagine the software design process reaching a level of complexity at which human designers exert only executive oversight. Most practitioners can't see us reaching that point anytime soon, but remember that compilers astonished assembler programmers in the late 1950s and early 1960s.