The Human Web
A thought experiment: a Web site with a page for everyone now living, and a page for every human being who has ever lived on earth, going back to the start of the species Homo sapiens. That would be 6 billion pages for the living, and, say, 80 billion for the dead. The pages would be linked into a giant family tree.
     Each page would have a photo, basic statistics such as dates of birth and death, a list of abodes, and a brief catalog of the subject's aspirations and accomplishments.

Mind Limits Computers
The major limitation on the power of computers is that their workings must be understood in detail by humans in order for them to be designed and programmed. For example, modern microprocessors have millions of transistors, enough to make a hundred independent processors operating in parallel in a single chip. If they could cooperate effectively, they'd together have much more computing power than the single processor that we currently build on that chip, which executes one stream of instructions. But we humans have no effective programming discipline that allows us to understand the working of a hundred cooperating automata, even though our own brains are organized this way, with billions of simple neurons connected in a complicated way and operating in parallel.
     The only way we'll be able to develop computers as effective as our brains is to grow them or evolve them, so that their capabilities are no longer limited by the need for humans to understand in detail how they work. But at that point we will no longer completely control them, either.
     If Moore's law continues as it has for decades, in another 20 years the fastest microprocessors will have approximately the same raw computing capacity as the human brain. At that point it will be very interesting to see what sorts of intelligences we can evolve.

It is amazing how much time programmers spend hunting down defects ("bugs") in computer programs. This is because the way humans think is so different from the way computers work. Computer programs follow exact instructions very precisely, so if a piece of information is missing or in the wrong format, or an unforeseen condition arises, they "crash." Humans just don't think this way. Because of human error, debugging a well-written computer program takes about as long as writing it, and no one ever finds all the bugs.
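A minimal sketch, in Python, of the gap described above (the record format and function names here are invented for illustration): a program that assumes every input record has exactly two comma-separated fields crashes the moment that assumption fails, while a version that anticipates the bad input does not.

```python
def parse_record(line):
    """Split a 'name,age' record; assumes the format is always right."""
    name, age = line.split(",")  # raises ValueError unless exactly 2 fields
    return name, int(age)        # raises ValueError if age isn't a number

def parse_record_safe(line):
    """Same job, but anticipates missing or malformed fields."""
    parts = line.split(",")
    if len(parts) != 2 or not parts[1].strip().isdigit():
        return None              # signal bad input instead of crashing
    return parts[0], int(parts[1])

print(parse_record("Ada,36"))          # the "happy path" the human had in mind
print(parse_record_safe("Ada"))        # missing field: None, no crash
print(parse_record_safe("Ada,thirty")) # wrong format: None, no crash
```

The difference between the two functions is exactly the difference between how the programmer imagined the input and what the computer will actually be fed.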
     It is very easy to make even a small program too complicated for anyone to understand. Programs that can't be understood can't be debugged effectively. The hallmark of a master programmer is the ability to formulate the solution to a complex problem in a simple way, so it's obvious how it works, and easy to debug.
     The most difficult bugs are the ones that are hard to reproduce, such as when several things happen at almost the same time, in a sequence the programmer didn't anticipate. You might have to run the program for several days to make the bug happen once, and even then you might not have much information about what caused the problem. The secret to avoiding these bugs is good, correct design. No amount of debugging will fix a bad design; a kludgy fix for one problem will just cause another.
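The timing-dependent bugs described above can be sketched in a few lines of Python (the counter and worker names are invented for illustration). Two threads increment a shared counter; each increment is really three steps (read, add, write), and the threads' steps can interleave so that updates are lost. The bug may or may not appear on any given run, which is precisely what makes it hard to reproduce. The design fix is to make the update atomic with a lock.

```python
import threading

COUNT = 100_000

def unsafe_worker(state):
    for _ in range(COUNT):
        state["n"] = state["n"] + 1  # read, add, write: not one atomic step

lock = threading.Lock()

def safe_worker(state):
    for _ in range(COUNT):
        with lock:                   # only one thread updates at a time
            state["n"] += 1

def run(worker):
    state = {"n": 0}
    threads = [threading.Thread(target=worker, args=(state,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return state["n"]

print(run(unsafe_worker))  # sometimes 200000, sometimes less: the elusive bug
print(run(safe_worker))    # always 200000: correct by design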

When virtuoso computer programming is shown in movies and television, it's made to look like playing a video game – it's all quick twitchy moves in graphical environments. In reality, programming is thoughtful, and mostly text-based, more like writing a short story than playing Doom.

Copyright © 2001-2005 by Dean Wallraff. All rights reserved.