Comments

Comment by DanBurfoot on Artificial Addition · 2007-11-20T14:49:59.000Z · LW · GW

Eliezer,

Did your own answer to the question of why AI hasn't arrived yet make it into the list? :-)

This is a nice post. Another way of stating the moral might be: "If you want to understand something, you have to stare your confusion right in the face; don't look away for a second."

So, what is confusing about intelligence? That question is problematic: a better one might be "what isn't confusing about intelligence?"

Here's one thing I've pondered at some length. VC theory states that, in order to generalize well, a learning machine must implement some form of capacity control or regularization, which roughly means that the model class it uses must have limited complexity (bounded VC dimension). This is just Occam's razor.
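To make "limited complexity" concrete, here is one standard form of the VC generalization bound (stated from memory, not quoted from the post; the exact constants vary by source). With probability at least $1 - \delta$ over an i.i.d. sample of size $n$, every hypothesis $h$ in a class of VC dimension $d$ satisfies

\[
R(h) \;\le\; \hat{R}(h) \;+\; \sqrt{\frac{d\left(\ln\frac{2n}{d} + 1\right) + \ln\frac{4}{\delta}}{n}},
\]

where $R(h)$ is the true risk and $\hat{R}(h)$ the empirical risk. The bound is vacuous unless $n$ is large compared to $d$, which is exactly what makes the next observation puzzling.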

But the brain has on the order of 10^14 synapses, so it must be enormously complex. How can it generalize with so many free parameters? Are the vast majority of synaptic weights not actually learned, but preset somehow? Or is regularization implemented in some other way, perhaps by applying random changes to the weight values (which would seem biochemically plausible)?
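To illustrate that last possibility, here is a minimal numpy sketch of weight-level randomness acting as a regularizer, in the spirit of what was later called DropConnect: a random subset of the weights is zeroed on every training pass, so the network cannot lean on any single connection. The toy task, the `keep` probability, and all the variable names are my own illustration, not anything from the post or a claim about actual synapses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression: fit noisy samples of sin(x) with a small MLP
# that has far more weights (3*h + 1) than training points (n).
n = 30
x = np.linspace(-3, 3, n).reshape(-1, 1)
y = np.sin(x) + 0.1 * rng.normal(size=(n, 1))

h = 50
W1 = 0.5 * rng.normal(size=(1, h)); b1 = np.zeros(h)
W2 = 0.5 * rng.normal(size=(h, 1)); b2 = np.zeros(1)

keep = 0.8   # probability that a given weight survives a training step
lr = 0.01

for step in range(5000):
    # Randomly zero a subset of weights on each pass -- a crude stand-in
    # for "random changes to the weights" acting as capacity control.
    M1 = rng.random(W1.shape) < keep
    M2 = rng.random(W2.shape) < keep

    a = np.tanh(x @ (W1 * M1) + b1)        # hidden activations
    err = a @ (W2 * M2) + b2 - y           # residuals of the masked net

    # Backpropagate through the surviving weights only.
    gW2 = (a.T @ err) / n * M2
    gb2 = err.mean(axis=0)
    da = err @ (W2 * M2).T * (1.0 - a**2)  # tanh'(z) = 1 - tanh(z)^2
    gW1 = (x.T @ da) / n * M1
    gb1 = da.mean(axis=0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# At prediction time, use all weights scaled by the keep probability.
pred = np.tanh(x @ (W1 * keep) + b1) @ (W2 * keep) + b2
print("train MSE:", float(((pred - y) ** 2).mean()))
```

The point of the sketch is only that randomness over the weights, rather than a small parameter count, can do the capacity control; whether anything like this happens biochemically is exactly the open question.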

Also, the brain has a very high metabolic cost, so all those neurons must be doing something valuable.