This one equation may be the root of intelligence

post by morganism · 2016-12-10T23:23:13.258Z · LW · GW · Legacy · 11 comments

This is a link post for https://singularityhub.com/2016/12/07/this-one-equation-may-be-the-root-of-intelligence/


comment by gwillen · 2016-12-11T22:30:20.574Z · LW(p) · GW(p)

Can we not do clickbait titles on linkposts, please? Let's use the Hacker News rule -- default to the article title, but if it's not a good representation of the content of the article (e.g. it's clickbait), change it to something descriptive.

Replies from: Lumifer, Brillyant
comment by Lumifer · 2016-12-12T16:52:00.633Z · LW(p) · GW(p)

And in general, can we NOT try to evolve in the HuffPo direction?

comment by Brillyant · 2016-12-12T16:39:01.056Z · LW(p) · GW(p)

You won't believe this life-changing equation!

comment by Anders_H · 2016-12-12T02:56:28.887Z · LW(p) · GW(p)

Given that this was posted to LW, you'd think this link would be about a different equation...

Replies from: Pimgd
comment by Pimgd · 2016-12-12T11:16:27.305Z · LW(p) · GW(p)

Namely? Bayes? (TBH I wouldn't expect Bayes, because that'd be wrong, I think - you can have "dumb" intelligence based on reinforcement learning.)

comment by onlytheseekerfinds · 2016-12-13T19:33:38.928Z · LW(p) · GW(p)

This equation is simply the sum of (i choose k) for k in [1, i], which works out to 2^i - 1.

So what he's saying is that neural circuits following the principles he describes have one neuron to represent every possible combination of active inputs (every non-empty subset, hence 2^i - 1). It's the most brain-dead way you could possibly implement a classifier system.
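
A minimal sketch of that counting argument (not code from the paper; it just enumerates the non-empty input subsets to confirm the 2^i - 1 count):

```python
from itertools import combinations

def input_patterns(i):
    """All non-empty subsets of i inputs; under this reading,
    each subset would need its own neuron (or clique)."""
    return [c for k in range(1, i + 1)
            for c in combinations(range(i), k)]

for i in range(1, 6):
    patterns = input_patterns(i)
    assert len(patterns) == 2**i - 1
    print(i, len(patterns))  # 1 1, 2 3, 3 7, 4 15, 5 31
```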

comment by Viliam · 2016-12-12T18:07:59.224Z · LW(p) · GW(p)

Does the magical 2^i-1 equation predict that the human brain, with circa 85-86 billion neurons, can only contain 36 different concepts?
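
For reference, a quick check of where that 36 comes from (assuming, for the sake of the joke, that all ~86 billion neurons belong to a single microcircuit, which is not what the paper proposes):

```python
# Largest i such that 2**i - 1 still fits within ~86 billion neurons
neurons = 86_000_000_000
i = 0
while 2**(i + 1) - 1 <= neurons:
    i += 1
print(i, 2**i - 1)  # prints: 36 68719476735
```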

Replies from: onlytheseekerfinds
comment by onlytheseekerfinds · 2016-12-13T19:13:30.972Z · LW(p) · GW(p)

From a paper by Dr. Tsien, retrieved from http://www.augusta.edu/mcg/discovery/bbdi/tsien/documents/theoryofconnectivity.pdf

Fifth, this power-of-two mathematical logic confines the total numbers of distinct inputs (i) coming into a given microcircuit in order to best utilize the available cell resources. For instance, as a result of its exponential growth, at a mere i = 40, the total number of neurons (n) required to cover all possible connectivity patterns within a microcircuit would be more than 10^12 (already exceeding the total number of neurons in the human brain). For Caenorhabditis elegans – which has only 302 neurons, limiting i to 8 or less at a given neural node makes good economic sense. Furthermore, by employing a sub-modular approach (e.g., using a set of four or five inputs per subnode), a given circuit can greatly increase the input types it can process with the same number of neurons.

He also mentions cortical layering. It seems like he's envisioning the brain as a forest of smaller, relatively shallow networks following the principles he describes, rather than one tree where all neurons are wired together in a uniform way.
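
The quoted figures are easy to sanity-check under the same 2^i - 1 counting (a rough check of my own, not anything from the paper):

```python
# i = 40 already needs more neurons than a human brain has (~8.6e10)
print(2**40 - 1)           # 1099511627775, i.e. more than 10^12
# With C. elegans' 302 neurons, i = 8 is the largest value that fits
print(2**8 - 1, 2**9 - 1)  # 255 511
```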

comment by morganism · 2016-12-10T23:26:51.591Z · LW(p) · GW(p)

"In stark contrast, Tsien predicts the brain runs on a series of pre-programmed, conserved networks. These networks are not learned; instead, they’re made up of pre-established neural networks, wired according to a simple mathematical principle.

In other words, at a fundamental level the brain’s wiring is innate — the motifs, established by genetics, underlie our ability to extract features, discover relational patterns, abstract knowledge and ultimately, reason."

Brain Computation Is Organized via Power-of-Two-Based Permutation Logic

http://journal.frontiersin.org/article/10.3389/fnsys.2016.00095/full

" the unifying mathematical principle upon which evolution constructs the brain’s basic wiring and computational logic represents one of the top most difficult and unsolved meta-problems in neuroscience"

"This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex."

comment by BiasedBayes · 2016-12-11T23:25:26.925Z · LW(p) · GW(p)

Thanks! Very interesting!

comment by entirelyuseless · 2016-12-11T16:59:42.075Z · LW(p) · GW(p)

And the answer to the question about Life, the Universe, and Everything is... 42.