Mutable Levels Are Path-Dependent

post by XFrequentist · 2011-05-06T20:47:02.976Z · LW · GW · Legacy · 17 comments

Cyan and I had a good discussion (several, actually) on our long recent drive. Among other topics, he explained something that led to a confusion-reducing revelation.

On meeting Eliezer, Cyan wished to engage him on Measure Theory. Cyan felt that it might be a useful tool for Eliezer's FAI work. The conversation was interesting (I'll leave it to someone more mathematically competent to summarize), but what I remarked on was how it started.

Cyan began, "Eliezer, I'd say you're at least 3 levels above me, but I thought I could offer some advice on Measure Theory..."

I took this to be (needless) status-lowering behavior, and mentioned this in a later conversation. However, it turned out that this wasn't so: there was original thinking behind the number "3".

Cyan's formalization of "levels" involved the creation of useful new concepts. Someone a single level above you can create concepts that you can understand, but could not generate on your own.

Cyan felt that Average Physicists were a level above him, Elite Physicists a level higher, and estimated that Eliezer was a level beyond that. Eliezer's original concept of Levels seems to imply that one's level is biologically determined and immutable, and so a single "EY-level" is probably akin to the highest possible "Cyan-level". I will therefore refer to Mutable (Cyan-levels) and Immutable (EY-levels) Levels to distinguish the two. The former describes the attainment of one's greatest potential; the latter describes that potential itself.

I thought that this was a useful way to think about levels and "leveling up", but not completely right. I didn't think that levels were generalizable. Alice, Bob, and Cheryl might form a natural chain in which each is one Mutable Level above the previous person, but the chain could look very different with different players: Alice, Zorba, Xeno, Yudkowsky, and Cheryl could form an equally logical chain in which each person is one Mutable Level above the previous one. In the first chain Cheryl sits two levels above Alice; in the second she sits four. So it doesn't make sense to refer to someone as X levels above you in the abstract.

Cyan and I eventually agreed that this makes more sense, and has the additional benefit of providing a useful way to guide the search for mentors.

Therefore I make several claims I'd like the group's thoughts on:

  1. A useful way to think about "Levels" is as describing increasing ability to produce concepts of greater explanatory power or insight.
  2. It is useful to think of someone as a level above you if they can generate novel ideas that you can only understand, but could not have produced from scratch.
  3. Levels can be usefully thought of as Mutable (if they are amenable to improvement through study or holistic self-improvement) or Immutable (if they are biologically, or otherwise, determined and fixed).
  4. One's maximally attainable Mutable Level is equivalent to one's Immutable Level; the latter describes potential, while the former describes the attainment of that potential. 
  5. Mutable Levels are path-dependent; it makes no sense to talk about levels abstractly, only in relation to specific individuals (or specific groups whose members have very similar abilities in the domain of interest). (See the sketch after this list.)
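
To make the path-dependence in claim 5 concrete, here is a minimal Python sketch using the hypothetical chains from above, in which "levels above" is just a count of positions within a particular chain; the same pair of people comes out a different number of levels apart depending on which chain you consult:

    # Hypothetical chains, ordered from lowest to highest Mutable Level.
    chain_a = ["Alice", "Bob", "Cheryl"]
    chain_b = ["Alice", "Zorba", "Xeno", "Yudkowsky", "Cheryl"]

    def levels_above(chain, higher, lower):
        """How many Mutable Levels 'higher' sits above 'lower' within one chain."""
        return chain.index(higher) - chain.index(lower)

    print(levels_above(chain_a, "Cheryl", "Alice"))  # prints 2
    print(levels_above(chain_b, "Cheryl", "Alice"))  # prints 4

Nothing about Cheryl or Alice alone fixes the gap; it is a property of the chain of people in between, which is the sense in which claim 5 says Mutable Levels are path-dependent.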

17 comments

Comments sorted by top scores.

comment by Vladimir_Nesov · 2011-05-06T23:31:13.574Z · LW(p) · GW(p)

Being able to understand anything (described in the literature, or through interaction with people who already understand it) given enough study doesn't seem like a terribly high bar, which makes the classification somewhat less useful than it may appear. Actually having an understanding of specific things (rather than merely a potential to attain it) is more relevant.

comment by gjm · 2011-05-06T21:08:33.159Z · LW(p) · GW(p)

Perhaps this is too obvious to need mentioning, but: "Levels" (defined in any way like the one used here) are topic-dependent as well as path-dependent. A might be a couple of levels above B in mathematics but a couple of levels below B in programming, or something. (That particular situation seems unlikely to happen if you're looking at "Immutable Levels", since mathematics and programming are quite closely related -- though it's certainly not obviously impossible. With other pairs of fields it could probably happen with "Immutable Levels" too.)

Replies from: komponisto, XFrequentist, MinibearRex
comment by komponisto · 2011-05-07T01:50:57.456Z · LW(p) · GW(p)

If you think that "Immutable Levels" can be subject-specific, then you're probably forgetting to think reductionistically about the subjects in question.

There is unlikely to be a specific mathematics- or programming-module in the brain (that could be stronger in one person than another), since there wasn't any mathematics or programming per se in the ancestral environment. Rather, what we perceive as "ability" in these subjects reduces to more basic lower-level cognitive abilities, which may differ among people. But there's no law that says everyone has to use the same set of lower-level abilities, in the same way, in order to perform feats of programming or mathematics. There are likely a variety of possible low-level cognitive paths to any higher-level human cultural function, and I suspect that anybody with significant intellectual ability in one domain could develop comparable ability in any other given appropriate motivation, rationality training, and social support.

Replies from: gjm
comment by gjm · 2011-05-07T09:07:58.787Z · LW(p) · GW(p)

I'm sure there aren't specific mathematics and programming modules in the brain. There certainly are different kinds of functionality in the brain (how helpful it is to think of it as divided into modules is debatable), and some of it is more useful than other parts for doing mathematics or programming or whatever.

(Camera A may be better than camera B for landscape photography and camera B better than camera A for taking pictures at rock concerts. That doesn't require that the cameras have landscape-specific modules. All it needs is that, e.g., A has better optics and a higher-resolution sensor, while B has better autofocus and less noise when its sensitivity is cranked way up.)

So, if it sometimes happens that (1) two people's natural mental faculties differ drastically in their "distribution of power" and (2) there's a substantial difference in how useful the bits where they differ are for two different kinds of thinking, then there will be subject-specific "immutable levels".

Prima facie it seems that both #1 and #2 do happen. Of course it might turn out that #2 doesn't really, because (a) differences in intellectual power are all malleable, so that no kinds of "immutable levels" are real, or (b) differences in mental power are concentrated in some universal "g-factor" that affects all kinds of thinking equally, or (c) it just so happens that more specific variations always more or less cancel out overall (it's hard to see how that would work, but who knows?).

I'd be interested to see evidence for any of those, or for any other good reason to reject #1 and #2. So far, though, they look pretty plausible, and I don't see where I've failed at reductionism any further than we all have to on account of not actually knowing very much about how intelligence works.

It's also worth noting that "immutable" means something like "largely fixed from adulthood on" rather than "largely determined by one's genes", and whatever there may have been in our ancestral environments there are all kinds of interesting things in our early environments when our brains are most malleable that might correlate with later ability in fields like number theory, psychology, AI programming, music, etc.

(Examples, in case they're needed, of mental faculties that seem like reasonable candidates for varying differently in different people, and that could be of varying importance across different fields: Visual imagination. Short-term memory. Modelling of other people's likely feelings, thoughts, and behaviour. Language acquisition. Visual pattern-spotting. Auditory pattern-spotting.)

comment by XFrequentist · 2011-05-06T22:47:01.402Z · LW(p) · GW(p)

Of course, agreed.

comment by MinibearRex · 2011-05-06T22:18:47.125Z · LW(p) · GW(p)

I agree. For instance, I have very little experience in programming (limited to one semester of C++ in college), and so I cannot discuss even fairly basic level programming topics intelligently. On the other hand, I can easily discuss high level biology/chemistry etc. I have a good intuitive grasp of mathematics/programming; I've always learned those subjects faster than the people around me, so I think I probably could develop those skills pretty far, but I have never had the time/pragmatic incentive to learn the more advanced stuff.

comment by Manfred · 2011-05-06T22:27:28.366Z · LW(p) · GW(p)

Is the idea also that someone two levels below me would not be able to understand some of my ideas? Because this seems like two different things, making this "level" stuff much fuzzier. And of course it's probabilistic in nature anyhow - on occasion I generate ideas my friend couldn't, and he can on occasion generate ideas I couldn't.

I prefer to measure this stuff in years, which is a function of both experience and potential. E.g. I am about 5 years behind my boss in the sense that in 5 years I think I'll be able to think like him in the areas I most care about, by some fuzzy measure of likeness.

Replies from: candid_theist, XFrequentist
comment by candid_theist · 2011-05-07T18:00:52.803Z · LW(p) · GW(p)

Level is obviously antireflexive. It is a tautology that I will never generate an idea I am incapable of generating.

And of course it's probabilistic in nature anyhow - on occasion I generate ideas my friend couldn't, and he can on occasion generate ideas I couldn't.

Manfred points out that this level concept may not be antisymmetric. Others have pointed out that level may depend on the topic of expertise. For that matter, I'll claim that the concept of level can be applied to artistic pursuits like music, painting, and dancing, not just rational pursuits like math, physics, and programming.

So what if we say: A is higher level than B at topic X if the "value" of ideas per unit time which A generates but B could not is greater than the "value" of ideas per unit time which B generates but A could not. Now we have something antisymmetric.

So now, is this relation transitive? Or, turning the question around: is it possible that Alice is higher level at math than Bob, Bob is higher level at math than Carol, and Carol is higher level at math than Alice?
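
As a minimal sketch of why the answer isn't obvious, here is a toy Python model of the relation just proposed; the "value" numbers are entirely made up for illustration. Under these hypothetical scores the relation is antisymmetric by construction, yet Alice beats Bob, Bob beats Carol, and Carol beats Alice, so it fails to be transitive:

    # Made-up scores: value[(A, B)] = "value" per unit time of ideas A generates
    # that B could not. The relation: A is higher level than B iff
    # value[(A, B)] > value[(B, A)].
    value = {
        ("Alice", "Bob"): 3, ("Bob", "Alice"): 1,
        ("Bob", "Carol"): 3, ("Carol", "Bob"): 1,
        ("Carol", "Alice"): 3, ("Alice", "Carol"): 1,
    }

    def higher_level(a, b):
        return value[(a, b)] > value[(b, a)]

    for a, b in [("Alice", "Bob"), ("Bob", "Carol"), ("Carol", "Alice")]:
        print(a, "higher level than", b, "?", higher_level(a, b))  # True each time

Whether real people's idea-values could actually form such a cycle is exactly the open question.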

Replies from: Cyan
comment by Cyan · 2011-05-07T18:36:12.812Z · LW(p) · GW(p)

It is a tautology that I will never generate an idea I am incapable of generating.

Reminder: my original idea was

Someone a single level above you can create concepts that you can understand, but could not generate on your own...

...and the concepts generated by someone two levels above you are beyond reach.

Replies from: candid_theist
comment by candid_theist · 2011-05-07T19:23:12.321Z · LW(p) · GW(p)

...and the concepts generated by someone two levels above you are beyond reach.

Interesting, but this second part isn't mentioned in the original post. And the added constraint makes the whole system seem less likely to be useful to me, never mind mathematical rigor. YMMV, I suppose.

Replies from: Normal_Anomaly, Cyan
comment by Normal_Anomaly · 2011-05-07T20:19:39.288Z · LW(p) · GW(p)

And the added constraint makes the whole system seem less likely to be useful to me, never mind mathematical rigor. YMMV, I suppose.

My mileage does vary. I took the added constraint as implied, and I think it makes the whole system more useful.

Replies from: candid_theist
comment by candid_theist · 2011-05-07T21:42:28.489Z · LW(p) · GW(p)

My instinct was to ignore this reply, but I recently read a suggestion that among sufficiently rational people there is never simply a need to agree to disagree. Do you folks on this site have some sort of standard disclaimer that questions are grounded in curiosity, and are not meant to belittle anyone's experience or opinion? In any case, I'm just curious. These questions are directed to Cyan and/or Normal Anomaly and/or anyone else with a similar reaction.

Suppose that within a given domain of knowledge, Alice can create concepts that Bob can understand but not generate, and Bob can create concepts that Carol can understand but not generate. Does this imply:

  • Alice is two levels above Carol?
  • Nothing in particular, because this is not the intended semantic meaning of "two levels above"?
  • Any concept created by Alice is beyond Carol's reach? (I doubt this.)
  • Alice is capable of generating some concepts (at least one) that are beyond Carol's reach?

I'm also confused about what it means for a concept to be beyond someone's reach. The closest experience I can think of is a mathematical theorem I cannot understand. But usually the cause of that is that I do not understand one or more of the definitions or theorems involved in the statement of the theorem itself, and enough study could presumably resolve that.

Or maybe the concept of a concept beyond someone's reach is beyond my reach.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2011-05-07T23:28:11.290Z · LW(p) · GW(p)

Disclaimer: Any discussion of XFrequentist's model in this comment is not necessarily how XFrequentist thinks of it, but rather my variant on it.

My instinct was to ignore this reply, but I recently read a suggestion that among sufficiently rational people there is never simply a need to agree to disagree. Do you folks on this site have some sort of standard disclaimer that questions are grounded in curiosity, and are not meant to belittle anyone's experience or opinion? In any case, I'm just curious. These questions are directed to Cyan and/or Normal Anomaly and/or anyone else with a similar reaction.

The community norm is that questions are grounded in curiosity. I've never seen anyone take offense at an honestly asked question.

Suppose that within a given domain of knowledge, Alice can create concepts that Bob can understand but not generate, and Bob can create concepts that Carol can understand but not generate. Does this imply:

  • Alice is two levels above Carol?
  • Nothing in particular, because this is not the intended semantic meaning of "two levels above"?
  • Any concept created by Alice is beyond Carol's reach? (I doubt this.)
  • Alice is capable of generating some concepts (at least one) that are beyond Carol's reach?

My interpretation (assuming this all takes place within one subject area) is that:

  • Yes iff Alice can generate concepts Carol cannot understand,
  • No,
  • No,
  • Probably, but not necessarily (see bullet 1).

I'm also confused about what it means for a concept to be beyond someone's reach. The closest experience I can think of is a mathematical theorem I cannot understand. But usually the cause of that is that I do not understand one or more of the definitions or theorems involved in the statement of the theorem itself, and enough study could presumably resolve that.

If we are talking about Immutable Levels, a concept beyond my reach is one that I will never understand no matter how much I study or how well it is explained to me. I cannot name any concepts I've encountered that seem to be beyond my reach in this sense, except maybe General Relativity. That one could just be a lack of math background.

If we are talking about Mutable Levels, a concept beyond my reach is one that I could not learn without further study of background material.

comment by Cyan · 2011-05-07T19:26:17.620Z · LW(p) · GW(p)

An oversight -- I'll see if I can get XFrequentist to add it in.

comment by XFrequentist · 2011-05-06T23:01:01.176Z · LW(p) · GW(p)

No, I agree that this interpretation isn't useful.

The probabilistic interpretation is neat; that rings true.

Replies from: komponisto
comment by komponisto · 2011-05-07T01:58:59.673Z · LW(p) · GW(p)

Is the idea also that someone two levels below me would not be able to understand some of my ideas?

No, I agree that this interpretation isn't useful.

I don't understand why not; it seems entirely equivalent. (Is this just an act of signaling -- a manifestation of the fact that it's impolite to acknowledge the existence of levels below oneself -- or what?)

comment by Zetetic · 2011-05-08T07:39:38.845Z · LW(p) · GW(p)

I see a few comments focusing on the metric as a valuation of understanding; I suppose a few people may have missed this:

It is useful to think of someone as a level above you if they can generate novel ideas that you can only understand, but could not have produced from scratch.

Bearing this portion of the claim in mind, here are my initial thoughts:

It seems a bit (only a very, very little bit) better to me than simply understanding an idea, but this whole 'levels' business seems far too vague to be useful, especially if we don't have any sort of objective metric for determining exactly how difficult a novel idea is to generate.

Intuitively it seems like the task of creating such a metric would require an understanding of all (or at least most) of the (cognitive and extra-cognitive) factors that play into idea generation, which makes it a very non-trivial task.

On top of the issue of having an objective metric for idea evaluation, we have to determine a valid metric for idea generation potential, because as it stands it appears to me that all we have to go on is the feeling that something is too hard to do, which does not seem even remotely reliable.

Further still, if we want to distinguish between "mutable" and "immutable" levels, it seems that we would have to determine what factors that go into idea generation can be effectively hacked in a reasonable period of time, which is yet another highly non-trivial problem. Bearing in mind these issues, it seems that the "levels" would almost certainly need to be significantly more gradated than this Level += 1, Mutable/Immutable structure.

In addition, even taking into account the reductionist concerns pointed out by komponisto, it seems very likely that bottom-level cognitive features would play into idea generation in various ways such that even if individual tasks were not level-separable (in the sense that gjm is talking about), some sets of tasks likely would be (or at least very nearly be, even if the division isn't always totally clean).