Describe your personal Mount Stupid

post by Shmi (shminux) · 2012-01-03T18:37:38.021Z · LW · GW · Legacy · 48 comments

A little knowledge is a dangerous thing; I'm not sure what the official name for this particular cognitive bias is (feel free to enlighten me). Most of us can probably recognize that feeling of enlightenment after learning a bit of something new and exciting, before we realize how far we still are from mastery of the subject. I suspect that learning the LW brand of rationality is one of those. (Incidentally, if the words "LW brand of rationality" irked you because you think there is only one true rationality, consider how close you might be to that particular summit of Mt. Stupid.) See also the last bullet point in the linked comic strip.

As an exercise in rationality, I suggest people post personal accounts of successfully traversing Mt. Stupid, or maybe getting stuck there forever, never to be heard from again. Did you find any of the techniques described in the Sequences useful for overcoming this bias, beyond the obvious one of continuing to learn more about the topic in question? Did you manage to avoid turning Mt. Stupid into the Loggerhead range?

My example: I thought I was great at programming fresh out of college, and ready to dispense my newfound wisdom. Boy, oh boy, was I ever wrong. And then it happened again when I learned some more of the subject on the job...

48 comments

Comments sorted by top scores.

comment by Zed · 2012-01-03T21:29:11.628Z · LW(p) · GW(p)
  1. Macroeconomics. My opinion and understanding used to be based on undergrad courses and a few popular blogs. I understood much more than the "average person" about the economy (so say we all) and therefore believed that my opinion was worth listening to. My understanding is much better now, but I still lack a good grasp of the fundamentals (because textbooks disagree so violently on even the most basic things). If I talk about the economy, I phrase almost everything in terms of "Economist Y thinks X leads to Z because of A, B, C." This keeps the different schools of economics from blending together into some incomprehensible mess.

  2. QM. Still on Mount Stupid, and I know it. I have to bite my tongue not to debate Many Worlds with physics PhDs.

  3. Evolution. Definitely on Mount Stupid. I know this because I used to think group selection was a good argument until EY persuaded me otherwise. I haven't studied evolution since, so I must be on Mount Stupid still.

Aside from being aware of the concept of Mount Stupid, I have not changed my behavior all that much. If I keep studying, I know I'm going to get beyond Mount Stupid eventually. The faster I study, the less time I spend on top of Mount Stupid and the less likely I am to make a fool of myself. So that's my strategy.

I have become much more careful about monitoring my own cognitive processes: am I saying this just to win the argument? Am I looking specifically for arguments that support my position, and if so, am I sure I'm not rationalizing? So in that respect I've improved a little. It's probably the most valuable sort of introspection that typical well-educated, intelligent people lack.

One crucial point about Mount Stupid that hasn't been mentioned here yet is that it applies every time you "level up" on a subject: each level-up puts you in a new valley with a new Mount Stupid to cross. You can be an expert frequentist rationalist but a lousy Bayesian rationalist, and by learning a little about Bayesianism you can become stupider: you're good at distinguishing good from bad frequentist reasoning, but you can't tell the difference for Bayesian reasoning (and if you don't know that you can't tell the difference, you're also on Meta Mount Stupid).

Replies from: dlthomas
comment by dlthomas · 2012-01-03T21:59:50.385Z · LW(p) · GW(p)

If you're successfully biting your tongue, doesn't that put you off "Mount Stupid", since the y-axis is "willingness to opine on topic"?

Replies from: Zed
comment by Zed · 2012-01-03T22:04:14.535Z · LW(p) · GW(p)

I don't think so, because my understanding of the topic didn't improve -- I just don't want to make a fool out of myself.

I've moved beyond Mount Stupid on the meta level, the level where I can now tell more accurately whether my understanding of a subject is lousy or OK. On the subject level I'm still stupid, and my reasoning, if I had to write it down, would still make my future self cringe.

The temptation to opine is still there and there is still a mountain of stupid to overcome, and being aware of this is in fact part of the solution. So for me Mount Stupid is still a useful memetic trick.

Replies from: dlthomas
comment by dlthomas · 2012-01-03T22:09:40.918Z · LW(p) · GW(p)

Maybe you leveled the mountain? :-P Being "on" the mountain while not being willing to opine just seems like a strange use of words.

Replies from: Zed
comment by Zed · 2012-01-03T22:29:31.419Z · LW(p) · GW(p)

Does Mount Stupid refer to the observation that people tend to talk loudly and confidently about subjects they barely understand (but not about subjects they understand so poorly that they know they must understand them poorly)? In that case, yes, once you stop opining the phenomenon (Mount Stupid) goes away.

Mount Stupid has a very different meaning to me. To me it refers to the idea that "feeling of competence" and "actual competence" are not linearly correlated. You can gain a little in actual competence and gain a LOT in terms of "feeling of competence". This is when you're on Mount Stupid. Then, as you learn more your feeling of competence and actual competence sort of converge.

The picture that puts "Willingness to opine" on the Y-axis is, in my opinion, a funny observation of the phenomenon that people who learn a little bit about a subject become really vocal about it. It's just a funny way to visualize the real insight (Δ feeling of competence != Δ competence) in a way that connects with people, because we can probably all remember making that specific mistake (talking confidently about a subject we knew little about).
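
A quick way to picture that gap is to sketch the two curves directly. This is a minimal Python/matplotlib sketch; the curve shapes are made-up assumptions chosen only to show the early bump and the later convergence, not data from any study:

    import numpy as np
    import matplotlib.pyplot as plt

    # Purely illustrative curves: the exact shapes are assumptions chosen to
    # show an early overconfidence bump that fades as real competence grows.
    t = np.linspace(0, 10, 500)                 # time spent studying (arbitrary units)
    actual = 1 - np.exp(-0.35 * t)              # slow, steady gain in actual competence
    felt = actual + 0.9 * t * np.exp(-1.2 * t)  # same gain plus a transient "Mount Stupid" bump

    plt.plot(t, actual, label="actual competence")
    plt.plot(t, felt, label="feeling of competence")
    plt.xlabel("time spent studying")
    plt.ylabel("competence (arbitrary scale)")
    plt.legend()
    plt.title("Stylized Mount Stupid: felt vs. actual competence")
    plt.show()

Early on, the "feeling" curve sits well above the "actual" curve; as study time grows, the bump decays and the two converge, which is the Δ feeling of competence != Δ competence point in picture form.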

Replies from: dlthomas
comment by dlthomas · 2012-01-03T22:42:31.238Z · LW(p) · GW(p)

I understood it to come from here, but if there's another source or we wish to adopt a different usage I'm fine with that. Actual vs. perceived competence is probably a more useful comparison.

Replies from: Zed
comment by Zed · 2012-01-03T22:52:43.920Z · LW(p) · GW(p)

That comic is my source too. I just never considered taking it at face value (too many apparent contradictions). My bad for mind projection.

comment by Manfred · 2012-01-03T21:41:28.382Z · LW(p) · GW(p)

I think it's a common trend for your first discussion post on LW to be made from somewhere on Mount Stupid. It certainly was for me.

I can't think of any current ones, which is partially because I've gotten good at not saying things I don't know, so there aren't things I know I don't know that I still say.

Replies from: Solvent
comment by Solvent · 2012-01-04T05:36:04.030Z · LW(p) · GW(p)

Same for me, back in my first LW account which I deleted out of embarrassment.

comment by Nornagest · 2012-01-03T21:59:02.003Z · LW(p) · GW(p)

not sure what the official name for this particular cognitive bias is (feel free to enlighten me)

I've usually heard it discussed in terms of the Dunning-Kruger effect, although that seems slightly different from the model the SMBC comic describes; subjective certainty isn't quite the same thing as willingness to opine, although they're certainly closely linked.

My own Mount Stupid was highly general and came pretty early; as an older child or a younger teenager I was prone to holding forth on anything I had a model of, even if I'd come up with the model on the spot based on anecdotal evidence. I generally got away with it as long as I was speaking privately with groups that didn't have much collective knowledge; the appearance of certainty can give you a lot of intellectual status.

I don't think it fully went away until I'd lost most of my political partisanship (motivated thinking seems like a great way to stay on Mount Stupid), but the popularization in my mid-teens of modern Internet forums (BBSes and Usenet had been around for a while, but I hadn't discovered them) probably drove the first nails into its coffin. Suddenly intellectual status wasn't defined by being able to say the most reasonable-sounding thing at any given moment; statements were persistent, and could be effectively refuted well after the fact. Basic fact-checking became a necessity, and actual research became a good idea if I was broaching a contentious topic. Eventually it got to be a habit. I'm probably still stuck in a few local maxima on various topics, but even the foothills on the far side of Mount Stupid are a lot less embarrassing than its peak if you spend a lot of time with persistent media.

Rationality techniques are helpful, especially in estimation of confidence, but knowing to use them seems to be more a matter of style than of knowledge; it's all too easy to treat rationality skills as a means to winning arguments. This certainly falls under the umbrella of rationality, and the Sequences discuss it in a number of places (the first one that comes to mind is the arguments-as-soldiers metaphor), but I'm not sure I'd call it a skill as such.

comment by windmil · 2012-01-03T19:59:24.529Z · LW(p) · GW(p)

I remember realizing, not too long ago, how silly I was being right after reading the Quantum Physics sequence here. I would watch popular science shows and have to have a little rant about how they were ALL WRONG! (Though I still admit any given popular science show can say some silly things.) Every time I went to explain how they were ALL WRONG, I realized I was just repeating some secondhand (at least) and very opinionated ideas, without much deep understanding of my own. But I would keep going.

So I've decided to stop, because it's irrational and, I'd bet, at least a bit annoying.

Replies from: khafra
comment by khafra · 2012-01-03T21:33:12.680Z · LW(p) · GW(p)

I think this is a pretty universal Rule of Stupid for me. Many things that I read about and opine on without actually doing, I later learn enough about to feel embarrassed. I've gotten a bit quieter about pieces of knowledge I haven't used to accomplish something.

comment by Morendil · 2012-01-04T09:50:25.821Z · LW(p) · GW(p)

I distinctly remember one occasion when I was instructing a group of software engineers on the topic of "agile planning", and I started drawing a picture of the "cone of uncertainty".

And I stopped dead in my tracks.

Because I'd just realized I hadn't the foggiest idea what I was talking about, or how I'd know if it made sense. I was just parroting something I'd read somewhere, and for once trying to explain it wasn't helping me understand it better, it was just making me confused. And all I wanted to say anyway was "don't trust estimates made at the beginning of a project".

Fortunately nobody noticed (that often happens, and is a topic in its own right). I moved on, stopped using that picture, and mostly forgot about it.

This was a few years back, and I like to think that in the meantime I've traversed the valley that lies behind Mt. Stupid, and am now ready to start talking about it again. In particular, what scares me - you'll be scared too if you google for "software" and "cone of uncertainty" - is how many people in the profession are still stuck on the summit: willing to opine at length about the Cone, without an inch of critical distance from what they're quoting. Is it conceptual, speculative, or empirical? If the latter, how well supported? People quoting it don't know and don't care.

Replies from: Karmakaiser
comment by Karmakaiser · 2012-01-09T17:53:17.951Z · LW(p) · GW(p)

I rather like the term "cone of uncertainty." It seems like a spell a third-level wizard (or perhaps a junior-year philosophy student) could cast.

Replies from: Morendil, Multiheaded
comment by Morendil · 2012-01-09T17:59:56.008Z · LW(p) · GW(p)

Indeed. :)

comment by Multiheaded · 2012-01-14T11:49:55.320Z · LW(p) · GW(p)

Are you sure?

Replies from: Karmakaiser
comment by Karmakaiser · 2012-01-14T14:00:41.176Z · LW(p) · GW(p)

Existentialism is a junior class at my university's philosophy department, so yes.

Replies from: Multiheaded
comment by Multiheaded · 2012-01-15T06:08:54.528Z · LW(p) · GW(p)

I was just making an atrocious joke here.

Replies from: Karmakaiser
comment by Karmakaiser · 2012-01-15T06:27:22.620Z · LW(p) · GW(p)

So was I.

comment by Grognor · 2012-01-04T08:08:37.304Z · LW(p) · GW(p)

I had a big one regarding psychology after taking an introductory psychology course with the best textbook I've ever read on anything, ever. The textbook was so comprehensive, and my recall of it so good, that it quickly became very common for me to shoot down (extremely) basic psychology misconceptions, and I acquired an inflated view of my understanding of psychology. The primary trouble was that it only covered a few of the standard biases, like hindsight bias and the illusion of control, so I thought I was much less biased about psychology than I actually was...

I also had one regarding knowledge. If you had talked to me a couple of years ago, I could have waxed endlessly on uncertainty and the unknowable nature of infinity and all sorts of nonsense, which was only starting to dissolve when I finally discovered LessWrong, after which I realized I had been a complete fool, and it dissolved completely.

Replies from: tgb
comment by tgb · 2012-01-05T14:14:45.581Z · LW(p) · GW(p)

It's practically cruel of you to say this without telling us the textbook's name. Even after bringing up this downside, I'd want to read it.

Replies from: Grognor
comment by Grognor · 2012-01-06T04:36:12.595Z · LW(p) · GW(p)

I don't even consider that a downside. Heuristics and biases isn't a huge field of psychology. That was me using my knowledge foolishly.

The book is Psychology, by David G. Myers.

(I wanted to recommend it here, but I haven't read enough introductory psychology textbooks to qualify.)

comment by kalla724 · 2012-01-06T21:40:49.515Z · LW(p) · GW(p)

Hm. Let me write a short "Defense of Mount Stupid" (I love the term, by the way; great fan of SMBC, too). Or, to be more accurate, I'm going to muse around the topic, in hope this shakes out some other ideas and thoughts on the subject.

I spent quite a bit of my late teens/early twenties yelling at people from the top of Mount Stupid. As Nornagest describes, internet debates changed everything: once the words were permanently written, I couldn't bullshit as effectively, nor could I easily change the subject if some bit of bullshit got caught. I found my baseless assertions and half-assed pontifications challenged by people who had better knowledge of the subject and better understanding of the data. It took about two or three spectacular shootdowns, but I eventually learned my lesson. A permanent record is a good thing, and these days I refuse to enter into content-intensive verbal debates whenever possible. Let's sit down, sonny, and write our arguments down.

However, I eventually also found my fear of Mt. Stupid quite limiting. One can't become an expert in every area, or sift through the data in every field that is interesting or relevant to one's life. There isn't a real option to ignore those areas either.

Take, for instance, economics. My knowledge is very limited: a few books, a class or two, and reading some blogs from time to time. If I'm honest about it, I have to admit that I will never become an expert either: it is highly improbable that I will ever have the time or motivation to study the area in great depth. And yet I do have an opinion on economics. Current economic policies will have a profound effect on my future life; I have to try to guess what the effects of those policies are, at least in terms of avoiding catastrophic outcomes. And I have to choose political candidates, who are pushing economic reforms based (as far as I can see) on an understanding of economics even shallower than mine.

Now, it is possible, even likely, that I'm just indulging an illusion. Even professional economists can't really predict the future effects of current policies in much detail; I probably shouldn't even be trying. But then my choices for setting up my economic future become almost a toss of a coin, or have to be based on blind faith in whatever economic adviser I end up with (and how do I choose one in the first place? a toss of the coin again?). I am more or less forced to believe that my limited knowledge has some value, even if that value is very low.

Fine, let's say this is ok - have an opinion, if you must. The point is, don't opine - if you really understand how limited your knowledge is, don't go out there talking about what you think.

What happens if I just stay quiet? In that case, I will be the only one who has access to my own opinion, vastly increasing the power of my confirmation bias. Sure, I can try to intentionally read opinions of people I disagree with, but that is unlikely to happen: I will rarely be able to summon motivation to read things I believe to be simply wrong. If I do manage to make myself read, I will likely skim and skip, and end up reinforcing my opinion by missing the entire point. Even worse, I don't know what I'm looking for. If there is some bit of economic information that is very important for some aspect of my economic thinking, I probably won't ever find it, or won't realize what it is if I stumble upon it. My understanding of the structure of economics just isn't good enough.

If I do opine, if I risk climbing Mt. Stupid, my opinions will be challenged. This will motivate me to do research and to check both my and my opponent's data, forcing me to keep learning and to keep re-evaluating my position. Others, whether they agree or disagree with me, will come up with things I have never thought of. The opponents won't be writing general opinions (which I can think around, or dismiss as irrelevant due to some particular point or technicality not mentioned by their book or article), they will be trying to undermine my specific position with precisely aimed arguments. If I escape an argument by invoking some detail, they can adapt and expand their argument to block off the escape route. Debate can thus be a cognitive tool that motivates learning and promotes growth.

If the above reasoning holds, it follows that Mt. Stupid should not always be avoided. If I remember the limits of my knowledge, if I try to resist my biases as much as that is possible, it can provide a way for growth in areas that would otherwise remain static.

Thoughts?

Replies from: TheOtherDave, Karmakaiser
comment by TheOtherDave · 2012-01-06T22:12:26.292Z · LW(p) · GW(p)

My usual experience is that when I express my current beliefs as my current beliefs, with some nod to why I hold them, and if I actually pay attention to counterarguments and update my beliefs when those counterarguments are convincing, the end result is that I don't end up climbing Mount Stupid... I'm simply wrong, which happens all the time.

Replies from: kalla724
comment by kalla724 · 2012-01-09T22:52:07.961Z · LW(p) · GW(p)

An entirely valid point, in the narrow definition of Mount Stupid. I used it more broadly to mean "I'm not only holding an opinion about a topic in which I'm not an expert, I'm also in a situation where I have to express that opinion in public." In this case, Mt. Stupid covers your approach (which is the reasonable approach I also use, and which is the closest we can get to rationality). The point of the above was to provoke other people's thoughts on the general approach of what to do when you must have a non-expert opinion.

comment by Karmakaiser · 2012-01-09T17:45:17.894Z · LW(p) · GW(p)

I very broadly agree with you. And I think it would be helpful to discover how one finds oneself on Mount Stupid and how to properly descend. This is all only from my own personal experience, so I may well be on Mount Stupid opining about Mount Stupid. But from whatever hill, mountain, or peak I rest on, these have been my observations so far:

Some popularizer will write a book, essay, or regular column and explain in broad strokes a mixture of his opinions, the hard facts, and the scholarly consensus on what those facts mean. To a layman, that mixture of tonic, toxic, and placebo is impossible to sort out. Further, since the popularizer cannot go too many inferential steps ahead of the reader without needing to split a book into multiple volumes, explainers commonly resort to lies-to-children. Instead of having a healthy respect for the known unknowns and the difficulty of a given subject, readers then feel free to shout from Mount Stupid's peak. After all, it is not they who suffer any unfortunate avalanches as they yell, at the top of their lungs, half-remembered quotes from a journalist's attempt to explain experimental physics. Those on Mount Stupid did not climb to get there. Somebody built a ski slope of lies-to-children, narratives, and metaphors that led them up and up until they acquired a degree of unwarranted confidence and a reverence for the bearer of good info. The difficulty of Mount Stupid rests in the fact that you were taken there without the proper tools to get down, and where you could go to get those tools is usually unclear, or at least costly in terms of time and effort.

How accurate does this judgment seem to your own knowledge of Mount Stupid, and further, what tools other than having your gross ignorance exposed have led you downhill toward expertise and humility?

Replies from: kalla724
comment by kalla724 · 2012-01-09T23:07:33.025Z · LW(p) · GW(p)

How accurate does this judgment seem to your own knowledge of Mount Stupid, and further, what tools other than having your gross ignorance exposed have led you downhill toward expertise and humility?

I mostly agree with your analysis above; I would add, however, one very internal factor. People who do not possess significant expertise in a complex area almost always tend to underestimate the complexity of all complex systems. Even if they read about complexity, they rarely get an intuitive feel for it. So reading a few popular books doesn't just introduce the problems you have stated above. Since there is no intuitive understanding of just how complicated things are, a person feels that the little information they have gleaned is sufficient to form an informed opinion on a topic. IMHO, this general problem also stands behind the popularity of many simplistic (and ultimately destructive) ideologies based on simplistic approaches to complex systems (such as, say, communism or libertarianism).

Along those lines, a thing that helped me a lot in this regard was becoming an expert in a complex field. Seeing how very intelligent people form deeply wrong opinions about things I understand made me very, very aware of similar biases in my own thinking about other fields. It didn't cure me of forming such opinions, but it does force me to reexamine them aggressively as soon as I realize their existence within my mind.

comment by Zetetic · 2012-01-04T01:56:59.851Z · LW(p) · GW(p)

I've been on Mount Stupid a lot, maybe enough to be past Mount Stupid's Mount Stupid. I've had a lot of interests that I've developed over my (relatively short) 22 years, and I've been caught standing atop Mount Stupid (by others and by myself) enough that I often feel it in the pit of my stomach - a sort of combination of embarrassment and guilt - when I start shouting from there. Especially if no one corrects me and I realize my mistake myself. The worst is the feeling I get when I've established some authority in someone's eyes and give them wrong information.

The best cure I've found for getting stuck atop Mount Stupid is to start learning a subject that's been the long-time interest of an honest friend (someone whose default mode of communication - at least among good friends - is significantly closer to Crocker's rules than to ordinary conversation). That seems to have really been the most helpful thing for me. If you take up a subject, you get past Mount Stupid a lot quicker when there's someone to send you tumbling off the top (or at least point out your vast ignorance). It also builds up a reflex for noticing your ignorance - you start to know what it feels like to have a shallow understanding of something, and you start to recognize when you're speaking out of your depth. You'll have been conditioned by having been called out in the past. Of course, you're still going to shout from high atop Mount Stupid a lot, but you'll realize what you're doing much more easily.

Caveat: I might still be atop Mount Stupid's Mount Stupid, don't forget that.

comment by EStokes · 2012-01-03T18:59:17.588Z · LW(p) · GW(p)

How could there be more than one true rationality?

Replies from: faul_sname, Dorikka, shminux
comment by faul_sname · 2012-01-03T19:32:41.939Z · LW(p) · GW(p)

While there can't be more than one true rationality, there can be more than one "true rationality".

The LessWrong brand of rationality distinguishes itself mainly by the idea that "rationalists should win." This is a fairly nonstandard definition of rationality, though useful.

comment by Dorikka · 2012-01-03T20:34:06.647Z · LW(p) · GW(p)

Different strategies are unlikely to get you the same result (i.e., the exact same output for a given utility function), but there may be a number of strategies whose outputs are fairly close to each other, such that which strategy one should optimally employ depends heavily on one's background.

Replies from: EStokes
comment by EStokes · 2012-01-03T21:11:42.673Z · LW(p) · GW(p)

It seems to me that there'd be a perfect rationality, and different strategies to produce results as close as possible to what it would produce.

comment by Shmi (shminux) · 2012-01-03T19:00:41.143Z · LW(p) · GW(p)

There can be less.

Replies from: EStokes
comment by EStokes · 2012-01-03T21:09:15.455Z · LW(p) · GW(p)

What do you mean? It seems to me like the "one true rationality" would be the perfect and unbiased strategy that others try to emulate, but I'm not sure why it wouldn't exist.

Replies from: prase, Zetetic, shminux
comment by prase · 2012-01-03T21:55:30.313Z · LW(p) · GW(p)

unbiased

Any cognitive strategy is a bias, sort of. Take Occam's razor as an illustration: if the truth is complicated, starting with an Occamian prior is an obstacle. If the laws of nature were complicated, Occam's razor would be classified among the other cognitive biases. We don't call it a bias because we reserve that word for errors, but it is pretty hard to give a non-circular precise definition of "error".

perfect

Are you sure that it is not the case that for each cognitive strategy there is a better one, for any reasonable metric?

Replies from: fubarobfusco, EStokes
comment by fubarobfusco · 2012-01-04T05:07:09.803Z · LW(p) · GW(p)

Take Occam's razor as an illustration: if the truth is complicated, starting with an Occamian prior is an obstacle. If the laws of nature were complicated, Occam's razor would be classified among the other cognitive biases.

There are more ways to be complicated than there are to be simple. Starting with a complicated prior doesn't necessarily get you closer to the complicated truth than starting with a simple prior does. Even if the truth is complicated, a complicated prior can be wronger than the simple one.

Replies from: prase
comment by prase · 2012-01-04T09:52:28.048Z · LW(p) · GW(p)

Yes, a complicated prior can be wronger than the simple one and usually is. I am sure I haven't disputed that.

Replies from: fubarobfusco
comment by fubarobfusco · 2012-01-04T10:06:07.523Z · LW(p) · GW(p)

Sorry, maybe I misread. The line I quoted above seemed to suggest that "if the laws of nature were complicated," then we would be better off having priors that favored complicated beliefs over simple ones — or at least considered them equal — rather than an Occam prior which favors simple beliefs.

Replies from: prase
comment by prase · 2012-01-04T14:01:06.611Z · LW(p) · GW(p)

I have suggested that we would be better off having priors which favour the exact way in which the laws are complicated. Of course, a general complicated prior wouldn't do the job.

comment by EStokes · 2012-01-03T23:18:01.643Z · LW(p) · GW(p)

It seems to me that there would be priors that are useful, and those that aren't would be biases, and that there would be optimal priors to have.

I don't see why there should be a better strategy for every strategy, either; eventually one would be perfect.

comment by Zetetic · 2012-01-04T01:38:19.481Z · LW(p) · GW(p)

In addition to prase's comment on the possibility of an unbounded chain of strategies (and building off of what I think shminux is saying), I'm also wondering (I'm not sure of this) whether bounded cognitive strategies are totally ordered, i.e. whether for all strategies X and Y, X > Y or Y > X. It seems like lateral moves could exist, given that we need to use bounded strategies - certain biases can only be corrected to a certain degree using feasible methods, and mitigation of biases rests on adopting certain heuristics that are going to be better optimized for some minds than others. Given two strategies A and B that don't result in a Perfect Bayesian, it certainly seems possible to me that EU(Adopt A) = EU(Adopt B) and that A and B dominate all other feasible strategies by making a different set of tradeoffs at equal cost (relative to a Perfect Bayesian).

comment by Shmi (shminux) · 2012-01-03T22:08:17.940Z · LW(p) · GW(p)

"One size fits all" approach rarely works. Like with CDT vs EDT (I will consider the TDT more seriously when it has more useful content than just "do whatever it takes to win"). Eh, seems like I'm still stuck at the summit on this one.

comment by Cthulhoo · 2012-01-05T18:54:59.423Z · LW(p) · GW(p)

For me it's definitely been physics. In high school I read a lot of popular-science material on the subject, and thought I was rather knowledgeable. Then I started to study it at university... and roughly in the middle of my Ph.D. I felt I was ready to discuss it again. I'm probably still in the valley for music: I thought I knew a lot because I knew a lot about rock music, and at the moment I'm slowly trying to fill my enormous gaps in jazz and classical music.

comment by mwengler · 2012-01-05T15:10:42.409Z · LW(p) · GW(p)

I thought I was a great and clear writer with a wealth of ideas until I tried to write a blog.

comment by dvasya · 2012-01-05T05:25:03.464Z · LW(p) · GW(p)

Heh, it feels like I'm always on Mt. Stupid on one topic or several; the actual topics just come and go, only to be replaced with something new. I'm constantly on the summit of a mountain that moves around.

comment by Dorikka · 2012-01-03T20:28:15.838Z · LW(p) · GW(p)

I thought that I was fairly good at typing, since I typed faster than other people that I knew. Then I realized that I used two fingers to cover about half the keyboard, so something was wrong. :/

Replies from: windmil
comment by windmil · 2012-01-03T20:55:18.060Z · LW(p) · GW(p)

I've had a bit of the same thing. I'm much faster than people who hunt for each key, and I don't look at the keyboard anymore, but I'm far from touch-typing. I use about five fingers, and one of them I only use for the letter 'a'.