Monks of Magnitude

post by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T07:48:58.764Z · LW · GW · 50 comments

Sometimes I encounter a concept and it immediately embeds itself in my culture, such that it feels like I always knew it.  This is a short description of such a concept; it was a near-perfect match for a way I already thought about things, and has since become a useful handle that I find myself explaining to others (so that I can subsequently reference it) roughly once a month.


There is a certain story, which I will not name here in order to reduce the spoiler-y nature of the following description.

In that story, there is a monastery, and the monastery is divided into tiers, or levels.

(I may not be precisely representing the story, here, but rather how the thing in the story ended up being recorded by my brain.)

Some monks are 1-day monks.  They come out from their seclusion once a day, and mingle with the regular people, and share their insights, and make new observations, and then retreat back to their private spaces to muse and meditate.

Some monks are 10-day monks.  They are much like the 1-day monks, except they come out only every 10 days.

The 10-day monks tend to think longer thoughts, and wrestle with subtler or more complex problems, than the 1-day monks.  This is treated by the culture of the monastery as natural and correct.  Of course the 10-day monks address a different set of problems; if 10-day monks and 1-day monks were good for the same purposes, the two different Orders wouldn't need to exist.

There are also 100-day monks, who come out only a few times per year.

There are also 1,000-day monks, who come out only every few years.

There are also 10,000-day monks, who come out only every thirty years or so.

There are also 100,000-day monks (the structure of this society has led them to be better at solving problems overall, which has allowed for some advances in longevity tech).

"Come out" may be a bit of a misnomer; in fact, it is the case that the most valuable insights of a given Order tend to be fully comprehensible only to monks of one, mmmmaaaaybe two Orders below them.  So the 100,000-day monks, when they report in, mostly speak only to the 10,000-day monks, who are responsible for distilling and transferring relevant insights to the 1,000-day monks, who are responsible for distilling and transferring relevant insights to the 100-day monks, etc.

(It's also the case, as some readers have pointed out in the comments below, that certain problems cannot be solved by isolated thought alone, and require feedback loops or regular contact with the territory.  For monks working on such problems, it is less that they sequester themselves completely for thousands of days at a time and more that, during those thousands of days, none can make demands of them.)


Duncan-culture works this way.

(By "Duncan-culture," I mean a culture composed entirely of Duncans; a culture made up of people who, whatever their other differences, take for granted everything that I, Duncan Sabien, find intuitively obvious and believe I could convey to a ten-year-old version of me in a few hours' time.  This society lives on a large island a few hours' sailing off the coast of dath ilan.)

If there were indeed 1,000 literal Duncan-copies available, to found a monastery or any other endeavor, they would immediately stratify themselves into 1, 10, 100, and 1,000-day groups at the very least, and probably there would be nonzero 10,000-day Duncans as well.

The key here is that each of these strata focuses on a set of largely non-overlapping issues, with largely non-overlapping assumptions.

To a 1-day or 10-day monk, questions like "maybe this is all a simulation, though" are almost entirely meaningless.  They are fun to ponder at parties, but they aren't relevant to the actual working-out-of-how-things-work.  1-day and 10-day monks take reality as it seems to exist as a given, and are working within it to optimize for what seems good and useful.

But (of course!) we want some people working on 1,000 and 10,000-day problems!  We don't want to miss the fact that this is all just a simulation, if it is in fact a simulation.  And we don't want to be blind to the implications and ramifications of that fact, and fail to take appropriate action.

So some Duncans are off in the ivory tower, questioning the very fabric of reality itself, because what if?

And other Duncans are in between, taking different subsets of things for granted, while questioning others.

And it's fairly important that the 1,000 and 10,000-day monks not be distracted by such trivial concerns and questions like "how do we navigate continued cooperation within small groups after people have messy romantic breakups?"

(Or, well, most of them, anyway.  Some small number of 10,000-day monks may in fact pay very close attention to exactly those dynamics, because those dynamics might contain Secret Subtle Clues As To How Things Really Work.  There is no restriction, aesthetic or social or otherwise, on a higher-Order monk playing around with lower-order concepts to the extent that they find them useful or intriguing or refreshing or what-have-you.)

But for the most part, the 1,000 and 10,000-day monks are simply ... given what they claim to need.  Their food and lodging is provided for; requests for companionship or certain odd materials are simply granted.  The assumption is that most 1,000 and 10,000-day monks will produce nothing of measurable material value (especially not value comprehensible to a 1-day monk); the society as a whole has decided that it is nevertheless Extremely Well Worth It to fund all such monks, in perpetuity, for the once-in-several-lifetimes breakthroughs that only come from people who are willing to dive deeply into the terrifying Unknown.

Meanwhile, the 1, 10, and 100-day monks are busy improving the functioning of society, exploiting the current paradigm (rather than exploring in search of the next one).  It is their labors which produce surplus and bounty and which, in a sense, "fund" the rest of the Orders.

(These distinctions are not clear-cut.  The boundaries are fuzzy.  This is fine; the monastery is sensible.  Overall, though, the higher your Order, the less accountable you are to the bean-counters.  Our current culture does something similar, though more clumsily, via e.g. tenured positions at universities.)


The reason Duncan-culture works this way is that it seems to be healthy, and sane.  A culture with such a monastery, whose insights had repeatedly proven to revolutionize society, resulting in inventions like consistent judicial policy and microwave ovens and international peace treaties and the general theory of relativity, is one that has practiced taking seriously ideas it does not fully comprehend.  It's a culture that expects to sometimes be told "you do not understand why this is important, but it is."  It's a culture that handles delegation via a chain of trust, and which e.g. "believes the science" in a way that does not devolve into mere tribal signaling whereby n95 face masks become a two-way shibboleth.

It's the kind of culture that e.g. would not fail to see global warming or existential risk from artificial intelligence coming, and would not fail to send the message to its elementary schools and universities "hey, we should start moving promising people into place to solve these problems" years or decades in advance of the deadline.

(There are other kinds of cultures that also avoid these failure modes, but they have other drawbacks.)

It's also the kind of culture that ... effortlessly navigates disagreement about what's important?  You don't get criticisms of "ivory-tower nonsense" or "tunnel-visioned mundanity."  People in such a culture understand, on a deep and intuitive level, that some problems are 1000-day problems, and other problems are 1-day problems, and both are important, and both are important in very different ways.


(Just kidding, but in fact I don't have much of a tying-this-up-in-a-neat-narrative-bow conclusion.  I think the concept is useful, and I think at this point you get it.  My only parting recommendations are these: first, try categorizing the problems that catch your attention, and see if you tend to feel more-at-home in a particular Order.  Second, try looking at various LW posts, and various prolific LW authors, and asking the question "if LW were such a monastery, which Order would this person belong to?"  It makes the sometimes-disorienting diversity of LW content suddenly make a lot more sense, at least to me.)

(EDIT: Oh, a third one: "Are we mistakenly judging a 1,000-day monk by standards that only make sense for 10-day monks, or vice-versa?")

50 comments


comment by johnswentworth · 2022-02-18T17:13:16.532Z · LW(p) · GW(p)

The picture this post paints seems to me wildly unrealistic, in a "this is ignoring one of the most taut binding constraints of socioeconomic reality" kind of way.

The overwhelmingly most likely result of a monk going to their private space to muse and meditate for 10000 days is that they end up completely unmoored from reality, and produce nothing of any practical use at all. This is not a matter of "the 10-day monks aren't able to recognize why the 10000-day monk's products are useful". It's a matter of "the 10000-day monk's products are not useful", because that's what happens when someone spends very long periods trying to do something without a feedback signal. It's that lack of feedback signal which is the problem.

Now, in principle this could be circumvented. The 10000-day monks themselves could recognize the lack-of-feedback-signal as a problem, and go looking for useful feedback signals. Unfortunately, the whole "just give all the 10000-day monks whatever resources they need" thing means that there's no incentive or selection pressure to ensure that the 10000-day monks actually do that.

What actually happens in monk-world is that massive amounts of resources are thrown at 10000-day monks who almost-all do absolutely nothing useful. Even if some of them do figure out useful things, there is no signal with which to distinguish the useful-monks from the masses of 10000-day monks producing crap. Even if some 10000-day monk realizes that AGI-related X-risk is coming, there will be thousands of other 10000-day monks warning about thousands of other long-term problems, most of which are in fact completely imaginary because none of these monks has any feedback signal. All the memetic dynamics of our world would still apply. "Delegation via chain of trust" would not work any better in monk-world than it does in our world, because there is no better mechanism to enforce truth in monk-world than in our world; there is no better feedback signal in which to ground trust. The 100000-day monks' work would be even less grounded in reality than all those scientific papers in our world which fail to replicate; after all, the 100000-day monks have no particular incentive to care about their work failing to replicate on a 1-year timescale; they can just say "the reasons are mysterious to you" and their bills will continue to be paid.

The bottleneck to good long-term thinking is not just "we don't throw enough resources at long-term thinking". It's "long-term means bad feedback loops, so we can't easily distinguish which long-term thinking is actually useful". If we were able to solve that problem, I expect resources would follow, but simply throwing resources around will not cause the feedback-loop problem to be solved.

Replies from: gwern, Duncan_Sabien, sharmake-farah
comment by gwern · 2022-02-19T00:52:28.563Z · LW(p) · GW(p)

The author of the story in question takes a fairly similar attitude, specifically noting that, without feedback or external contact, often the 10,000-day monks produce only useless things or their monastery opens up on schedule and everyone is dead (I think sometimes from mass suicide, but it's been a while). On the other hand, you also at least once get monks who have spooky multiverse-walking quantum-immortality abilities and can defeat aliens. I get the impression that it's intended as something a bit analogous to an even more extreme 'intellectual venture capital'; the idea being that the long-term monasteries are long-shots which usually fail, but hitting just once is enough to pay for them all. The abilities of the quantum-immortal monks are self-verifying in the sense that if you hit the tail of the power law, the incidental spin-off capabilities are so impressive that asking is otiose - similar to a VC investing in Facebook or Airbnb or Bitcoin or Stripe; if you did, you don't really need to do a detailed audit to figure out if you turned a profit! (You did!)

So the real point of disagreement, perhaps, is whether even 1 such success is plausible.

Replies from: JacobKopczynski
comment by Czynski (JacobKopczynski) · 2022-06-30T02:39:03.931Z · LW(p) · GW(p)

Small correction: the level which goes insane or dies is the second level down, out of four total; 100 years rather than 1,000.  (Though 100 years is roughly 3x what the 10,000-day mark would equal.)  The weaponized use of Penrose Quantum Mind is devised by the top level, who are seen only once per 1,000 years.  (As it is written: SciFiWritersHaveNoSenseOfScale. Even the particularly clever ones who play around with big ideas and write extremely ingroup-y doorstoppers.)

comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T18:34:47.164Z · LW(p) · GW(p)

Yeah, this was intended to be addressed by:

(It's also the case, as some readers have pointed out in the comments below, that certain problems cannot be solved by isolated thought alone, and require feedback loops or regular contact with the territory. For monks working on such problems, it is less that they sequester themselves completely for thousands of days at a time and more that, during those thousands of days, none can make demands of them.)
 

The claim is not that just giving people resources and letting them think is sufficient; the objections you raise in your last paragraph seem true and correct to me (e.g. there's an incentive thing to be solved, among other problems), but it wasn't so much that I intended the worst possible answer to those objections as that the above essay didn't speak to them at all. =P

(Didn't speak to e.g. how do you select 1,000 and 10,000 and 100,000-day monks, how do you sort them to their respective problems, how do you motivate and evaluate them, etc.)

Replies from: philh
comment by philh · 2022-02-18T23:34:38.837Z · LW(p) · GW(p)

How much work was "culture of Duncans" supposed to be doing?

In particular, I kind of read it as "we can assume (among other things) common knowledge that everyone is basically cooperative with everyone else". So people don't become 10,000 day monks just because they want to be supported while they doss around. Is that intended?

If Duncans don't all basically cooperate with other Duncans, or if it's supposed to be more like "a culture where Duncans have a lot of institutional power but a lot of the population isn't a Duncan", I become a lot more skeptical, while acknowledging that if you do think it would work in that sort of situation, you've probably thought of the same objections I have.

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-20T01:00:09.944Z · LW(p) · GW(p)

A lot of the work was being done by "culture of Duncans."

All of that is double-clickable/expandable, but it's hard to expand all of it for any single essay; that's why so many essays.  =)

comment by Noosphere89 (sharmake-farah) · 2023-10-01T14:57:22.611Z · LW(p) · GW(p)

This, in so many words, was likely one of the biggest factors in why Anthropic/OpenAI/Deepmind were so much more successful than any LW person or group at AI safety until maybe 2021 at the earliest, and even then the lead shifted.

A lot of AI safety proposals before deep learning were basically useless because of the feedback loop issue.

I think this is also connected to ambition issues, but even then lack of feedback loops was way worse for LW than they thought it was.

It's also why longtermism should be bounded in practice, and a very severe bound at that.

Edit: The comment that johnswentworth made is pointing indirectly at a huge problem that affects LW, and why I'm not inclined to treat arguments for doom seriously anymore: there are no feedback loops of any kind, with the exception of the AI companies.

comment by LoganStrohl (BrienneYudkowsky) · 2022-02-18T08:37:24.482Z · LW(p) · GW(p)

Wanted: 

* a practical guide to talking to the order below you
* a practical guide to talking to the order above you
* a list of diagnostic tests to help grantmakers sort applications by order
* a list of funding strategies for different orders
* a list of examples of each order of problem for several different domains
* some advice for navigating life and the world if you are a monk of order n, including likely obstacles and how to approach them
* how to recognize when you are approaching a problem of order n+1 with order n strategies, how to recognize when you are approaching a problem of order n with order n+1 strategies, and some things to try in each case
* probably other things i'll think of later when it's not two and a half hours past my bedtime

Replies from: Spiracular
comment by Spiracular · 2022-02-18T16:03:15.427Z · LW(p) · GW(p)

Just distilling some relevant intuitions:

(This is all me thinking about the problem, and I make no claim that others will align with me on these.)

Grant-making

  • Typically, clusters of applications are assessed on 1-rung-lower time-frames? Some topics get bumped up to same-tier or 1-tier-higher assessment, if assessors feel it advised.
    • ex: 1E4 is scheduled such that one can submit a grant application, do a 1E3 round, and come back to an answer.
      • cont ex: If the grant-making is running late, there are a bunch of 1E2 jobs opened up to help speed finishing it along? (The number of these is tracked, and marked on the 1E3 assessor's record, but this is generally considered better than running late.)
  • Low-order grant-making is mostly fast-turnaround for small amounts of money, and involves a lot of funding-based-on-past-accomplishments. About 1/2 of it gets funded informally by random friends, drawing from built-up social credit.
    • Involves "rant-branch" and "brief-branch" application rounds (about 10x as many are funded through brief-branch). Rant-branch has higher word limits, and slower turn-around.
      • Rant is for "this is a potentially-valuable idea I haven't distilled yet, is it worth putting in additional time to investigate/summarize/distill it?" It's reasonably common for it to get funded at 1 tier higher than the one at which the original grant-seeker lodged it.
  • Your first round of 1E1s is funded by the state. There are also periodic "jubilees" where everyone gets a free 1E1 (solving starter problems).
  • High-order grant-making is slower in turnaround, and includes a lot of questions about order-of-magnitude and scope-of-problem/scope-of-potential-solution.
    • Writing summary sequences or field-wide Review articles is one potential way higher-order people demonstrate legible competence.
    • Some 1E(x) people spend their lives dedicated to distilling ~1-10 1E(x+1) persons for the sake of 1E(x-1) (sometimes called ghostwriters).

Some even-less-ordered thoughts on this:

  • There's probably some kind of rotating board full of scattered pre-structured 1E1 and 1E2 job listings that people have pre-funded, which fast-turnaround people can pull from if they don't have a unique 1E1 or 1E2 idea themselves.
    • (1E0 might be too small for this? Hm.)
    • Similarly, 1E2s can pool together the resources to fund a 1E3 on an issue they find relevant. This has somehow been streamlined, method TBD.
  • Somebody needs to be able to restructure and break a subset of 1E(x) problems into 1E(x-1) and 1E(x-2) jobs.
    • People who do this successfully, should probably gain a fair amount of prestige for it (especially if they break it into a smaller time-block in total, or have enabled substantial parallelization of something time-sensitive)
  • 1E(4)s and higher often develop obnoxiously dense tangles of infralanguages and jargon, as a matter of course. This is treated as normal. 1E(4)s who remain legible to 1E(2)s (especially those who are able to translate other 1E(4)s' work to something similarly legible) are called "bridges," and are appropriately prized.
    • This overlaps some with ghostwriters, but is also its own distinct sub-category of researchers. (It's important that bridges aren't all subordinated into 1E(3) work.)
    • There are occasionally bridge & ghostwriter conferences, which tend to be followed by a tidal wave of applications to write or update various dictionaries and encyclopedias.
    • Similar to the current world: Some fields/jargon-sets/infralanguages that have solved the onboarding problem are widespread enough to have their own conferences, specialized grant ecosystems, assigned ghostwriters, etc.
comment by weft · 2022-02-18T16:02:23.244Z · LW(p) · GW(p)

I felt a lot of internal resistance and push back when reading this. I agree that this is NOT WHAT YOU SAID, but I feel like there is already a lot of memery and pressure to let the long term / Mission folks be social free riders and leeches in every other part of their lives and I don't like it. My brain pattern matched this post into that meme space.

Replies from: Duncan_Sabien, pktechgirl
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T18:37:16.729Z · LW(p) · GW(p)

Hm!  Interesting.  Thank you for the ... bravery? ... of noting the objection to a perceived social pressure out loud.  Never mentioned = never fixed.

comment by Elizabeth (pktechgirl) · 2022-02-22T06:56:12.514Z · LW(p) · GW(p)

I flinch a little at the tone of this (while also finding it extremely understandable) but want to throw my support behind something like "EA and rationalist memes/values lend themselves to free riding/not tracking village-level negative externalities and that has larger costs than are currently tracked".
 

comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T09:58:26.396Z · LW(p) · GW(p)

BTW I claim I am clearly a 10-day monk, according to this system.  Like, I sometimes dabble in 1-day problems and occasionally try my hand at 100-day problems, but the vast majority of my stuff is both a) relevant on the order of 10-day intervals and b) produced with something resembling 10 days of focus.

comment by philh · 2022-02-18T15:59:06.134Z · LW(p) · GW(p)

Second, try looking at various LW posts, and various prolific LW authors, and asking the question “if LW were such a monastery, which Order would this person belong to?”

The answers that came to mind for me were:

  • John Wentworth is a hundred or thousand day monk
  • Scott Alexander is a ten or hundred day monk
  • Raemon is a hundred or thousand day monk
  • Eliezer is a thousand or ten thousand day monk, but most of his published work (the sequences, HPMOR, inadequate equilibria) came from ten or hundred day subproblems

Replies from: Raemon
comment by Raemon · 2022-02-18T19:45:14.802Z · LW(p) · GW(p)

Man I am sort of tickled that I give off that vibe. (I think in practice I'm currently more like a 1 day monk, and in the world where this was more noticed-supported it might make sense for me to be a 10 day monk)

comment by Gunnar_Zarncke · 2022-02-18T15:04:40.387Z · LW(p) · GW(p)

Aha, this is why there are so few commenters on OvercomingBias. Robin Hanson is clearly a 10000+ day monk with unusual productivity and the commenters can not be below 100 day monks - thus so few. This is proven by Bryan Caplan's quote:

When Robin Hanson tells me about his latest research, my standard reaction is 'No way! Impossible!' Then I think about it for years. -- https://en.wikipedia.org/wiki/Robin_Hanson#cite_note-12 

comment by justinpombrio · 2022-02-18T18:21:31.168Z · LW(p) · GW(p)

If you want a description of such a society in book form, it's called:

It might answer some people's questions/concerns about the concept, though possibly it just does so with wishful thinking. It's been a while since I read it.

Replies from: Bojadła
comment by Bojadła · 2022-02-19T19:44:34.999Z · LW(p) · GW(p)

There is a certain story, which I will not name here in order to reduce the spoiler-y nature of the following description.

Yes this is the book. Maybe you could put it into spoiler tags in your comment.

Replies from: justinpombrio
comment by justinpombrio · 2022-02-19T20:04:46.654Z · LW(p) · GW(p)

Yikes I missed that, thank you.

comment by gjm · 2022-02-18T14:44:39.267Z · LW(p) · GW(p)

An apposite quotation from J E Littlewood's "Mathematician's Miscellany" -- not talking about the exact same thing as here, but related:

Speed in creative mathematics. I say there is a speed over 10-20 seconds, 10-20 minutes, 1-2 hours, 1-2 days, weeks, months, years, decades. And people can be in and out in different classes. With a collection of really difficult problems, nothing happens in a year; much happens in 10 years. (This leaves the beginner cold.) [...] At a guess I should say I was (1960) quick over 20 minutes, 6 weeks, 1 year.

(My memory of this is that he originally wrote "the same man", not "people"; either my memory is wrong or the edition I have now, more recent than the one I read first, has been updated in line with modern sensibilities. "The same person" seems like it's actually the right phrase.)

[EDITED to fix a typo.]

comment by Henry Prowbell · 2022-02-18T11:10:30.348Z · LW(p) · GW(p)

I have a sense of niggling confusion.

This immediately came to mind...

"The only way to get a good model of the world inside your head is to bump into the world, to let the light and sound impinge upon your eyes and ears, and let the world carve the details into your world-model. Similarly, the only method I know of for finding actual good plans is to take a bad plan and slam it into the world, to let evidence and the feedback impinge upon your strategy, and let the world tell you where the better ideas are." - Nate Soares, https://mindingourway.com/dive-in-2/

Then I thought something like this...

What about 1,000-day problems that require you to go out and bump up against reality? Problems that require a tight feedback loop?

A 1,000-day monk working on fixing government AI policy probably needs to go for lunch with 100s of politicians, lobbyists and political donors to develop intuitions and practical models about what's really going on in politics.

A 1,000-day monk working on an intelligence-boosting neurofeedback device needs to do 100s of user interviews to understand the complex ways in which the latest version of the device affects its wearers' thought patterns.

And you might answer: 1-day monks do that work and report their findings to the 1,000 day monk. But there's an important way in which being there, having the conversation yourself, taking in all the subtle cues and body language and being able to ask clarifying questions develops intuitions that you won't get from reading summaries of conversations.

Maybe on your island the politicians, lobbyists and political donors are brought to the 1,000-day monk's quarters? But then 'monk' doesn't feel like the right word because they're not intentionally isolating themselves from the outside world at all. In fact, quite the opposite – they're being delivered concentrated outside-world straight to their door every day.

If the 1,000 day problem is maths-based you can bring all the relevant data and apparatus into your cave with you – a whiteboard with numbers on it. But for many difficult problems the apparatus is the outside world.

I think the nth-order monks idea still works, but you can't specify that the monks isolate themselves, or else they would be terrible at solving a certain class of problem – problems that require deep thoughts powered by intuitions developed through bumping into reality over and over again, or that require data which you can only pick out if you've been working on the problem for years.

Replies from: Duncan_Sabien, BrienneYudkowsky
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T13:03:46.001Z · LW(p) · GW(p)

Yes, good point (and thanks).

Perhaps we should change "come out" to "must report in," at least for some subset of 1,000-day monks who do indeed need to continually bump into the territory.

EDIT: this led to an edit in response!  Double thank-you.

comment by LoganStrohl (BrienneYudkowsky) · 2022-02-18T18:05:51.447Z · LW(p) · GW(p)

(this is the same question i was trying to ask in another comment but you did it better.)

comment by LoganStrohl (BrienneYudkowsky) · 2023-12-21T04:47:47.869Z · LW(p) · GW(p)

This post helped me relate to my own work better. I feel less confused about what's going on with the differences between my own working pace and the pace of many around me. I am obviously more like a 10,000 day monk than a 10 day monk, and I should think and plan accordingly. 

Partly because I read this post, I spend fewer resources frantically trying to show off a Marketable Product(TM) as quickly as possible ("How can I make a Unit out of this for the Workshop next month?"), and I spend more resources aiming for the progress I actually think would be valuable ("In the world where I have robustly solved X one year from now, what happened in the intervening twelve months?").

Outside of academia (or perhaps even inside of it, at this point), our society does not really have a place for monks of the larger magnitudes, so it's uncomfortable to try to be one. But if I'm going to try to be one, which I absolutely am, it's awfully helpful to be able to recognize that as what I'm doing. It impacts how I structure my research and writing projects. It impacts how I ask for funding. It impacts how I communicate about priorities and boundaries ("I'm not scheduling meetings this quarter.") 

I plot my largest project on a multi-decade timescale, and although there are reasons I'm concerned about this, "lots of other people don't seem to commit to such things" is no longer among them.

comment by LoganStrohl (BrienneYudkowsky) · 2022-02-18T08:39:40.611Z · LW(p) · GW(p)

Does Duncan culture have people who aren't monks at all in the relevant sense, and if so what does that mean?

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T08:49:17.041Z · LW(p) · GW(p)

Many, though non-monks nevertheless find it handy to think about and talk about their endeavors as 1-day concerns, 10-day concerns, 100-day concerns, etc.  (e.g. most CEOs of large companies think of themselves as needing to be aware of and solve or pre-empt 10,000-day concerns on a regular basis.)

Replies from: BrienneYudkowsky
comment by LoganStrohl (BrienneYudkowsky) · 2022-02-18T08:59:31.615Z · LW(p) · GW(p)

how do you distinguish a 100-day monk from someone who finds it handy to think and talk about their endeavors as 100-day concerns?

Replies from: Cernael
comment by Cernael · 2022-02-18T09:36:03.318Z · LW(p) · GW(p)

My intuition is that the difference between monks and regulars is that monks have a narrower magnitude range. Like, a 10k monk would avoid wasting focus on too many 1- and 10-day problems - compare the stereotype of the aloof genius' ineptitude at dealing with the 0.01-day problems of everyday life - whereas people outside the monastery trade that focus on the problem class for a wider versatility.

comment by Nora_Ammann · 2022-02-18T11:13:11.043Z · LW(p) · GW(p)

Curious what different aspects the "duration of seclusion" is meant to be a proxy for? 

You definitely point at things like "when are they expected to produce intelligible output" and "what sorts of questions appear most relevant to them". Another dimension that came to mind - but I am not sure whether or not you mean to include that in the concept - is something like "how often are they allowed/able to peek directly at the world, relative to the length of periods during which they reason about things in ways that are removed from empirical data"? 

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T13:06:42.560Z · LW(p) · GW(p)

As Henry points out in his comment, certainly at least some 1,000 and 10,000-day monks must need to encounter the territory daily.  I think that for some monks there is probably a restriction to actually not look for the full duration, but for others there are probably more regular contacts.

I think that one thing the duration of seclusion is likely to be a firm proxy for is "length of time between impinging distractions."  Like, there is in fact a way in which most people can have longer, deeper thoughts while hiking on a mountainside with no phone or internet, which is for most people severely curtailed even by having phone or internet for just 20min per day at a set time.

So I think that even if a monk is in regular contact with society, the world, etc., there's something like a very strong protection against other people claiming that the monk owes them time/attention/words/anything.

comment by LoganStrohl (BrienneYudkowsky) · 2022-02-18T08:38:58.942Z · LW(p) · GW(p)

What order am I, and how can you tell I'm that one rather than a different one?

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T08:51:03.022Z · LW(p) · GW(p)

You strike me as being a 1000-day monk who frequently spends time on 100-day problems as fuel for your 1000-day thoughts.

I got this answer based on looking at your concrete investigations, many of which are literally 100-day projects, but feeling that this was "selling you short" when it came to scope and magnitude and patience.

Replies from: BrienneYudkowsky
comment by LoganStrohl (BrienneYudkowsky) · 2022-02-18T09:07:32.067Z · LW(p) · GW(p)

>who frequently spends time on 100-day problems as fuel for your 1000-day thoughts

Ah yes this is the other question I couldn't put my finger on until you said this: What about how engaging with lower-level problems is obviously necessary for making efficient progress on higher-level problems? What about how every order 5 monk should also be a monk of orders 4, 3, 2, and 1?

(I'm more sure about the second question than the first. The first was an incorrect guess about why the second thing.)

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T09:57:10.758Z · LW(p) · GW(p)

Perhaps one must pass through each order in order to reach the order above; it's a small-enough investment of time, after all (only 10% of an ordinary cycle!).

That being said, I don't buy that it's necessarily all that important to engage with lower-level problems, or at least not all sorts?  One can be a long-term deep math thinker and use lots of lower math but not know much at all of how to sort out the laundry.

I think there is no restriction, aesthetic or social or otherwise, on a higher-Order monk playing around with lower-order concepts as they find useful or intriguing or refreshing or what-have-you.

Replies from: Pattern
comment by Pattern · 2022-02-19T06:37:49.252Z · LW(p) · GW(p)

After 100 days a monk emerges having created the best way to cut pancakes.

The rest of the monastery will miss the constant supply.

comment by Elizabeth (pktechgirl) · 2022-02-21T21:45:56.218Z · LW(p) · GW(p)

The following immediately jumped out at me when reading this, but was novel to other people I discussed it with so seems worth writing down:

A major use I see for monk levels is letting two people who disagree about a project clarify their disagreement. Different projects have different feedback loops, resource demands, likelihood of success, impact if successful etc, and those things aren't independent: a project with a very low likelihood of success could make up for it by being cheaper, or having more impact if successful. A project with large risks if done wrong requires better feedback loops than a project where the worst that happens is nothing. 

I see monk levels as shorthand for "this is the kind of project that has feedback loops on delay X, has expected value of magnitude Y, and requires resources of magnitude Z". There will be projects that don't slot neatly into this, but I think these things are correlated enough that the shorthand is useful. 

Once you have this concept, you can stop talking about "does this project have a fast enough feedback loop?" and move to "is this feedback loop in accordance with its demands and risks?" You can clarify if you disagree with someone on whether all projects should have faster feedback loops and project A's are inherently unacceptable, or if they're totally on board with some projects having very delayed feedback loops but think A in particular is making demands that can only be justified with a fast feedback loop. 

None of these trade-offs are impossible to discuss without the monk-level concept, but now that I have it I expect them to be easier to discuss and reason about.
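A minimal sketch of that shorthand in code, purely illustrative (the attribute names, levels, and thresholds below are made up rather than anything specified in the comment above):

```python
from dataclasses import dataclass
from math import log10

@dataclass
class Project:
    name: str
    feedback_delay_days: float  # X: how long before reality tells you anything
    expected_value: float       # Y: rough magnitude of payoff if it works
    resource_demand: float      # Z: rough magnitude of what it asks for

def monk_level(project: Project) -> int:
    """Nearest order of magnitude (in days) of the project's feedback delay."""
    return round(log10(max(project.feedback_delay_days, 1.0)))

def justified(project: Project, max_demand_per_level: dict[int, float]) -> bool:
    """Is the project's resource demand within what its level can justify?"""
    return project.resource_demand <= max_demand_per_level.get(monk_level(project), 0.0)

# Example disagreement: is this slow-feedback project making demands that only a
# fast-feedback project could justify? (The policy thresholds are invented.)
policy = {1: 10.0, 2: 50.0, 3: 200.0, 4: 1000.0}
proj = Project("alignment agenda", feedback_delay_days=1000, expected_value=1e6, resource_demand=500)
print(monk_level(proj), justified(proj, policy))  # -> 3 False
```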

Replies from: Raemon
comment by Raemon · 2022-02-21T22:12:20.227Z · LW(p) · GW(p)

I like this, but feel like monk levels are a bit too abstracted away from the specific questions you point at here. I feel like the monk-levels are good poetry but I'd like to see a post that operationalizes the poetry more.

Replies from: pktechgirl, pktechgirl
comment by Elizabeth (pktechgirl) · 2022-02-23T20:45:49.382Z · LW(p) · GW(p)

The medical system provides a really clear example of this.

The following is a rough summary of credentials that allow you to be a nurse practitioner. It's been a long time and I might have the names of the credentials wrong, but I'm really sure that these were the levels as described to me by a school recruiter:

PhDs in NP do research that is practical compared to e.g. biologists' but not immediately translatable to patient care.
Masters of Science in Nursing (MSNs) do clinical practice.
Doctorates in NP (DNPs) translate between the PhDs' research and clinical practice, including doing their own research on implementation details.

There's interplay between these groups, and also people play at different levels to inform their main levels.  A given DNP might talk to MSNs to learn what problems they have, or they might work one day a week in a clinic.  They have to read enough PhD research and foundational science to understand what's worth translating into practice.

My impression is NP PhDs don't do clinical work nearly as much (although one of the nice things about nursing is flexibility, so probably some do, if only for the money). They're selected for ability to do research and that's how they spend their time, but they do need to be able to talk to DNPs, or their work will never be used.

You see the same continuum in doctors, with a slightly less convenient naming scheme. MDs could be anywhere between 100% clinical practice and 100% advising other people who do medical research while not actually running any studies themselves.   The research done by MDs is attempting to practicalize the more foundational work coming out of biology, which is typically done by professors with what used to be very long time horizons. 

And of course both NPs and MDs are dependent on line-level nurses and certified nursing aides doing the minute-to-minute care and responding when the machines beep. 

comment by Elizabeth (pktechgirl) · 2022-02-21T22:17:15.910Z · LW(p) · GW(p)

I agree and hope that happens. 

comment by Ben Pace (Benito) · 2022-02-19T01:58:53.150Z · LW(p) · GW(p)

This society lives on a large island a few hours' sailing off the coast of Dath Ilan.

Today’s the day we learn Duncan!culture doesn’t respect other cultures’ grammatical choices to not capitalize their own names, not even when they’re close neighbors! :)

Replies from: Pattern
comment by Pattern · 2022-02-19T06:35:34.912Z · LW(p) · GW(p)

10,000 days in a tower may be less than conducive to maintaining grammatical mastery.

comment by Sable · 2022-03-04T10:45:14.492Z · LW(p) · GW(p)

It occurs to me - and this is sort of a nitpick, and sort of a genuine thing I think is worth exploring - that a default of base 10 may not be correct here.

For more granularity, we could try base 2:

1, 2, 4, 8, 16, 32, etc. day problems.

Base 8 may make more sense in general:

1, 8, 64, 512, 4096, etc.

I'd be very interested to see if a different base produced different results, or mapped better to different people's preferences.

Converted (very very roughly) to years:

Base 10:

.003 - immediate, .03 - next week, .3 - near future plan, 3 - near-term life plan, 30 - long-term life plan, 300 - far future/descendants, 3000 - interplanetary, 30000 - intergalactic

Base 8:

.003 - immediate, .02 - next week, .18 - next two months, 1.4 - near-term life plan, 11 - mid-term life plan, 90 - long-term life plan, 718 - far future/descendants, 5700 - interplanetary, 46000 - intergalactic

The latter seems to me to scale better to (at least current) human lifespan.
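
A quick sketch for checking or re-deriving these rough conversions (the particular bases and tier counts below are just illustrative parameters, not anything from the comment itself):

```python
# Rough tier lengths in years for a given base: tier k lasts base**k days.
DAYS_PER_YEAR = 365.25

def tier_lengths_in_years(base: int, tiers: int) -> list[float]:
    """Return the length of each tier (base**0, base**1, ...) converted to years."""
    return [base**k / DAYS_PER_YEAR for k in range(tiers)]

if __name__ == "__main__":
    for base, tiers in [(10, 6), (8, 9), (2, 16)]:
        lengths = ", ".join(f"{years:.3g}" for years in tier_lengths_in_years(base, tiers))
        print(f"base {base:>2}: {lengths}")
```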

comment by Thoroughly Typed · 2022-02-19T22:33:54.322Z · LW(p) · GW(p)

I really like this concept!

Reminds me a bit of Scott's Ars Longa, Vita Brevis.

comment by Gunnar_Zarncke · 2022-02-18T16:25:54.787Z · LW(p) · GW(p)

You take it for granted that everybody in Duncan Culture accepts and understands this, but I think you should at least hint at why and how. 

comment by Gunnar_Zarncke · 2022-02-18T16:24:21.627Z · LW(p) · GW(p)

I initially expected this to be related to the Kegan stages of development - like 1000 day monks needing to be on a higher Kegan stage - but in the end, I think this was a mistake. They are probably orthogonal - or only loosely correlated, right? 

 

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-02-18T18:35:45.392Z · LW(p) · GW(p)

I think likely orthogonal.  I think they only need to be correlated for monks thinking about or working on psychosocial problems, as opposed to e.g. pure math?

comment by Ben (ben-lang) · 2022-02-18T12:18:20.427Z · LW(p) · GW(p)

Nice ideas. It sounds like (without mentioning it directly) you are thinking about publishing pressure in academia. Academics applying for a new job, or for funding that enables them to keep their current job, will typically provide a list of publications they produced. It goes without saying that a list containing entries {A, B, C} must be strictly better than a list containing merely {A, B}. So long lists are good. People whose funding requests/promotion applications fail often claim (to themselves and others) to be 1,000-day monks by your metric. Many of them are right; others have simply been unsuccessful 100-day monks (due to various factors that may include ability but also luck, resources, coworkers etc).

comment by ACrackedPot · 2022-05-27T18:58:50.040Z · LW(p) · GW(p)

I get the impression, reading this and the way you and commenters classify people, that the magnitude of days is to some extent just equivalent to an evaluation of somebody's intellectual ability, and the internal complexity of their thoughts.

So if I said your article "Ruling Out Everything Else" is the 10-day version of a 10000-day idea, you might agree, or you might disagree, but I must observe that if you agree, it will be taken as a kind of intellectual humility, yes?  And as we examine the notion of humility in this context, I think the implication of superiority should be noticed; a 10000-day idea is a superior idea to a 10-day idea.  (Otherwise, what would there be to be humble about?)  And if you felt like it was a more profound article than that, you'd find it somewhat offensive, I think.

...

Except that it is equally plausible that none of that is actually true, and that you're pointing at something else, and this interpretation is just one that wasn't ruled out.  If another equally plausible interpretation is correct: A 10-day monk is wrong more often than a 1-day monk, yes?  A 100-day monk is wrong more often than a 10-day monk?  The number of days matters; when other commenters point out that you need to bounce an idea off reality to avoid being wrong, are they criticizing your point, or glimpsing a piece of it?  Is it accurate to say that a significant piece of the idea represented here is that the number of days is in some sense equivalent to a willingness to be wrong about more?

Replies from: Duncan_Sabien
comment by [DEACTIVATED] Duncan Sabien (Duncan_Sabien) · 2022-05-27T20:25:20.874Z · LW(p) · GW(p)

equivalent to an evaluation of somebody's intellectual ability

No

equivalent to ... the internal complexity of their thoughts

Closer

I think there's going to be a higher minimum bar for higher magnitudes; I think that there are fewer people who can cut it wrestling with e.g. fundamental philosophical questions about the nature of existence (a 100,000+ day question) than there are who can cut it wrestling with e.g. questions of social coordination (a 10-100 day question in many cases).

But I think that there are a very large number of people who could, in principle, qualify to be higher-order monks, who instead apply prodigious intelligence to smaller questions one after the other all the time.

So, like, higher orders will have a higher density of smarter people, but there are ~equally upper-echelon smart people at all levels.

The 10-day version of a 10,000-day idea is an unusually valuable thing; as the old adage goes, "if I had had more time, I would have composed a shorter letter."  Distillations are difficult, especially distillations that preserve all of the crucial elements, rather than sacrificing them.

So to the extent that I might sometimes write 10-day distillations of 10,000-day ideas, this is a pretty high-status claim, actually.  It's preserving the virtues of both orders.

A 10-day monk is wrong more often than a 1-day monk, yes? A 100-day monk is wrong more often than a 10-day monk?

It's more that they are wrong about different things, in systematically different ways.  A 10-day monk is right, about 10-day concerns viewed through 10-day ontologies, about as often as a 1-day monk or a 100-day monk, in their respective domains.

comment by Pattern · 2022-02-19T06:21:12.429Z · LW(p) · GW(p)
For monks working on such problems, it is less that they sequester themselves completely for thousands of days at a time and more that, during those thousands of days, none can make demands of them.)

This initially made it sound like maybe a group of monks would go in together. Or write each other letters.


If there were indeed 1,000 literal Duncan-copies available, to found a monastery or any other endeavor, they would immediately stratify themselves into 1, 10, 100, and 1,000-day groups at the very least, and probably there would be nonzero 10,000-day Duncans as well.
The key here is that each of these strata focuses on a set of largely non-overlapping issues, with largely non-overlapping assumptions.
To a 1-day or 10-day monk, questions like "maybe this is all a simulation, though"

It seems like it would be a simulation for the copies to exist, so, what's the fantasy work the original idea is from?


To a 1-day or 10-day monk, questions like "maybe this is all a simulation, though" are almost entirely meaningless.  They are fun to ponder at parties, but they aren't relevant to the actual working-out-of-how-things-work.  1-day and 10-day monks take reality as it seems to exist as a given, and are working within it to optimize for what seems good and useful.

Why not take a day, see if there's anything to it? Spending more time on something that might yield no results seems to make less sense.


"believes the science"

one would expect the overall design to perhaps change with time.

Doing science is an entirely different thing than believing it. If you disagree, then destroy it. What can stand is better for it.


It's also the kind of culture that ... effortlessly navigates disagreement about what's important?  You don't get criticisms of "ivory-tower nonsense" or "tunnel-visioned mundanity."  People in such a culture understand, on a deep and intuitive level, that some problems are 1000-day problems, and other problems are 1-day problems, and both are important, and both are important in very different ways.

Oh. I think that separating theory and practice can produce nonsense. Also, you're not going to 'fix cancer' after spending forever in a tower. Work is required. It makes more sense as a metaphor than a method. And a better method produces better results.

Overall 'wouldn't it be cool if LW was split into sections' has come up before. I think it's reasonable to say that, to a large extent, LW will be one website that works one way. A different way of thinking - this is pretty useful in that it's easy to implement.