Carrying the Torch: A Response to Anna Salamon by the Guild of the Rose

post by moridinamael · 2022-07-06T14:20:14.847Z · LW · GW · 16 comments


In a recent comment and followup post [LW · GW], Anna Salamon described some of the rocks upon which the Center for Applied Rationality has run aground, and invited anyone interested in the project of improving human rationality to pick up the torch.

We wanted to take this opportunity to remind you that the Guild of the Rose is here to carry that torch onward.

On a personal note, I figured I ought to also explicitly mention that the Guild is, in no small part, the result of many parties from all over the world reading and responding to the bat signal that I shined into the sky in 2018 [LW · GW]. In that post, an alien visitor with a rationality textbook exhorted us to dream a little bigger regarding the potential of the rationality movement. If that post spoke to you, then you ought to know it spoke to a lot of people, and, well, the Guild is what we're doing about it together.

And I think we're doing a pretty good job! Anna outlined a list of problems that she felt CFAR ran into, and I figured this would be a good place to describe how the Guild dealt with each of those problems.

Wait a minute – dealt with those problems? Anna just posted a few weeks ago!

When we started the Guild in 2020, we looked to CFAR as an example both to emulate and to differentiate ourselves from. We diagnosed many of the same problems that Anna describes in her post, though not necessarily in the same framing. We designed our organization to avoid those problems. We are grateful to CFAR for having pioneered this path.

Problem 1: Differentiating effective interventions from unfalsifiable woo.

The Guild focuses on actions and habits, not psychotherapy. I think ~0% of what we teach can be called unfalsifiable woo. Even when we tread more esoteric ground (e.g. the decision theory course) we focus on the practical and implementable.

To sketch a perhaps not-totally-generous metaphor, imagine there are two martial arts schools you're trying to choose between: 

One of these schools is esoteric, focuses on dogmatic cultivation of Warrior Spirit, and demands intensive meditation. This school promises a kind of transcendent self-mastery and hints that ki-blasts are not entirely off the table. 

The other school focuses on punching bags, footwork drills, takedowns, and sparring. This school promises that if you need to throw hands, you'll probably come out of it alive.

I think the vast majority of Human Potential Movement-adjacent organizations look more like the first school. Meanwhile, boring-looking organizations like the Boy Scouts of America, which focus almost entirely on the pragmatic practices of how to tie knots and start fires using sticks, probably succeed more at actually cultivating the "human potential" of their members.

Thus, the Guild focuses on the pragmatic. Our workshops cover effective, "boring" interventions like better nutrition, using your speaking voice more effectively, improving your personal financial organization, emergency preparedness, and implementing a knowledge management system, among many others. There is a new workshop almost every week.

Of course, we also teach what could be considered explicit rationality training. We have workshops focusing on epistemics, and on practical decision theory. But it's our belief that one way to become exceptional is to simply not be below-average at anything important. It is sort of embarrassing to focus extremely hard on "having correct beliefs" while not having a basic survival kit in your car.

Problem 2: Instructors having impure motives, and mistakes amplifying when it comes to rewiring your brain.

We have put some policies in place to mitigate this kind of risk.

We mitigate the risks of rewiring members' brains by not really trying to rewire members' brains, any more deeply than, say, learning partial differential equations rewires your brain. Learning about decision trees, or vocal control, might open your eyes to broader potentialities and unseen solutions to problems in your life, but doesn't undermine the bedrock of your identity.

Additionally, our approach is via a weekly 1-3 hour time commitment. Any "rewiring" will be done very, very slowly and incrementally. I think a lot of the danger of X-rationality techniques comes from jumping directly into the deep end and trying to refactor your whole psyche in a week. This is always a bad idea. We don't advocate trying to do this, and we don't do this in our workshops.

Finally, instructors simply aren't given all that much power or control. The leadership vets the course content before inflicting it upon the Guild at large. The leadership council are a varied and skeptical bunch. We don't allow content or exercises we're not comfortable with. There is also an implicit norm within the leadership that pushes back against what you might think of as a Slytherin approach to life. We've been cautious with content related to social skills, for example, because the potential for bad outcomes is higher.


Problem 3: Insufficient feedback between rationality training and the real world.

As stated above, we focus on smaller, more empirically useful pieces of rationality material, plus practical knowledge and skills. This low-hanging fruit is far from being picked. I sometimes feel that people signing up for a CFAR workshop (or a Tony Robbins seminar, or whatever) believe that they are (metaphorically) already qualified to be UFC fighters and just need some kind of tuning to their mental game to win the championship. In reality, most people in this situation are likely not even above-average at punching and footwork.

Another way of saying this is that maybe your "mental game" is part of the problem, but probably the best way to improve your mental game is to improve your footwork while simultaneously establishing good self-talk. Approximately nobody ever thought themselves into being excellent at anything.

One of the many common Curses of Smart is being hypercompetent at a couple of tricks that you've adopted to be passably effective at life while being incredibly blind to your own limitations and bad habits. The Guild aims to bring all of a member's crucial competencies up to "average." To continue the martial arts metaphor, if you are not at least average in punching, takedowns, etc., then you probably have no business signing up for a cage fight. Further, you're probably wasting your time paying money to learn the Secrets of Inner Energy Harmonics at a mountain retreat.

Anna mentions in this section that CFAR tried to use "AI risk" as a kind of tether to reality, a kind of pragmatic testing ground. In the Guild the pragmatic testing ground is just whatever is going on in your life. Our testing ground is: what to do about a broken car door handle, how much to spend on a haircut, and how to pitch your voice in a Zoom meeting. Your Mandatory Secret Identity [LW · GW] is your life and all the important stuff you've got going on. I hope it is obvious why this is a better reality tether.


Problem 4: Every cause wants to be a cult.

Early on in the game, we looked up the research on what exactly it is about cults that makes them cults, and then installed explicit policies to do the opposite of those things. For example, cults isolate people and encourage them to spend their energy on the cult. Part of the Guild rank-advancement structure encourages members to do service projects that are not connected to the Guild, e.g. service to some other physical or online group they consider themselves part of. Another example: cults are very demanding of members' time, making it difficult for them to maintain separate personal lives. The Guild, as a matter of policy, asks no more than 3 hours per week of members' time, and does not penalize absence. A final less serious example is that for important Council meetings or other serious Guild business, we wear silly hats, to mitigate against taking ourselves too seriously.

We also don't really put much demand on members' belief structure; you will find no mention of X-risk, AI alignment, or egregores in our course materials. I think this helps a bit with not being culty.

Also, "occult" literally means "hidden", and our course materials are freely available and totally transparent, for what that's worth.

I want to make a specific point here, because an uninformed observer might remark that the Guild looks more like a cult than CFAR does, because the Guild is a "group" of which you become a "member," while CFAR is just a school that you attend briefly and then leave. But this broad kind of criticism actually seems to round off to an assertion that "groups/organizations are bad," which few people would seriously defend. In fact, I think "groups/organizations are (often) good" is the more supportable assertion!

People like being part of groups. Groups, with prosocial norms and requirements, are a public service provided to the commons; such groups motivate people to do good and useful things they probably would not otherwise have done. Being part of a group and identifying as a member motivates members to become better at whatever the group puts emphasis on. The Guild rank advancement system has demanding norms and requirements by design. Being a "person who attended a workshop" is not really being part of a group. Also, the Guild provides the Cohort structure, wherein each member is placed into a unit of ~6 people, with whom they primarily interact and work on course materials. Via the Cohorts, the Guild facilitates actual long-term friendship and community, which I would argue is an intrinsic good requiring no further justification beyond itself!

(As an aside, Guild members reliably remark that about half of the value of the Guild comes from the course content, and the other half comes from their relationships with their cohort. Make of this what you will.)

Coda

One takeaway from this post might be that the Guild of the Rose and CFAR are trying to do such different things that you have to wonder if we are even carrying the same torch. I think there is enough overlap regarding the crucial elements of our purpose that we probably are. The alien visitor with the rationality handbook would recognize both CFAR and the Guild of the Rose as worthy instantiations of the vision of Rationality Community, on the way to some shared terminus.

I ask at this point that you rely on the Virtue of Empiricism to validate my claims. First of all, our course material is all freely available and you can audit it at your whim. However, to give us a fair shake, I recommend that you join the Guild of the Rose via a 30-day free trial. I suggest this because the "active ingredient" of the course content is showing up and doing the exercises with your cohort, the small group of comrades you will be sorted into, and with whom you will, very probably, become friends.


16 comments


comment by Valentine · 2022-07-06T23:24:02.679Z · LW(p) · GW(p)

I really like that you're doing this. Bravo. This is a type of approach I struggled to integrate into CFAR in its early days.

(I was a CFAR cofounder, by the way. In case that matters to you and you didn't know.)

A few notes, along the lines of "I imagine you might find something important to you if you were to look where I'm pointing":

  • I think you're missing something important about egregores. I get why you want to avoid that framing, but there's totally a way to approach the relevant parts from within reductionist materialism. Namely via evolutionary memetics. See David Deutsch's chapter on the evolution of culture in his book The Beginning of Infinity for the best public intro I know of. Or my 2020 stuff on "clockwork demons", though that's more "egregore" flavored; you'll probably want to translate that stuff.
  • You seem to be advocating virtues from the era of modernism. Which is great! That's core to LW-style rationality, really. Same thing that kicked off the European Enlightenment. But we're in the postmodern era now. This is important for several reasons:
    • People today tend to resonate more with postmodern themes than modern ones. Superman just isn't interesting anymore unless he has inner conflicts or the "villains" he's fighting actually have a really good point.
    • The kind of pushback you're likely to get will often look postmodern. If you don't really understand postmodernism (because it's the water the mainstream currently swims in, so it's a little tricky to spontaneously see), you might miss the real causes of the objections and confuse them for "woo".
    • Postmodernism does in fact level some meaningful critiques against modernism. If you don't understand those flaws, or if you try to address them with more modernism, you're going to trip over known rocks and possibly get slowly ignored into impotent obscurity — and frankly that would be correct because you wouldn't be reading the room.

Strangely enough, the best intro I've yet found to postmodernism is in a literary analysis of the (original) Harry Potter series — specifically John Granger's. His book Unlocking Harry Potter does a great job in just one chapter by analyzing the silly Disney film "Sky High" for its postmodern narrative structure, pointing out how that narrative has become the standard story of our times.

…with a caveat: Folk seem to be getting tired of everything getting deconstructed. I think we might be at the tail end of the postmodern era. But it'll still be important context for whatever comes next, just as modernism is important context for what's going on today.

comment by moridinamael · 2022-07-08T18:06:20.971Z · LW(p) · GW(p)

Totally get where you're coming from and we appreciate the feedback. I personally regard memetics as an important concept to factor into a big-picture-accurate epistemic framework. The landscape of ideas is dynamic and adversarial. I personally view postmodernism as a specific application of memetics. Or memetics as a generalization of postmodernism, historically speaking. Memetics avoids the infinite regress of postmodernism by not really having an opinion about "truth." Egregores are a decent handle on feedback-loop dynamics of the idea landscape, though I think there are risks to reifying egregores as entities.

My high-level take is that CFAR's approach to rationality training has been epistemics-first and the Guild's approach has been instrumental-first. (Let me know if this doesn't reflect reality from your perspective.) In our general approach, you gradually improve your epistemics in the course of improving your immediate objective circumstances, according to each individual's implicit local wayfinding intuition. In other words, you work on whatever current-you judges to be currently-critical/achievable. This may lead to spending some energy pursuing goals that haven't been rigorously linked up to an epistemically grounded basis, that future-you won't endorse, but at least this way folks are getting in the reps, as it were. It's vastly better than not having a rationality practice at all.

In my role as an art critic, I have recently been noticing how positively people have reacted to stuff like Top Gun: Maverick, a film which is exactly what it appears to be, aggressively surface-level, just executing skillfully on a concept. This sort of thing causes me to directionally agree that the age of meta and irony may be waning. Hard times push people to choose to focus on concrete measurables, which you could probably call "modernist."

comment by Valentine · 2022-07-10T19:18:06.762Z · LW(p) · GW(p)

Great, glad you appreciate it.

I personally regard memetics as an important concept to factor into a big-picture-accurate epistemic framework.

Reassuring to hear. At this point I'm personally quite convinced that attempts to deal with epistemics in a way that ignores memetics are just doomed.

I personally view postmodernism as a specific application of memetics. Or memetics as a generalization of postmodernism, historically speaking.

I find this weird, kind of like saying that medicine is a specific application of physics. It's sort of technically correct, and can be helpful if you're very careful, but seems like it risks missing the boat entirely.

Postmodernism totally is a memeplex, but it's a special kind that almost entirely focuses on shaping the evolutionary terrain for all other memes. Many memes try to do that, but… well, for instance, atheism became possible because of modernism. And cancel culture became possible because of postmodernism. Many memes try to do this terrain thing, but what makes (post)modernism interesting is the depth at which it does it.

I mean, the fear that things might turn into cults comes from postmodernism. And you're subject to that fear such that you had to address it in your OP.

I get the sense that you're pretty aware of these dynamics. I just want to emphasize that while (post)modernism is indeed something like a special case of memetics, I think it deserves some special attention since it's affecting the context in which you're trying to do memetics.

My high-level take is that CFAR's approach to rationality training has been epistemics-first and the Guild's approach has been instrumental-first. (Let me know if this doesn't reflect reality from your perspective.)

Well… mmm… it doesn't quite. It's an… okay-ish first approximation though.

CFAR wanted to focus on epistemic rationality, but no one was interested in practice. We kind of had to sneak our best guesses about epistemic rationality in the back via what amounted to self-help techniques.

"Oh, you have trouble motivating yourself to do extra work? Rather than just jumping in with a hack, let's see if we can explore why you're having trouble. Oh, oops, looks like we just dissolved your whole reason for doing the task in the first place."

Our measures of success weren't really things like whether people started and kept to exercise programs. We were way more interested in whether they were getting clear insights and rearranging their lives in ways that make deep sense. We could never clearly define this but we had the illusion of a shared-ish intuition here.

So in terms of our target, I guess it was kind of epistemics-first?

But I think if we had been really serious about getting epistemics right, we would have done something quite a bit different. A lot of what we did was based on us fitting to the constraints of being (a) entertaining and (b) at least seemingly effective.

In retrospect I think CFAR dramatically failed to take Goodhart nearly seriously enough.

(That, by the way, would be my one main pithy warning to anyone trying to do a CFAR adjacent thing: Nothing you do will matter in the long run if you don't sort out Goodhart drift. Really, truly, I advise taking that super seriously, and not becoming complacent by focusing on your confidence that you've solved it well enough.)

comment by Noosphere89 (sharmake-farah) · 2022-07-07T01:24:04.204Z · LW(p) · GW(p)

Postmodernism's most useful critique of modernism is of the claim that there is a single objective morality, when that's almost certainly not the case. It comes with a cultural-relativism claim that the morality of a culture isn't wrong, just in conflict with your morals. And this is also probably right. What that means is that cultures, as well as individuals, have no objective standard of right and wrong, just their own choices and consequences.

It's also nice that they remind people that your opposition probably does have a point. It can be taken too far, but it is a good guideline given how much we demonize our enemies.

I have serious criticisms to make of postmodern thought, especially in philosophy where they took it fully unfiltered, but it does make some good criticisms of modernism.

comment by Valentine · 2022-07-07T16:07:45.913Z · LW(p) · GW(p)

I have serious criticisms to make of postmodern thought, especially in philosophy where they took it fully unfiltered, but it does make some good criticisms of modernism.

Oh, I have plentiful criticism to level on postmodernism too. The main one being how it's self-referentially inconsistent and uses that in a motte-and-bailey fashion.

I mean, if all truths are relative, is that only true in some contexts? Or is it absolutely true? That has the same logical structure as "This sentence is false."

Likewise with being utterly intolerant of intolerance. So which intolerance shall we allow? Absolutely none? Oops. But surely we can just be smart about it and pick and choose which forms of intolerance are really only directed at intolerance, right? That can't possibly be weaponized in a way that creates division in society! :-/

Postmodernism adds a twist of self-mockery as though to acknowledge this. But that gets taken as a sign of being Truly Humble which frees them of scrutiny of their Grand Narrative that all Grand Narratives are relative and that all evil comes from believing that one of them is absolutely true.

But ha ha don't take this too seriously.

But also Cancel Culture.

:-/

comment by Vanilla_cabs · 2022-07-09T17:46:00.432Z · LW(p) · GW(p)

It comes with a cultural-relativism claim that the morality of a culture isn't wrong, just in conflict with your morals. And this is also probably right.

How can this work? Cultures change. So which is morally right, the culture before the change, or the culture after the change?

I guess a reply could be "Before the change, the culture before the change is right. After the change, the culture after the change is right." But in this view, "being morally right" carries no information. We cannot assess whether a culture deserves to be changed based on this view.

comment by calef · 2022-07-10T01:47:38.090Z · LW(p) · GW(p)

Probably one of the core infohazards of postmodernism is that “moral rightness” doesn’t really exist outside of some framework. Asking about “rightness” of change is kind of a null pointer in the same way self-modifying your own reward centers can’t be straightforwardly phrased in terms of how your reward centers “should” feel about such rewiring.

comment by philip_b (crabman) · 2022-07-06T17:24:22.329Z · LW(p) · GW(p)

I just want to say that your described solution to "Problem 1: Differentiating effective interventions from unfalsifiable woo" suggests to me that your curriculum would be mostly useless for me, and maybe for many other people as well, because it won't go deep enough. I think either I've already gotten everything I can get from shallow interventions "like better nutrition, using your speaking voice more effectively, improving your personal financial organization, emergency preparedness, and implementing a knowledge management system", or they were never that good in the first place. Personally, I am focusing on psychotherapy right now. It's unfortunate that it consists mostly of borderline-unfalsifiable woo but that's all we've got.

comment by moridinamael · 2022-07-06T17:42:16.265Z · LW(p) · GW(p)

You may be right, but I would suggest looking through the full list of workshops and courses. I was merely trying to give an overall sense of the flavor of our approach, not to give an exhaustive list. The Practical Decision-Making course would be an example of content that is distinctly "rationality-training" content. Despite the frequent discussions of abstract decision theory that crop up on LessWrong, practically nobody is actually able to draw up a decision tree for a real-world problem, and it's a valuable skill and mental framework.
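To give a flavor of what I mean by "drawing up a decision tree," here is a toy sketch in Python (my own illustration for this comment, not actual course material). All the numbers are invented; the point is that writing down probabilities and outcomes turns a vague dilemma into a number you can argue with:

```python
# Toy decision tree: repair a car door handle yourself, or pay a shop?
# Leaves hold outcomes (negative dollars = money spent); chance nodes hold
# probability-weighted branches. All values are made up for illustration.

def expected_value(node):
    """Recursively compute the expected value of a decision-tree node."""
    if "value" in node:                      # leaf: known outcome
        return node["value"]
    if node["type"] == "chance":             # chance node: probability-weighted average
        return sum(p * expected_value(child) for p, child in node["branches"])
    if node["type"] == "decision":           # decision node: assume you pick the best branch
        return max(expected_value(child) for _, child in node["branches"])

diy = {"type": "chance", "branches": [
    (0.6, {"value": -20}),                   # 60%: the $20 replacement part works
    (0.4, {"value": -170}),                  # 40%: I break the trim and pay the shop anyway
]}
shop = {"value": -150}                       # just pay the shop up front

decision = {"type": "decision", "branches": [("DIY", diy), ("shop", shop)]}

for label, option in decision["branches"]:
    print(label, expected_value(option))     # DIY -80.0, shop -150
```

The exercise of making the 0.6/0.4 split explicit is usually where the actual insight lives, and it is exactly the step people skip when they reason about such decisions in their heads.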

I would also mention that a big part of the benefit of the cohort is to have "rationality buddies" off whom you can bounce your struggles. Another Curse of Smart is thinking that you need to solve every problem yourself.

comment by frankybegs · 2022-07-08T12:04:45.786Z · LW(p) · GW(p)

Just wanted to say I signed up for a trial on the strength of this pitch, so well done! It sounds like something that could be really useful for me.

comment by SarahNibs (GuySrinivasan) · 2022-07-06T16:44:18.336Z · LW(p) · GW(p)

One of the many common Curses of Smart is being hypercompetent at a couple of tricks that you've adopted to be passably effective at life while being incredibly blind to your own limitations and bad habits.


Just want to drop a note here that this curse (a) got me through years of major depression, making me, I guess, "high-functioning", and (b) caused the worst interpersonal crisis I've had-or-expect-to-have in my life.

For me it wasn't really a trick, per se. Just, more like... being smart enough allows you to simply brute force a bunch of areas without ever being good at them, and it feels good enough because "passably effective at almost anything while concentrating" is legit better than median. The main failure mode when phrased like this, though, should be quite obvious - you can only concentrate on so much. The secondary failure mode is that even when concentrating, if you don't have good heuristics born of experience actually getting good at a thing, your passable effectiveness is brittle even when you think you are concentrating, because it has bad default behaviors in the gaps of things you don't know should be part of your concentration.

(I am not affiliated with any of these orgs. I did attend a pre-CFAR proto workshop thingy.)

comment by moridinamael · 2022-07-06T17:58:35.981Z · LW(p) · GW(p)

This sort of thing is so common that I would go so far as to say it is the norm, rather than the exception. Our proposed antidote to this class of problem is to attend the monthly Level Up Sessions and to make a habit of regularly taking inventory of the bugs (problems and inefficiencies) in your day-to-day life, then selectively solving the most crucial ones. This approach starts from the mundane and gradually builds up your environment and habits, until you're no longer relying entirely on your "tricks."

comment by RobertM (T3t) · 2022-07-06T19:13:36.047Z · LW(p) · GW(p)

Mod note: I've decided to mark this as a Personal post, since we generally don't frontpage organizational announcements and it feels a bit like a sales pitch.  In the future I'd also be interested in reading about what you've learned as an organization about e.g. teaching epistemics, which would better meet Frontpage guidelines ("aim to explain, rather than persuade").

comment by Raemon · 2022-07-06T19:15:08.293Z · LW(p) · GW(p)

(Meta-Mod Note: RobertM recently joined the LessWrong team)

comment by ChrisHibbert · 2022-07-07T15:19:55.627Z · LW(p) · GW(p)

Silly hats are commonly associated with some cults and secret societies, so that's not particularly a mark in your favor. "not taking yourselves too seriously" is a plus, but neither dress code nor anti-dress code will get you there.

comment by moridinamael · 2022-07-07T17:39:18.320Z · LW(p) · GW(p)

To be clear ... it's random silly hats, whatever hats we happen to have on hand. Not identical silly hats. Also this is not really a load-bearing element of our strategy. =)