CFAR Participant Handbook now available to all 2020-01-03T15:43:44.618Z
Willing to share some words that changed your beliefs/behavior? 2019-03-23T02:08:37.437Z
In My Culture 2019-03-07T07:22:42.982Z
Double Crux — A Strategy for Resolving Disagreement 2017-01-02T04:37:25.683Z


Comment by duncan_sabien on Words and Implications · 2020-10-02T07:58:08.527Z · LW · GW

This largely rings true to me but is missing one (in my opinion) absolutely crucial caveat/complication:

Most people (including, as experience has repeatedly confirmed, the vast majority of rationalists/LWers) will get "Ask what physical process generated the words. Where did they come from? Why these particular words at this particular time?" wrong, by virtue of being far too confident in the first answer that their stereotype centers generate, and not accounting for other-people's-minds-and-culture-being-quite-different-from-their-own.

Comment by duncan_sabien on CFAR Participant Handbook now available to all · 2020-01-14T23:00:56.729Z · LW · GW

As for the Understanding Shoulds section, that's another example of the document being tailor-made for a specific target audience; most people are indeed "taking far too seriously" their "utterly useless shoulds," but the CFAR workshop audience was largely one pendulum swing ahead of that state, and needing the next round of iterated advice.

Comment by duncan_sabien on CFAR Participant Handbook now available to all · 2020-01-14T22:59:37.834Z · LW · GW

Emailing CFAR is the best way to find out; previously the question wasn't considered in depth because "well, we're not selling it, and we're also not sharing it." Now, the state is "well, they're not selling it, but they are sharing it," so it's unclear.

(Things like the XKCD comic being uncited came about because in context, something like 95% of participants recognized XKCD immediately and the other 5% were told in person when lecturers said stuff like "If you'll look at the XKCD comic on page whatever..." In other words, it was treated much more like an internal handout shared among a narrowly selected, high-context group, than as a product that needed to dot all of the i's and cross all of the t's. I agree that Randall Munroe deserves credit for his work, and that future edits would likely correct things like that.)

Comment by duncan_sabien on CFAR Participant Handbook now available to all · 2020-01-14T19:19:41.197Z · LW · GW

Emailing people at CFAR directly is the best way to find out, I think (I dunno how many of them are checking this thread).

Comment by duncan_sabien on CFAR Participant Handbook now available to all · 2020-01-08T19:27:07.698Z · LW · GW

Note that this handbook covers maybe only about 2/3 of the progress made in that private beta branch, with the remaining third divided into "happened while I was there but hasn't been written up (hopefully 'yet')" and "happened since my departure, and unclear whether anyone will have the time and priority to export it."

Comment by duncan_sabien on CFAR Participant Handbook now available to all · 2020-01-04T05:29:03.125Z · LW · GW

I don't know the answer; the team made their decision and then checked to see if I was okay with it; I wasn't a part of any deliberations or discussions.

Comment by duncan_sabien on Meta-discussion from "Circling as Cousin to Rationality" · 2020-01-04T01:44:19.783Z · LW · GW

What I meant by the word "our" was "the broader context culture-at-large," not Less Wrong or my own personal home culture or anything like that. Apologies, that could've been clearer.

I think there's another point on the spectrum (plane?) that's neither "overt anti-intellectualism" nor "It seems to me that engaging with you will be unproductive and I should disengage." That point being something like, "It's reasonable and justified to conclude that this questioning isn't going to be productive to the overall goal of the discussion, and is either motivated-by or will-result-in some other effect entirely."

Something stronger than "I'm disengaging according to my own boundaries" and more like "this is subtly but significantly transgressive, by abusing structures that are in place for epistemic inquiry."

If the term "sealioning" is too tainted by connotation to serve, then it's clearly the wrong word to use; TIL. But I disagree that we don't need or shouldn't have any short, simple handle in this concept space; it still seems useful to me to be able to label the hypothesis without (as Oliver did) having to write words and words and words and words. The analogy to the usefulness of the term "witchhunt" was carefully chosen; it's the sort of thing that's hard to see at first, and once you've put forth the effort to see it, it's worth ... idk, caching or something?

Comment by duncan_sabien on Meta-discussion from "Circling as Cousin to Rationality" · 2020-01-03T18:12:30.830Z · LW · GW

I agree that you've said this multiple times, in multiple places; I wanted you to be able to say it shortly and simply. To be able to do something analogous to saying "from where I'm currently standing, this looks to me like a witchhunt" rather than having to spell out, in many different sentences, what a witchhunt is and why it's bad and how this situation resembles that one.

My caveats and hedges were mainly not wanting to be seen as putting words in your mouth, or presupposing your endorsement of the particular short sentence I proposed.

Comment by duncan_sabien on Meta-discussion from "Circling as Cousin to Rationality" · 2020-01-03T16:39:55.792Z · LW · GW

I note that we, as a culture, have reified a term for this, which is "sealioning."

Naming the problem is not solving the problem; sticking a label on something is not the same as winning an argument; the tricky part is in determining which commentary is reasonably described by the term and which isn't (and which is controversial, or costly-but-useful, and so forth).

But as I read through this whole comment chain, I noticed that I kept wanting Oliver to be able to say the short, simple sentence:

"My past experience has led me to have a prior that threads from you beginning like this turn out to be sealioning way more often than similar threads from other people."

Note that that's my model of Oliver; the real Oliver has not actually expressed that [edit: exact] sentiment [edit: in those exact words] and may have critical disagreements with my model of him, or critical caveats regarding the use of the term.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-22T05:31:28.080Z · LW · GW

(I expect the answer to 2 will still be the same from your perspective, after reading this comment, but I just wanted to point out that not all influences of a CFAR staff member cash out in things-visible-in-the-workshop; the part of my FB post that you describe as 2 was about strategy and research and internal culture as much as workshop content and execution. I'm sort of sad that multiple answers have had a slant that implies "Duncan only mattered at workshops/Duncan leaving only threatened to negatively impact workshops.")

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-22T04:12:03.743Z · LW · GW

On reading Anna's above answer (which seems true to me, and also satisfies a lot of the curiosity I was experiencing, in a good way), I noted a feeling of something like "reading this, the median LWer will conclude that my contribution was primarily just ops-y and logistical, and the main thing that was at threat when I left was that the machine surrounding the intellectual work would get rusty."

It seems worth noting that my model of CFAR (subject to disagreement from actual CFAR) is viewing that stuff as a domain of study, in and of itself—how groups cooperate and function, what makes up things like legibility and integrity, what sorts of worldview clashes are behind e.g. people who think it's valuable to be on time and people who think punctuality is no big deal, etc.

But this is not necessarily something super salient in the median LWer's model of CFAR, and so I imagine the median LWer thinking that Anna's comment means my contributions weren't intellectual or philosophical or relevant to ongoing rationality development, even though I think Anna-and-CFAR did indeed view me as contributing there, too (and thus the above is also saying something like "it turned out Duncan's disappearance didn't scuttle those threads of investigation").

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-22T03:32:44.879Z · LW · GW

In general, if you don't understand what someone is saying, it's better to ask "what do you mean?" than to say "are you saying [unrelated thing that does not at all emerge from what they said]??" with double punctuation.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T19:39:58.346Z · LW · GW

They do. The distinction seems to me to be something like endorsement of a "counting up" strategy/perspective versus endorsement of a "counting down" one, or reasonable disagreement about which parts of the dog food are actually beneficial to eat at what times versus which ones are Goodharting or theater or low payoff or what have you.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T19:23:05.926Z · LW · GW

I'm not saying that, either.

I request that you stop jumping to wild conclusions and putting words in people's mouths, and focus on what they are actually saying.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T10:43:19.666Z · LW · GW

That's not the question that was asked, so ... no.

Edit: more helpfully, I found it valuable for thinking about rationality and thinking about CFAR from a strategic perspective—what it was, what it should be, what problems it was up against, how it interfaced with the rest of society.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T10:33:42.484Z · LW · GW

I'm reading the replies of current CFAR staff with great interest (I'm a former staff member who ended work in October 2018), as my own experience within the org was "not really; to some extent yes, in a fluid and informal way, but I rarely see us sitting down with pen and paper to do explicit goal factoring or formal double crux, and there's reasonable disagreement about whether that's good, bad, or neutral."

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T10:10:16.311Z · LW · GW

Historically, CFAR had the following concerns (I haven't worked there since Oct 2018, so their thinking may have changed since then; if a current staff member gets around to answering this question you should consider their answer to trump this one):

  • The handbook material doesn't actually "work" in the sense that it can change lives; the workshop experience is crucial to what limited success CFAR *is* able to have, and there's concern about falsely offering hope.
  • There is such a thing as idea inoculation; the handbook isn't perfect and certainly can't adjust itself to every individual person's experience and cognitive style. If someone gets a weaker, broken, or uncanny-valley version of a rationality technique out of a book, not only may it fail to help them in any way, but it will also make subsequently learning [a real and useful skill that's nearby in concept space] correspondingly more difficult, both via conscious dismissiveness and unconscious rounding-off.
  • To the extent that certain ideas or techniques only work in concert or as a gestalt, putting the document out on the broader internet where it will be chopped up and rearranged and quoted in chunks and riffed off of and likely misinterpreted, etc., might be worse than not putting it out at all.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T10:04:20.470Z · LW · GW

[Disclaimer: have not been at CFAR since October 2018; if someone currently from the org contradicts this, their statement will be more accurate about present-day CFAR]

No (CFAR's mission has always been narrower/more targeted) and no (not in any systematic, competent fashion).

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T10:01:26.878Z · LW · GW

In case no one who currently works at CFAR gets around to answering this (I was there from Oct 2015 to Oct 2018 in a pretty influential role but that means I haven't been around for about fourteen months):

  • Meditations on Moloch is top of the list by a factor of perhaps four
  • Different Worlds as a runner-up

Lots of social dynamic stuff/how groups work/how individuals move within groups:

  • Social Justice and Words, Words, Words
  • I Can Tolerate Anything Except The Outgroup
  • Guided By The Beauty Of Our Weapons
  • Yes, We Have Noticed The Skulls
  • Book Review: Surfing Uncertainty

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T09:50:51.277Z · LW · GW

I'd be curious for an answer to this one too, actually.

Comment by duncan_sabien on We run the Center for Applied Rationality, AMA · 2019-12-21T09:45:07.272Z · LW · GW
that CFAR's natural antibodies weren't kicking against it hard.

Some of them were. This was a point of contention in internal culture discussions for quite a while.

(I am not currently a CFAR staff member, and cannot speak to any of the org's goals or development since roughly October 2018, but I can speak with authority about things that took place from October 2015 up until my departure at that time.)

Comment by duncan_sabien on CFAR: Progress Report & Future Plans · 2019-12-19T07:22:31.755Z · LW · GW

This is a good guess on priors, but in my experience (Oct 2015 - Oct 2018, including taking over the role of a previous burnout, and also leaving fairly burnt), it has little to do with ops capacity or ops overload.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-13T05:47:53.251Z · LW · GW

Yeah, if I had the comment to rewrite (I prefer not to edit it at this point) I would say "My whole objection is that Gordon wasn't bothering to (and at this point in the exchange I have a hypothesis that it's reflective of not being able to, though that hypothesis comes from gut-level systems and is wrong-until-proven-right as opposed to, like, a confident prior)."

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-13T05:44:19.954Z · LW · GW

I'm not sure I'm exactly responding to what you want me to respond to, but:

It seems to me that a declaration like "I think this is true of other people in spite of their claims to the contrary; I'm not even sure if I could justify why? But for right now, that's just the state of what's in my head"

is not objectionable/doesn't trigger the alarm I was trying to raise. Because even though it fails to offer cruxes or detail, it at least signals that it's not A STATEMENT ABOUT THE TRUE STATE OF THE UNIVERSE, or something? Like, it's self-aware about being a belief that may or may not match reality?

Which makes me re-evaluate my response to Gordon's OP and admit that I could have probably offered the word "think" something like 20% more charity, on the same grounds, though on net I still am glad that I spelled out the objection in public (like, the objection now seems to me to apply a little less, but not all the way down to "oops, the objection was fundamentally inappropriate").

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-12T08:34:55.065Z · LW · GW

I find it morally abhorrent because, when not justified and made-cruxy (i.e. when done the only way I've ever seen Gordon do it), it's tantamount to trying to erase another person/another person's experience, and (as noted in my first objection) it often leads, in practice, to socially manipulative dismissiveness and marginalization that's not backed by reality.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-12T08:32:46.114Z · LW · GW
when and only when it is in fact the case that I know better than those other people what's going on in their heads (in accordance with the Litany of Tarski).

Yes, as clearly noted in my original objection, there is absolutely a time and a place for this, and a way to do it right; I too share this tool when able and willing to justify it. It's only suspicious when people throw it out solely on the strength of their own dubious authority. My whole objection is that Gordon wasn't bothering to (I believe as a cover for not being able to).

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-12T00:42:13.684Z · LW · GW

Oh, one last footnote: at no point did I consider the other conversation private, at no point did I request that it be kept private, and at no point did Gordon ask if he could reference it (to which I would have said "of course you can"). i.e. it's not out of respect for my preferences that that information is not being brought in this thread.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-12T00:21:46.179Z · LW · GW

I note for the record that the above is strong evidence that Gordon was not just throwing an offhand turn of phrase in his original post; he does and will regularly decide that he knows better than other people what's going on in those other people's heads. The thing I was worried about, and attempting to shine a light on, was not in my imagination; it's a move that Gordon endorses, on reflection, and it's the sort of thing that, historically, made the broader culture take forever to recognize e.g. the existence of people without visual imagery, or the existence of episodics, or the existence of bisexuals, or any number of other human experiences that are marginalized by confident projection.

I'm comfortable with just leaving the conversation at "he, I, and LessWrong as a community are all on the same page about the fact that Gordon endorses making this mental move." Personally, I find it unjustifiable and morally abhorrent. Gordon clearly does not. Maybe that's the crux.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-11T22:05:19.224Z · LW · GW

I explicitly reject Gordon's assertions about my intentions as false, and ask (ASK, not demand) that he justify (i.e. offer cruxes) or withdraw them.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-11T21:07:21.585Z · LW · GW

I can have different agendas and follow different norms on different platforms. Just saying. If I were trying to do the exact same thing in this thread as I am in the FB thread, they would have the same words, instead of different words.

(The original objection *does* contain the same words, but Gordon took the conversation in meaningfully different directions on the two different platforms.)

I note that above, Gordon is engaging in *exactly* the same behavior that I was trying to shine a spotlight on (claiming to understand my intent better than I do myself/holding to his model that I intend X despite my direct claims to the contrary).

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-11T15:28:38.229Z · LW · GW

"Here are some things you're welcome to do, except if you do them I will label them as something else and disagree with them."

Your claim that you had tentative conclusions that you were willing to update away from is starting to seem like lip service.

I am currently reading your intent as not one of informing me of that you think there is a norm that should be enforced

Literally my first response to you centers around the phrase "I think it's a good and common standard to be skeptical of (and even hostile toward) such claims." That's me saying "I think there's a norm here that it's good to follow," along with detail and nuance à la here's when it's good not to follow it.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-11T07:12:50.208Z · LW · GW

There's a world of difference between someone saying "[I think it would be better if you] cut it out because I said so" and someone saying "[I think it would be better if you] cut it out because what you're doing is bad for reasons X, Y, and Z." I didn't bother to spell out that context because it was plainly evident in the posts prior. Clearly I don't have any authority beyond the ability to speak; to

claim or argue that I am dangerous in some way

IS what I was doing, and all I was doing.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-11T04:01:38.675Z · LW · GW

This response missed my crux.

What I'm objecting to isn't the shortform, but the fundamental presumptuousness inherent in declaring that you know better than everyone else what they're experiencing, *particularly* in the context of spirituality, where you self-describe as more advanced than most people.

To take a group of people (LWers) who largely say "nah, that stuff you're on is sketchy and fake" and say "aha, actually, I secretly know that you're in my domain of expertise and don't even know it!" is a recipe for all sorts of bad stuff. Like, "not only am I *not* on some sketchy fake stuff, I'm actually superior to my naysayers by very virtue of the fact that they don't recognize what I'm pointing at! Their very objection is evidence that I see more clearly than they do!"

I'm pouring a lot into your words, but the point isn't that your words carried all that so much as that they COULD carry all that, in a motte-and-bailey sort of way. The way you're saying stuff opens the door to abuse, both social and epistemic. My objection wasn't actually a call for you to give more explanation. It was me saying "cut it out," while at the same time acknowledging that one COULD, in principle, make the same claim in a justified fashion, if they cared to.

Comment by duncan_sabien on G Gordon Worley III's Shortform · 2019-09-11T01:21:09.900Z · LW · GW
Only, I don't think you're not having it, you just don't realize you are having those experiences.

The mentality that lies behind a statement like that seems to me to be pretty dangerous. This is isomorphic to "I know better than other people what's going on in those other people's heads; I am smarter/wiser/more observant/more honest."

Sometimes that's *true.* Let's not forget that. Sometimes you *are* the most perceptive one in the room.

But I think it's a good and common standard to be skeptical of (and even hostile toward) such claims (because such claims routinely lead to unjustified and not-backed-by-reality dismissal and belittlement and marginalization of the "blind" by the "seer"), unless they come along with concrete justification:

  • Here are the observations that led me to claim that all people do in fact experience X, in direct contradiction of individuals claiming otherwise; here's why I think I'm correct to ignore/erase those people's experience.
  • Here are my causal explanations of why and how people would become blindspotted on X, so that it's not just a blanket assertion and so that people can attempt to falsify my model.
  • Here are my cruxes surrounding X; here's what would cause me to update that I was incorrect in the conclusions I was reaching about what's going on in other people's heads

... etc.

Comment by duncan_sabien on Willing to share some words that changed your beliefs/behavior? · 2019-03-25T16:59:13.415Z · LW · GW

I like this breakdown a lot. I agree this particular methodology fails to distinguish those differences; this is more a first-glance 80/20 to determine whether a better and more rigorous investigation is even warranted.

Comment by duncan_sabien on In My Culture · 2019-03-12T16:27:17.115Z · LW · GW

I want to think further and also want to answer you now, so: knee-jerk response without too much thought is something like "there's a class of cultural values that this framing is insufficient to help you talk about, but it feels to me like a piece of the puzzle that lets you bridge the gap."

i.e. I agree there are ways this can be counterproductive for whole categories of important communication. But I'd probably route through this thing anyway, given my current state of knowledge?

Would not be surprised to find myself talked out of this viewpoint.

Comment by duncan_sabien on In My Culture · 2019-03-12T00:03:16.496Z · LW · GW

This makes sense to me.

Similar caveats as Ray's re: this is more fraught, since here I am trying to describe my observations of the context culture, as opposed to things I'm relatively sure about because they live inside my head. These are not normative statements/shoulds, they're just "in my experience"s.

it's necessary for some kind of social move in the space of "I think you're more confused or blind-spotted than you realize", at least some times.

Strong agree. It seems to me that the additional bit that makes this prosocial instead of a weapon is something like:

I notice that I've got a hypothesis forming, that you're more confused or blind-spotted than you realize. I started to form this hypothesis when I saw X, Y, and Z, which I interpreted to mean A, B, and C. This hypothesis causes me to predict that, if I hadn't said anything, you would've responded to M with N, which would've been miscalibrated for reasons 1 and 2. If I saw you doing G, I would definitely update away from this hypothesis, and certainly G is not the only thing that would shift me. I want to now be open to hearing your response or counterargument; this is not a mic drop.

... where the two key pieces of the above are:

1) distinguishing between a hypothesis and a fact, or between a claim and an assertion. It seems non-rude and at least possibly non-aggressive/non-invalidating/non-weaponized to say "I'm considering [your blindness/biased-ness] among many possibilities," whereas it seems pretty much guaranteed to be taken-as-rude or taken-as-an-attempt-to-delegitimize to just flatly state "Yeah, you're [blind/biased]."

2) creating surface area/showing the gears of your hypothesis/sticking your neck out and making what you've said falsifiable. There are hints of cruxes not only in G, but also in X, Y, and Z, which someone may convincingly argue you misunderstood or misinterpreted or misremembered.

In the swath of the EA/rationalist community that I have the most exposure to (i.e. among the hundred or so Berkelanders that I've interacted with in the past year) the social move of having a hypothesis is one that is acceptable when used with clear care and respect, and the social move of claiming to know is one that is frowned upon. In other words, I've seen people band together in rejection of the latter, and I've heard many different people on many different occasions say things like my fake quote paragraph above.

This also seems to me to be correct, and is part of what I came here for (where "here" is the rationalist community). I notice that my expectation of such (in swathes of the community where that is not the norm) has gotten me into fights, in the past.

Comment by duncan_sabien on In My Culture · 2019-03-11T05:41:13.862Z · LW · GW

(To put this another way: it seems like you missed an important part of the thesis of the piece*, which is that there are no interactions between two people with the exact same culture. While it is in fact the case that some people work differently (e.g. Scott's discussion of high-trust vs. low-trust cultures) and will reliably hear you to be making claims about the context culture if you're not extremely exact, and therefore it's important to be clear and careful and say a few more words to delineate your claims about the context culture from your claims about your own personal sense of what-is-ideal ...

... while it seems true that you should take that into account, on a practical level, it seems that if you have done all that work, and someone reacts hostilely to you as if you are making some other claim ...

... as far as I can tell, in the Berkeley rationalist context culture, the one that most of us agree upon so we can get along with each other, the person who sort of ... refused to believe that I meant what I said? ... is the one who's doing something hostile.

Or at least, it seems to me that there's a principle of "don't claim you understand better than others what's going on in their heads" in the shared context of people you and I hang out with. But maybe I'm mistaken? Maybe this is not the case, and in fact that is just another piece of my personal culture?

*or you didn't miss it yourself, but you're pointing out that it's subtle and therefore it gets missed in practice a lot

Comment by duncan_sabien on In My Culture · 2019-03-11T05:27:23.920Z · LW · GW

Did you see Zeit Polizei's comment above? That was super productive for me, on this axis. For instance, taking into consideration (both before and after attempting to make this move) the degree to which the other person's culture is one that leans toward uncharitable or defensive interpretations of what the other person was saying.

Also, it seems in your description of people getting heated that there's no clear distinction being made between claims about one's personal culture and claims about the context culture—the "I understand this community better than you" is triggerable by this tool if you're not careful, but it's not actually the claim I'm making if I say "in my culture."

Comment by duncan_sabien on In My Culture · 2019-03-11T05:25:44.381Z · LW · GW

Oooooh, I like this a lot. In particular, this resolves for me a bit of tension about why I liked the above comment and also disagreed with it—you've helped me split those reactions out into two different buckets. Seems relevant to common-knowledge-type stacks as well.

Comment by duncan_sabien on In My Culture · 2019-03-10T21:07:37.965Z · LW · GW

Strong appreciation for this comment/strong endorsement of the warnings it provides. However, I do nevertheless continue to think it's well-suited to important topics, having seen it productively used on important topics in my own experience.

Comment by duncan_sabien on In My Culture · 2019-03-10T00:00:14.705Z · LW · GW
it's being used in ways that manipulate, reject, oppress, suppress, override, and imply "rightness" over other people who might be unsuspecting to the method

This reads to me as an assertion in need of justification, via a more detailed description of the underlying causal model, or analogies to known phenomena, or illustrative case-study style examples, or something. Another way to say this is "I hear what you believe, but I do not hear why you believe what you believe."

Comment by duncan_sabien on In My Culture · 2019-03-09T23:53:55.563Z · LW · GW

I don't want to engage with most of the above, but one small note on my personal impression of the context culture of LW.

It seems to me that:

  • not demonstrating healthy agency
  • not delineating healthy from unhealthy agency
  • not couched in sensitivity

are each claims that themselves require some form of justification. i.e. sure, it makes sense that you would say "because of A, B, and C, I conclude this is bad," and I expect that most LWers would agree about the logic of that if-then.

But I also expect that most LWers would not find your three premises obviously true, and would therefore receive them as un- or underjustified assertions, and (given the local norms) expect you to include, from the beginning, more details of your underlying world model (and certainly expect you to be willing to expound upon them if asked, as I and Benito and Pattern have all asked).

(I note that in your response to Pattern you use a larger number of synonyms to repeat "X is bad" but don't actually explain why or how with e.g. claims about causality that can be investigated, or analogies to known phenomena whose aptness or inaptness can be discussed, or illustrative examples that others will find evocative, or anything like that.)

Comment by duncan_sabien on In My Culture · 2019-03-08T22:14:19.274Z · LW · GW


Comment by duncan_sabien on In My Culture · 2019-03-08T06:26:37.106Z · LW · GW

(I note for onlookers that I found this tool, as opposed to designing it myself, although my descriptions of it, and the way I made meaning out of my observations, are entirely self-generated and no one else's fault.)

Comment by duncan_sabien on In My Culture · 2019-03-07T23:15:32.284Z · LW · GW

My closest answer would be something like "in my version of utopia," although maybe that's too strong? Or perhaps (depending on how nerdy the group is) something like "if I were having this meeting with five clones of me..."?

Another clunkier version is just to port over the WHOLE concept, of not only personal culture but also context culture: "I mean, there's a sort of thing where we kind of have norms and customs about how to communicate, and maybe they're a little different from what any individual wants or would do, and that's good, I'm not trying to say the group norms should exactly match my individual preferences, but like in my own little one-person culture, X, and I imagine maybe some people here didn't know that."

Comment by duncan_sabien on In My Culture · 2019-03-07T22:25:04.740Z · LW · GW

I do think that "in my mind" and "my initial reaction" gets a lot of the value. I'm curious if you ever run into people who are uncertain whether you mean "Dagon is expressing a personal thought?" or "Dagon is making a bid to change our broader conversational API"?

For me, that was the biggest thing that I got, once my colleague started doing this—the distinction between their culture and their bids to change the norms.

Comment by duncan_sabien on In My Culture · 2019-03-07T22:22:05.048Z · LW · GW

In case it wasn't clear from the bit at the end: I am deeply interested in other people offering expressions of elements of their culture via writing comments here or on Medium or FB, to the extent that that feels like a fun or interesting or valuable thing to do.

Comment by duncan_sabien on In My Culture · 2019-03-07T19:24:44.551Z · LW · GW

I agree that "in my culture" works if and only if there is *also* a common-knowledge understanding that we're in the metaphorical diplomatic setting and that it's not a bid for changing that diplomatic setting's context culture. I also agree that a lack of intent to apply pressure doesn't always equate to a lack of perception of pressure.

I have a friend who advocates "in my religion" as the superior phrase for that reason—we already have clear common-knowledge boundaries around how religion is personal and sort of self-aware/known to be something other people won't pick up. I feel a little squidgy around that one myself, though, because it seems *too* self-deprecating in populations with a high percentage of atheism.

Comment by duncan_sabien on In My Culture · 2019-03-07T18:10:56.933Z · LW · GW

It seems important to me to distinguish between "designed to subtly oppress other people" and "will necessarily have the result of subtly oppressing other people."

It is not the former (and the implication that it is sounds to me like the motte version of a claim whose bailey is "Duncan is lying/manipulative," or like an assertion that you know better than I do what is going on inside my own head).

I am genuinely interested if you can shine light on the latter in a concrete, specific, and here-are-the-lines-of-cause-and-effect sort of way, since subtly oppressing other people is counter to my goals.