Posts

"Dark Constitution" for constraining some superintelligences 2024-01-10T16:02:36.706Z
Is being sexy for your homies? 2023-12-13T20:37:02.043Z
Sapient Algorithms 2023-07-17T16:30:01.350Z
What are brains? 2023-06-10T14:46:46.936Z
Slack matters more than any outcome 2022-12-31T20:11:02.287Z
Where's the economic incentive for wokism coming from? 2022-12-08T23:28:49.904Z
Here's the exit. 2022-11-21T18:07:23.607Z
What is an agent in reductionist materialism? 2022-08-13T15:39:39.013Z
We're already in AI takeoff 2022-03-08T23:09:06.733Z
Signaling isn't about signaling, it's about Goodhart 2022-01-06T18:49:48.534Z
What are sane reasons that Covid data is treated as reliable? 2022-01-01T18:47:38.919Z
Where can one learn deep intuitions about information theory? 2021-12-16T15:47:01.076Z
Creating a truly formidable Art 2021-10-14T04:39:16.641Z
Of Two Minds 2018-05-17T04:34:51.892Z
Noticing the Taste of Lotus 2018-04-27T20:05:23.898Z
Mythic Mode 2018-02-23T22:45:06.709Z
The Intelligent Social Web 2018-02-22T18:55:36.414Z
Kenshō 2018-01-20T00:12:01.879Z
CFAR 2017 Retrospective 2017-12-19T19:38:35.516Z
In praise of fake frameworks 2017-07-11T02:12:32.017Z
Gears in understanding 2017-05-12T00:36:17.086Z
The art of grieving well 2015-12-15T19:55:44.893Z
Proper posture for mental arts 2015-08-31T02:29:01.312Z
Looking for a likely cause of a mental phenomenon 2012-12-01T19:43:32.916Z

Comments

Comment by Valentine on The Dark Arts · 2024-01-10T16:13:59.624Z · LW · GW

I consider ultra-BS a primarily 'central route' argument, as the practitioner uses explicit reasoning to support explicit narrative arguments. […]

Putting someone off balance, on the other hand, is more 'peripheral route' persuasion. There's far more emphasis on the implicit messaging.

Ah! This distinction helped clarify a fair bit for me. Thank you!

 

…I think I might conclude that your implicit primers and vibes are very good at detecting implicit persuasion, which typically but not always has a correlation with dark artsy techniques.

I agree on all counts here. I think I dumped most of my DADA skill points into implicit detection. And yes, the vibes thing isn't a perfect correlation to Dark stuff, I totally agree.

 

Is this example satisfying?

It's definitely helpful! The category still isn't crisp in my mind, but it's a lot clearer. Thank you!

 

Thanks for the response in any case, I really enjoy these discussions! Would you like to do a dialogue sometime? 

I've really enjoyed this exchange too. Thank you!

And sure, I'd be up for a dialogue sometime. I don't have a good intuition for what kind of thing goes well in dialogues yet, so maybe take the lead if & when you feel inspired to invite me into one?

Comment by Valentine on Here's the exit. · 2024-01-10T02:02:37.788Z · LW · GW

Can you spell this out a little more? Did Brent and LaSota employ baloney-disclaimers and uncertainty-signaling in order to bypass people's defenses?

I think Brent did something different from what I'm describing — a bit more like judo plus DOS attacks.

I'm not as familiar with LaSota's methods. I talked with them several times, but mostly before I learned to detect the level of psychological impact I'm talking about with any detail. Thinking back to those interactions, I remember it feeling like LaSota was confidently asserting moral and existential things that threatened to make me feel inadequate and immoral if I didn't go along with what they were saying and seek out the brain hemisphere hacking stuff they were talking about. And maybe even then I'd turn out to be innately "non-good".

(Implied here is a type of Dark hack I find most folk don't have good defenses against other than refusing to reason and blankly shutting down. It works absurdly well on people who believe they should do what they intellectually conclude makes sense to do.)

The thing I was referring to is something I personally stumbled across. IME rationalists on the whole are generally more likely to take in something said in a low-status way. It's like the usual analyze-and-scrutinize machinery kind of turns off.

One of the weirder examples is, just ending sentences as though they're questions? I'm guessing it's because ending each thing with confidence as a statement is a kind of powerful assertion. But, I mean, if the person talking is less confident then maybe what they're saying is pretty safe to consider?

(I'm demoing back & forth in that paragraph, in case that wasn't clear.)

I think LaSota might have been doing something like this too, but I'm not sure.

(As a maybe weird example: Notice how that last sentence is in fact caveated, but it's still confident. I'm quite sure this is my supposition. I'm sure I'm not sure of the implied conclusion. I feel solid in all of this. My impression is, this kind of solidity is a little (sometimes a lot) disturbing to many rationalists (with some exceptions I don't understand very well — like how Zvi and Eliezer can mostly get away with brazen confidence without much pushback). By my models, the content of the above sentence would have been easier to receive if rewritten along the lines of, "I'm really not sure, but based on my really shaky memories, I kinda wonder if LaSota might have been doing something like this too — but don't believe me too much!")

Does that answer what you'd hoped?

Comment by Valentine on The Dark Arts · 2023-12-31T23:49:30.022Z · LW · GW

Yep, I think you're basically right on all counts. Maybe a little off with the atheist fellow, but only because of context I didn't think to share until reading your analysis, and what you said is close enough!

It's funny, I'm pretty familiar with this level of analysis, but I still notice myself thinking a little differently about the bookstore guy in light of what you've said here. I know people do the unbalancing thing you're talking about. (Heck, I used to quite a lot! And probably still do in ways I haven't learned to notice. Charisma is a hell of a drug when you're chronically nervous!) But I didn't think to think of it in these terms. Now I'm reflecting on the incident and noticing "Oh, yeah, okay, I can pinpoint a bunch of tiny details when I think of it this way."

The fact that I couldn't tell whether any of these were "ultra-BS" is the more central point to me.

If I could trouble you to name it: Is there a more everyday kind of example of ultra-BS? Not in debate or politics?

Comment by Valentine on Here's the exit. · 2023-12-30T16:28:55.074Z · LW · GW

I'm gonna err on the side of noting disagreements and giving brief descriptions of my perspective rather than writing something I think has a good chance of successfully persuading you of my perspective, primarily so as to actually write a reply in a timely fashion.

Acknowledged.

 

I don't see this as showing that in all domains one must maintain high offensive capabilities in order to have good defenses.

Oh, uh, I didn't mean to imply that. I meant to say that rejecting attention to military power is a bad strategy for defense. A much, much better defensive strategy is to study offense. But that doesn't need to mean getting good at offense!

(Although I do think it means interacting with offense. Most martial arts fail spectacularly on this point for instance. Pragmatically speaking, you have to have practice actually defending yourself in order to get skillful at defense. And in cases like MMA, that does translate to getting skilled at attack! But that's incidental. I think you could design good self-defense training systems that have most people never practicing offense.)

 

I think these problems aren't that hard once you have community spaces that are willing to enforce boundaries. Over the last few years I've run many events and spaces, and often gotten references for people who want to enter the spaces, and definitely chosen to not invite people due to concerns about ethics and responsible behavior. I don't believe I would've accepted these two people into the spaces more than once or twice at most.

Nice. And I agree, boundaries like this can be great for a large range of things.

I don't think this helps the Art much though.

And it's hard to know how much your approach doesn't work.

I also wonder how much this lesson about boundaries arose because of the earlier Dark exploits. In which case it's actually, ironically, an example of exactly the kind of thing I'm talking about! Only with lessons learned much more painfully than I think was necessary due to their not being sought out.

But also, maybe this is good enough for what you care about. Again, I don't mean to pressure that you should do anything differently.

I'm mostly pushing back against the implication I read that "Nah, our patches are fine, we've got the Dark Arts distanced enough that they're not an issue." You literally can't know that.

 

My position is that most thinking isn't really about reality and isn't truth-tracking, but that if you are doing that thinking then a lot of important questions are surprisingly easy to answer.

Totally agree. And this is a major defense against a lot of the stuff that bamboozles most folk.

 

I think there's a ton of adversarial stuff going on as well, but the primary reason that people haven't noticed that AI is an x-risk isn't because people are specifically trying to trick them about the domain, but because the people are not really asking themselves the question and checking.

I agree — and I'm not sure why you felt this was relevant to say? I think maybe you thought I was saying something I wasn't trying to.

 

(I think there's some argument to be made here that the primary reason people don't think for themselves is because civilization is trying to make them go crazy, which is interesting, though I still think the solution is primarily "just make a space where you can actually think about the object level".)

This might be a crux between us. I'm not sure. But I think you might be seriously underestimating what's involved in that "just" part ("just make a space…"). Attention on the object-level is key, I 100% agree there. But what defines the space? What protects its boundaries? If culture wants to grab you by the epistemic throat, but you don't know how it tries to do so, and you just try to "make a space"… you're going to end up way more confident of the clarity of your thinking than is true.

 

I acknowledge that there are people who are very manipulative and adversarial in illegible ways that are hard to pin down. […] …I think probably there are good ways to help that info rise up and get shared…. I don't think it requires you yourself being very skilled at engaging with manipulative people.

I think there's maybe something of a communication impasse happening here. I agree with what you're saying here. I think it's probably good enough for most cases you're likely to care about, for some reasonable definition of "most". It also strikes me as obvious that (a) it's unlikely to cover all the cases you're likely to care about, and (b) the Art would be deeply enriched by learning how one would skillfully engage with manipulative people. I don't think everyone who wants to benefit from that enrichment needs to do that engagement, just like not everyone who wants to train in martial arts needs to get good at realistic self-defense.

I've said this several times, and you seem to keep objecting to my implied claim of not-that. I'm not sure what's going on there. Maybe I'm missing your point?

 

I do sometimes look at people who think they're at war a lot more than me, and they seem very paranoid and to spend so many cognitive cycles modeling ghosts and attacks that aren't there. It seems so tiring!

I agree. I think it's dumb.

 

I suspect you and I disagree about the extent to which we are at war with people epistemically.

Another potentially relevant point here is that I tend to see large groups and institutions as the primary forces deceiving me and tricking me, and much less so individuals.

Oh! I'm really glad you said this. I didn't realize we were miscommunicating about this point.

I totally agree. This is what I mean when I'm talking about agents. I'm using adversarial individuals mostly as case studies & training data. The thing I actually care about is the multipolar war going on with already-present unaligned superintelligences. Those are the Dark forces I want to know how to be immune to.

I'm awfully suspicious of someone's ability to navigate hostile psychofauna if literally their only defense against (say) a frame controller is "Sus, let's exclude them." You can't exclude Google or wokism or collective anxiety the same way.

Having experienced frame control clawing at my face, and feeling myself become immune without having to brace… and noticing how that skill generalized to some of the tactics that the psychofauna use…

…it just seems super obvious to me that this is really core DADA. Non-cognitive, very deep, very key.

 

  • Personally I would like to know two or three people who have successfully navigated being manipulated, and hopefully have them write up their accounts of that.

Ditto!

 

  • I think aspiring rationalists should maneuver themselves into an environment where they can think clearly and be productive and live well, and maintain that, and not try to learn to survive being manipulated without a clear and present threat that they think they have active reason to move toward rather than away from.

Totally agree with the first part. I think the whole thing is a fine choice. I notice my stance of "Epistemic warriors would still be super useful" is totally unmoved thus far though. (And I'm reminded of your caveat at the very beginning!)

I'm reminded of the John Adams quote: "I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine."

 

I note that when I read your comment I'm not sure whether you're saying "this is an important area of improvement" or "this should be central to the art", which are very different epistemic states.

Oh, I don't know what should or shouldn't be central to the Art.

It just strikes me that rationality currently is in a similar state as aikido.

Aikido claims to be an effective form of self-defense. (Or at least it used to! Maybe it's been embarrassed out of saying that anymore?) It's a fine practice, it has immense value… it's just not what it says on the tin.

If it wanted to be what it claims, it needs to do things like add pressure testing. Realistic combat. Going into MMA tournaments and coming back with refinements to what it's doing.

And that could be done in a way that honors its spirit! It can add the constraints that are key to its philosophy, like "Protect everyone involved, including the attacker."

But maybe it doesn't care about that. Maybe it just wants to be a sport and discipline.

That's totally fine!

It does seem weird for it to continue claiming to be effective self-defense though. It's as though it needs that false claim to be something its practitioners believe in.

I think rationality is in a similar state. It has some really good stuff in it. Really good. It's a great domain.

But I just don't see it mattering for the power plays. I think rationalists don't understand power, the same way aikido practitioners don't understand fighting. And they seem to be in a similar epistemic state about it: they think they basically do, but they don't pressure-test their understanding to check, best as I can tell.

So of your two options, it's more like "important area for improvement"… roughly like pressure-testing could be an important area of improvement for aikido. It'd probably become kind of central if it were integrated! But I don't know.

And, I think the current state of rationality is fine.

Just weak in one axis it sometimes claims to care about.

Comment by Valentine on Here's the exit. · 2023-12-29T20:42:29.487Z · LW · GW

Well, that particular comment had a lot of other stuff going on…

That's really not a central example of what I meant. I meant more like this one. Or this one.

 

But also, yeah, I do kinda feel like "downvoting people when they admit they did something bad" is a thing we sometimes do here and that's not great incentives. If someone wants to avoid that kind of downvote, "stop admitting to the bad thing" seems like an obvious strategy. Oops! And like, I remember times when I asked someone a question and they got downvoted for their answer, and I did think it was a bad answer that in a vacuum deserved downvotes, but I still upvoted as thanks for answering.

Yep. This is messy and unfortunate, I agree.

 

Someone might not have realized the thing they did was bad-according-to-LW, and the downvotes help signal that.

It's not possible to take the downvotes as a signal of this if downvotes get used for a wide range of things. If the same signal gets used for

"This was written in bad form, but if you'd written it differently it would have been welcome"

and

"Your attitude doesn't belong on this website, and you should change it or leave"

and

"I don't like your vibe, so I'm just gonna downvote"

then the feedback isn't precise enough to be helpful in shaping behavior.

 

If someone did a bad thing and doesn't care, maybe we just don't want them here.

True.

Although if the person disagrees with whether it was bad, and the answer to that disagreement is to try to silence them… then that seems to me like a pretty anti-epistemic norm. At least locally.

 

I'd also really like to see a return of the old LW cultural thing of, if you downvote then you explain why. There are some downvotes on my comments that I'm left scratching my head about and going "Okay, whatever." It's hard for downvotes to improve culture if the feedback amounts to "Bad."

I think there's currently too many things that deserve downvotes for that to be realistic.

I have a hard time believing this claim. It's not what I see when I look around.

The dynamic would be pretty simple:

  • After I downvote, I skim the replies to see if someone else already explained what had me do the downvote. If so, I upvote that explanation and agree-vote it too.
  • If there's no such explanation, I write one.

Easy peasy. I seriously doubt the number of things needing downvotes on this site is so utterly overwhelming that this approach is untenable. The feedback would be very rich, the culture well-defined and transparent.

I don't know why LW stopped doing this. Once upon a time it used to cost karma to downvote, so people took downvotes more seriously. I assume there was some careful thought put into changing that system to the current one. I haven't put more than a sum total of maybe ten minutes of thinking into this. So I'm probably missing something.

But without knowing what that something is, and without a lot of reason for me to invest a ton more time into figuring it out… my tentative but clear impression is that what I'm describing would be way better for culture here by a long shot.

Comment by Valentine on Here's the exit. · 2023-12-29T20:15:00.192Z · LW · GW

…I think another pretty good option is "a master rationalist would definitely avoid surrounding themselves with con artists and frauds and other adversarial actors".

I think that's a great option. I'd question a "master rationalist's" skills if they couldn't avoid such adversarial actors, or notice them if they slip through the cracks.

 

I do think there are real skills you are pointing to, but to some extent I prefer the world where I don't have those skills and in place of that my allies and I coordinate to identify and exclude people who are using the dark arts.

I like your preference. I'll say some things, but I want to start by emphasizing that I don't think you're making a wrong or bad choice.

I want to talk about what I think the Art could be, kind of for aesthetic reasons. This isn't to assert anything about what you or any given individual should or shouldn't be doing in any kind of moral sense.

So with that said, here are three points:

 

(1) I think there's a strong analogy here to studying combat and war. Yes, if you can be in a pacifist cluster and just exclude folk who are really into applied competitive strategy, then you have something kind of like a cooperate/cooperate equilibrium. But if that's the whole basis of your culture, it's extremely vulnerable, the way cooperate-bot is vulnerable in prisoners' dilemmas. You need military strength, the way a walled garden needs walls. Otherwise folk who have military strength can just come take your resources, even if you try to exclude them at first.

At the risk of using maybe an unfair example, I think what happened with FTX last year maybe illustrates the point.

Clearer examples in my mind are Ziz and Brent. The point isn't "These people are bad!" Rather, it's that these people were psychologically extremely potent and lots of folk in the community could neither (a) adequately navigate their impact (myself included!) nor (b) rally ejection/exclusion power until well after they'd already had their impact.

Maybe, you might hope, you can make the ejection/exclusion sensitivity refined enough to work earlier. But if you don't do that by studying the Dark Arts, and becoming intimately familiar with them, then what you get is a kind of naïve allergic response that Dark Artists can weaponize.

Again, I don't mean that you in particular or even rationalists in general need to address this. There's nothing wrong with a hobby. I'm saying that as an Art, it seems like rationality is seriously vulnerable if it doesn't include masterful familiarity with the Dark Arts. Kind of like, there's nothing wrong with practicing aikido as a sport, but you're not gonna get the results you hope for if you train in aikido for self-defense. That art is inadequate for that purpose and needs exposure to realistic combat to matter that way.

 

(2) …and I think that if the Art of Rationality were to include intimate familiarity with the Dark Arts, it would work way way better.

Things like the planning fallacy or confirmation bias are valuable to track. I could stand to improve my repertoire here for sure.

But the most potent forms of distorted thinking aren't about sorting out the logic. I think they look more like reaching deep down and finding ways to become immune to things like frame control.

Frame control is an amazing example in my mind precisely because of the hydra-like nature of the beast. How do you defend against frame control without breaking basic things about culture and communication and trust? How do you make it so your cultural and individual defenses don't themselves become the manual that frame controllers use to get their desired effects?

And this barely begins to touch on the kind of impact that I'd want to call "spiritual". By which I don't mean anything supernatural; I'm talking about the deep psychological stuff that (say) conversing with someone deep in a psilocybin trip can do to the tripper. That's not just frame control. That's something way deeper, like editing someone's basic personality operating system code. And sometimes it reaches deeper even than that. And it turns out, you don't need psychedelics to reach that deep; those chemical tools just open a door that you can open other ways, voluntarily or otherwise, sometimes just by having a conversation.

The standard rationalist defense I've noticed against this amounts to mental cramping. Demand everything go through cognition, and anything that seems to try to route around cognition gets a freakout/shutdown/"shame it into oblivion" kind of response. The stuff that disables this immune response is really epistemically strange — things like prefacing with "Here's a fake framework, it's all baloney, don't believe anything I'm saying." Or doing a bunch of embodied stuff to act low-status and unsure. A Dark Artist who wanted to deeply mess with this community wouldn't have to work very hard to do some serious damage before getting detected, best as I can tell (and as community history maybe illustrates).

If this community wanted to develop the Art to actually be skillful in these areas… well, it's hard to predict exactly what that'd create, but I'm pretty sure it'd be glorious. If I think of the Sequences as retooling skeptical materialism, I think we'd maybe see something like a retooling of the best of Buddhist psychotechnology. I think folk here might tend to underestimate how potent that could really be.

(…and I also think that it's maybe utterly critical for sorting out AI alignment. But while I think that's a very important point, it's not needed for my main message for this exchange.)

 

(3) It also seems relevant to me that "Dark Arts" is maybe something of a fake category. I'm not sure it even forms a coherent cluster.

Like, is being charismatic a Dark Art? It certainly can be! It can act as a temptation. It seems to be possible to cultivate charisma. But the issue isn't that charisma is a Dark Art. It's that charisma is mostly symmetric. So if someone has a few slightly anti-epistemic social strategies in them, and they're charismatic, this can have a net Dark effect that's even strategic. But this is a totally normal level of epistemic noise!

Or how about something simpler, like someone using confirmation bias in a way that benefits their beliefs? Astrology is mostly this. Is astrology a Dark Art? Is talking about astrology a Dark Art? It seems mostly just epistemically hazardous… but where's the line between that and Dark Arts?

How about more innocent things, like when someone is trying to understand systemic racism? Is confirmation bias a helpful pattern-recognizer, or a Dark Art? Maybe it's potentially in service to Dark Arts, but is a necessary risk to learn the patterns?

I think Vervaeke makes this point really well. The very things that allow us to notice relevance are precisely the things that allow us to be fooled. Rationality (and he explicitly cites this — even the Keith Stanovich stuff) is a literally incomputable practice of navigating both Type I and Type II errors in this balancing act between relevance realization and being fooled.

When I think of central examples of Dark Arts, I think mostly of agents who exploit this ambiguity in order to extract value from others.

…which brings me back to point (1), about this being more a matter of skill in war. The relevant issue isn't that there are "Dark Arts". It's that there are unaligned agents who are trying to strategically fool you. The skill isn't to detect a Dark toolset; it's to detect intelligent intent to deceive and extract value.

 

All of which is to say:

  • I think a mature Art of Rationality would most definitely include something like skillful navigation of manipulation.
  • I don't think every practitioner needs to master every aspect of a mature Art. Much like not all cooks need to know how to make a roux.
  • But an Art that has detection, exclusion, & avoidance as its only defense against Dark Artists is a much poorer & more vulnerable Art. IMO.

Comment by Valentine on The Dark Arts · 2023-12-29T18:33:50.270Z · LW · GW

The unspoken but implicit argument is that Russia doesn't need a reason to nuke us. If we give them the Arctic there's no question, we will get nuked.

Ah, interesting, I didn't read that assumption into it. I read it as "The power balance will have changed, which will make Russia's international bargaining position way stronger because now it has a credible threat against mainland USA."

I see the thing you're pointing out as implicit though. Like an appeal to raw animal fear.

 

For a successful nuclear first strike to be performed Russia must locate all of our military assets (plus likely that of our NATO allies as well), take them all out at once, all while the CIA somehow never gets wind of a plan.

That makes a lot of sense. I didn't know about the distributed and secret nature of our nuclear capabilities… but it's kind of obvious that that's how it'd be set up, now that you say so. Thank you for spelling this out.

 

Reactions like yours are thus part of what I was counting on when making the argument. It works because in general I can count on people not having prior knowledge. (don't worry, you're not alone)

Makes sense!

And I wasn't worried. I'm actually not concerned about sounding like (or being!) an idiot. I'm just me, and I have the questions I do! But thank you for the kindness in your note here.

 

It also seems rather incongruous with most people's model of the world […]. Suppose Russia was prepared to nuke the US, and had a credible first strike capability. Why isn't Uncle Sam rushing to defend his security interests? Why haven't pundits and politicians sounded the alarm? Why has there been no diplomatic incidents? A second Cuban missile crisis? A Russian nuclear attack somewhere else?

I gotta admit, my faith in the whole system is pretty low on axes like this. The collective response to Covid was idiotic. I could imagine the system doing some stupid things simply because it's too gummed up and geriatric to do better.

That's not my main guess about what's happening here. I honestly just didn't think through this level of thing when I first read your Arctic argument from your debate. But collective ineptitude is plausible enough to me that the things you're pointing out here just don't land as damning.

But they definitely are points against. Thank you for pointing them out!

 

I hope that answers your question! Is everything clear now?

For this instance, yes!

There's some kind of generalization that hasn't happened for me yet. I'm not sure what to ask exactly. I think this whole topic (RE what you're saying about Dark Arts) is bumping into a weak spot in my mind that I wasn't aware was weak. I'll need to watch it & observe other examples & let it settle in.

But for this case: yes, much clearer!

Thank you for taking the time to spell all this out!

Comment by Valentine on The Dark Arts · 2023-12-29T18:20:19.635Z · LW · GW

Do you mind providing examples of what categories and indicators you use?

I can try to provide examples. The indicators might be too vague for the examples to help much with though!

A few weeks ago I met a fellow who seems to hail from old-guard atheism. Turn-of-the-century "Down with religion!" type of stuff. He was leading a philosophy discussion group I was checking out. At some point he said something (I don't remember what) that made me think he didn't understand what Vervaeke calls "the meaning crisis". So I brought it up. He started going into a kind of pressured debate mode that I intuitively recognized from back when I swam in activist atheism circles. I had a hard time pinning down the moves he was doing, but I could tell I felt a kind of pressure, like I was being socially & logically pulled into a boxing ring. I realized after a few beats that he must have interpreted what I was saying as an assertion that God (as he thought others thought of God) is real. I still don't know what rhetorical tricks he was doing, and I doubt any of them were conscious on his part, but I could tell that something screwy was going on because of the way interacting with him became tense and how others around us got uneasy and shifted how they were conversing. (Some wanted to engage & help the logic, some wanted to change the subject.)

Another example: Around a week ago I bumped into a strange character who runs a strange bookstore. A type of strange that I see as being common to Vassar and Ziz and Crowley, if that gives you a flavor. He was clearly on his way out the door, but as he headed out he directed some of his… attention-stuff… at me. I'm still not sure what exactly he was doing. On the surface it looked normal: he handed me a pamphlet with some of the info about their new brick-and-mortar store, along with their online store's details. But there was something he was doing that was obviously about… keeping me off-balance. I think it was a general social thing he does: I watched him do it with the young man who was clearly a friend to him and who was tending the store. A part of me was fascinated. But another part of me was throwing up alarm bells. It felt like some kind of unknown frame manipulation. I couldn't point at exactly how I was being affected, but I knew that I was, because my inner feet felt less firmly on inner ground in a way that felt somehow strategic.

More blatantly, the way that streetside preachers used to find a corner on college campuses and use a loudspeaker to spout off fundamentalist literalist Christianity memes. It's obvious to me now that the memetic strategy here isn't "You hear my ideas and then agree." It's somehow related to the way that it spurs debate. Back in my grad school days, I'd see clusters of undergrads surrounding these preachers and trying to argue with them, both sides engaging in predetermined patter. It was quite strange. I could feel the pull to argue with the preacher myself! But why? It has a snare trap feeling to it. I don't understand the exact mechanism. I might be able to come up with a just-so story. But looking back it's obvious that there's a being-sucked-in feeling that's somehow part of the memetic strategy. It's built into the rhetoric. So a first-line immune response is "Nope." Even though I have little idea what it is that I'm noping out of. Just its vibe.

I don't think all (any?) of these fall under what you're calling "ultra-BS". That's kind of my point: I think my rhetoric detector is tracking vibes more than techniques, and you're naming a technique category. Something like that.

I think this part stands alone, so I'll reply to the rest separately.

Comment by Valentine on The Dark Arts · 2023-12-25T16:01:00.377Z · LW · GW

Thank you. I found this exchange very enriching.

In particular, it highlights a gap in my way of reasoning. I notice that even after you give examples, the category of "ultra-BS" doesn't really gel for me. I think I use a more vague indicator for this, like emotional tone plus general caution when someone is trying to persuade me of something.

In the spirit of crisping up my understanding, I have a question:

Now, I understand I sound obviously crazy already, but hear me out. Russia's Kinzhal hypersonic missiles, which have a range of roughly 1,000 miles, cannot hit the US from the Russian mainland. But they can hit us from the Arctic. I add that hypersonic missiles are very, very fast. [This essentially acts as a preemptive rebuttal to my opponent's counterargument (but what about MAD?).] If we're destroyed by a first strike, there is no MAD, and giving Russia the Arctic would immediately be an existential threat.

Of course, this is ridiculous…

I think I'm missing something obvious, or I'm missing some information. Why is this clearly ridiculous?

Comment by Valentine on Kenshō · 2023-12-21T15:55:09.495Z · LW · GW

Isn’t this an ironic choice of metaphor? The situation rather more resembles you insisting that it’s your daughter’s arm, being certain of this despite many other people thinking that you’re not quite in touch with reality, being impervious to demonstrations or proofs that it’s your arm, etc.

Of course it's not ironic. What do you think the patient must think about the doctor's certainty?

Comment by Valentine on Here's the exit. · 2023-12-21T15:47:05.240Z · LW · GW

…the current site culture, moderation policies, etc., actively discourage such explanations.

How so? What's the discouragement? I could see people feeling like they don't want to bother, but you make it sound like there's some kind of punishment for doing so…?

Comment by Valentine on Here's the exit. · 2023-12-21T15:44:22.598Z · LW · GW

I'd also really like to see a return of the old LW cultural thing of, if you downvote then you explain why. There are some downvotes on my comments that I'm left scratching my head about and going "Okay, whatever." It's hard for downvotes to improve culture if the feedback amounts to "Bad."

For instance, my review has been pretty heavily downvoted. Why? I can think of several reasons. But the net effect is to convey that LW would rather not have seen such a review.

Now why would that be?

I notice that there's also a -16 on the agree/disagree voting, with just three votes. So I'm guessing that what I said seriously irked a few people who probably heavy-downvoted the karma too.

But if it's really a distributed will, it's curious. Do you really want me not to have shared more context? Not to have reflected on where I'm at with the post? Or is it that you want me to feel differently about the post than I do?

I guess I don't get to know!

It's worth remembering that karma downvoting has a technical function. Serious negative karma makes a comment invisible by default. A user who gets a lot of negative karma in a short period of time can't post comments for a while (I think?). A user who has low karma overall can't post articles (unless that's changed?).

So a karma downvote amounts to saying "Shut up."

And a strong-downvote amounts to saying "Shut the fuck up."

If that's really the only communication the whole culture encourages for downvotes… that doesn't really foster clarity.

It seems dead obvious to me that this aspect of conversation culture here is quite bad.

But this isn't a hill I intend to die on.

Comment by Valentine on Kenshō · 2023-12-20T03:51:54.743Z · LW · GW

…it looks like Valentine is never going to write the promised post…

It was Mythic Mode. I guess that went over everyone's heads.

I had a sequence in mind, on "ontology cracking". I gave up on that sequence when it became obvious that Less Wrong really wasn't interested in that direction at all. So I ended up never describing how I thought mythic mode worked on me, and how it might generalize.

But honestly, Mythic Mode has all the ingredients you need if you want to work it out.

It also seems worth noting that I've gotten way more PCK on the whole thing since then, and now I have approaches that are a fair bit more straightforward. More zen-like. Kinder. So the approach I advocate these days feels different and is more grounded & stable.

I might try to share some of that at some point.

 

This might be related to his statement in a followup discussion that he is unable to provide any cake. (It is an odd discussion, I think, and reading Valentine's attempts to comment there remind me of Kennaway's comment.)

You seem to have quite missed the point of that exchange.

But honestly, I'm tired of arguing with logic machines about this. No, I cannot prove to you that it's not your daughter's arm. No, that fact does not cause me to question my certainty that it's not your daughter's arm. Yes, I understand you think I'm crazy or deluded. I am sorry I don't know how to help you; it is beyond my skill, and my human heart hurts for being so misunderstood so much here.

Comment by Valentine on Here's the exit. · 2023-12-19T15:59:38.218Z · LW · GW

As an aside, looking over the way some of my comments were downvoted in the discussion section:

I think LW could stand to have a clearer culture around what karma downvotes are for.

Now that downvotes are separable from disagreement votes, I read a downvote as "This comment shouldn't have been posted / doesn't belong on LW."

But it's clear that some of what I said was heavily downvoted because I took a stance people didn't like. Saying things like "Yep, I could have phrased this post in a more epistemically accurate way… but for this post in particular I really don't care."

Would you really rather I didn't share the fact that I didn't care?

I'm guessing the intention was to punish me for not caring.

…which is terrible collective rationality, by the way! It's an attempt to use social-emotional force to change how my mind works without dialoguing with the reasons I'm making the choices I am.

(Which is ironic given the nature of the complaints about this post in particular!)

I'd argue that the right and good function of downvoting is to signal an opinion that a post or comment does not belong here.

That's how I use it. And until I'm given good reason otherwise, that's how I plan to continue using it.

I'd also really like to see a return of the old LW cultural thing of, if you downvote then you explain why. There are some downvotes on my comments that I'm left scratching my head about and going "Okay, whatever." It's hard for downvotes to improve culture if the feedback amounts to "Bad."

(But this really is an aside. It doesn't matter at all for the 2022 review. It's not really about this particular post either. It just has some very loud-to-me examples of the downvote behavior I think is unhealthy.)

Comment by Valentine on Here's the exit. · 2023-12-19T15:32:09.381Z · LW · GW

It's kind of funny to me to see this one nominated. It's sort of peak "Val is weird on LW".

The point of this post wasn't to offer claims for people to examine. I still agree with the claims I see myself having tried to make! But the point wasn't to offer ideas for discussion. It was to light a path out of Hell.

Because of that purpose, the style of this post really doesn't fit LW culture. I think it's fair to call it a mind spell. I get the impression that LWers in particular find mind spells unnerving: they're a symmetric tech that can do an end-run around the parts of cognition that rationalists heavily rely on to feel safe. Hence tripping the "cult"/"guru" immune reaction.

(To me it's dead obvious that this highlights a gap in the LW rationality toolbox. The reaction of "Lock down, distrust, get cynical, burn it with fire" actually makes you more susceptible to skillful bad actors — like going rigid in response to a judo master grabbing a hold of you. IMO, a mature Art of Rationality would necessarily include learning to navigate cognition-jamming (or cognition-incompatible!) spaces with grace. But I get the sense LW collectively doesn't want to build that skillset. Which is fine, but I find it a bit disappointing.)

I picked up some of the language & framing of this post from Perri Chase. I now talk about this stuff a little differently. And more kindly, I think. I suspect I could write a version of this spell today that would be less of a problem for the LW memetic immune system. Partly because I'm better at slipping through immune systems! (I'm sure that's comforting!) But mostly because I've learned how to work with such systems instead of needing to step around them to have the "real" conversation.

That said, I don't regret writing this post. I got a lot of feedback (including in quite a few PMs across many different media) from people who found this relieving, validating, soothing, deeply helpful, kind, orienting. I'm okay with some people being upset with me if that's the price for enacting this kindness. I went in expecting that price, really.

I think there's a post possible that would be something like a LW-compatible rewrite of this one. It'd remove the "spell" nature and try to lay out some claims & implications for folk to consider. A bit like dissecting a once-living specimen and laying out its organs for examination.

I probably won't write that post. I don't see it doing much good beyond being kind of interesting.

I might write a related post sometime on the nature of Hell as a psychosocial attractor state. AFAICT it's utterly essential study for real Defense Against the Dark Arts. It's also very tricky to talk about in a way that's kind to the listeners or the speaker. But if LW were to learn to take it seriously without falling into it harder, I think that awareness would transform a lot of what "rationality" means here, and it would soften a lot of the sharp edges that can meaningfully hurt people here.

I don't plan on rewriting any of this post for the review. The spell worked great. I want to leave it here as is.

(Though if someone understands the spellcraft and wants to suggest some edits, I'm open to receiving those suggestions! I'm not putting up a wall here. I'm just sharing where I'm at with this post right now, for the sake of the 2022 review.)

Comment by Valentine on Here's the exit. · 2023-12-19T14:28:13.376Z · LW · GW

I like the tone of this review. That might be because it scans as positive about something I wrote! :D But I think it's at least in part because it feels clear, even where it's gesturing at points of improvement or further work. I imagine I'd enjoy more reviews written in this style.

 

I would be interested to see research done to test the claim. Does increased sympathetic nervous system activation cause decreased efficacy [at AI research]?

If folk can find ways of isolating testable claims from this post and testing them, I'm totally for that project.

The claim you name isn't quite the right one though. I'm not saying that people being stressed will make them bad at AI research inherently. I'm saying that people being in delusion will make what they do at best irrelevant for solving the actual problem, on net. And that for structural reasons, one of the signs of delusion is having significant recurring sympathetic nervous system (SNS) activation in response to something that has nothing to do with immediate physical action.

The SNS part is easy to measure. Galvanic skin response, heart rate, blood pressure, pupil dilation… basically hooking them up to a lie detector. But you can just buy a GSR meter and mess with it.

I'm not at all sure how to address the questions of (a) identifying when something is unrelated to immediate physical action, especially given the daughter's arm phenomenon; or (b) whether someone's actions on net have a positive effect on solving the AI problem.

E.g., it now looks plausible that Eliezer's net effect was to accelerate AI timelines while scaring people. I'm not saying that is his net effect! But I'm noting that AFAIK we don't know it isn't.

I think it would be extremely valuable to have some way of measuring the overall direction of some AI effort, even in retrospect. Independent of this post!

But I've got nuthin'. Which is what I think everyone else has too.

I'd love for someone to prove me wrong here.

 

A sequence or book compiled from the wisdom of many LessWrongers discussing their mental health struggles and discoveries would be extremely valuable to the community (and to me, personally)…

This is a beautiful idea. At least to me.

Comment by Valentine on Here's the exit. · 2023-12-19T14:27:54.770Z · LW · GW

Comment by Valentine on Is being sexy for your homies? · 2023-12-16T16:57:18.852Z · LW · GW

That's the narrative for sure. I wonder if it's mostly just a stale holdover and doesn't really apply though.

Like, misandry is vastly more blatant and serious these days from what I can tell. Getting emotional or social support as a man is a joke. There's a whole totally weirdly okay joke set that basically goes "What are women better at than men? XYZ…. What are men better at than women? Stupid pointless stuff, being wrong, yada yada, hahaha!"

There's a ton of stuff like this, like with child custody & paternity, or suicide patterns… but all this gets shoved into an eyerolling box of "MRA" or whatever. So it's un-talk-about-able.

I wonder if men are actually way more restricted in what they can do these days than women are. I don't know. But it sure seems plausible to me!

So I question whether it's really an anti-women pressure. I suspect it's more like, there's gender warfare going on, and we seem to have figured out how to culturally attack one direction of it pretty well, but we haven't stopped the war.

And having the suggested solution be even more women's rights just… doesn't seem like it's looking at the real problem.

At least to me.

Comment by Valentine on Upgrading the AI Safety Community · 2023-12-16T16:42:57.107Z · LW · GW

Especially if people like @Valentine are called upon to return from their cold sleep because the world needs them.

Double-click? I'm wondering what you mean by "cold sleep" here.

Comment by Valentine on Is being sexy for your homies? · 2023-12-16T02:54:35.878Z · LW · GW

FWIW, I meant something less like "Pretend it doesn't matter to you personally, please don't feel emotional responses" and more like "There's zero intention of attacking something precious here, I hope you can feel that and can engage in a way that's not attack-and-defend; let's honor all the precious things together in our pursuit of truth."

Comment by Valentine on Is being sexy for your homies? · 2023-12-16T02:50:51.663Z · LW · GW

Yeah, this matches my experience.

Comment by Valentine on Is being sexy for your homies? · 2023-12-16T02:44:48.195Z · LW · GW

...reads like a mistake a feminist would not have made.

I guess maybe I'm not whatever you mean by "a feminist" then…?

I read you as meaning something a little like "You should have known better. You would have if you'd been the right kind of person. So you're the wrong kind of person."

I mean… okay? Sure? I guess you can believe that if you want?

But also… doesn't that make the conversation harder?

(And sorry if I'm misreading you here. I don't mean to trap you in a meaning you didn't intend if I'm missing you here. It just seems worth naming explicitly in case I am roughly catching your emotional tone right.)

 

(Implicit assumption: "postmodernism" was coined in 1980 and "postmodern feminism" the mid-90s, and most people who talk about gender ideology date it to the last 10-20 years, so I'm assuming that's the time period you're referring to by "last many years".)

I didn't mean anything formal. I was mostly reflecting on how #metoo seemed to imply women feeling pretty unsafe in workplaces and everywhere else for quite a while. And in the wake of #metoo guys feeling like their own sexuality was like Russian roulette.

So I guess I was gesturing at roughly the last decade or so.

I didn't mean to talk about {postmodern feminism} by the way. I meant "postmodern" and "feminist" as two separate adjectives.

 

I think separating the sexes into distinct classes ("kitchen staff are one sex and serving staff are another") wouldn't output a separate-but-equal situation; it would instead output a society that subjugates women overtly (again).

I'm really not sure. I don't think there's a fundamental "subjugate women" drive. If we were to implement this kind of segregation today, it'd have the benefit of a very different context.

That said, I do agree that "separate but equal" is a crazy myth. That doesn't make much sense. If they're equal, why separate them? The whole point is that they're not equal. Not in all ways. E.g., if we had jobs explicitly separated by gender, then obviously things involving lifting heavy things by hand (e.g. certain kinds of construction) should be male.

Part of the problem has long been that things traditionally coded male have also been more economically valued. We don't economically value raising an emotionally healthy child the way we value creating a million-dollar company.

If we don't change those incentive structures, then yeah, economic separation by sex might end up with some old unkindness.

Comment by Valentine on Is being sexy for your homies? · 2023-12-16T02:17:05.519Z · LW · GW

The bonobos apparently use sex to strengthen bonds, but your argument is about strengthening bonds through non-sex with your non-sexually-compatible friends, so idk how those are related

Ah yeah, oops, I noticed that possible confusion and forgot to say something about it.

The fact that the bonobos use sex to reassure each other is purely incidental to why it came to mind for me. The structure of interest was more "Our tribe just encountered a potentially rare resource, so let's focus on reaffirming our tribal bonds before we even orient to the resource."

Like for men, they could focus on just maximizing appeal to women… but that'd heat up competition between them. So maybe instead there's a draw to affirming male bonds. Being useful to other men. Working on being a more functional member of the male cluster.

Likewise for women. The main factor in picking a mate isn't getting a guy to want to have sex with her. It's making sure she's well-supported while having children. If there's competition between the women for attracting a specific man, that can create rancor in their ranks, and that can weaken all their children's support. So there's maybe a natural draw to focus on bonding with other women first precisely because they're the competition. Slightly different dynamic than with the men, but roughly the same overall effect.

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T20:53:01.636Z · LW · GW

You can't say I'm defecting after I'm below zero.

Uh… that's not how "defection" works.

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T20:26:06.829Z · LW · GW

I feel like you have some implicit additional assumptions WRT what you mean by "stable", here.

True!

I think I meant mostly an intuition about how sexual stuff adds drama that isn't relevant to (say) baking.

 

I also have the intuition that a single-gender environment would be less stable in the sense of being somehow "more stale" and "less alive" than a mixed-gender one, and thus less stable in the long-term…

Huh. Well, I guess it depends a lot on the social scene!

Like, I don't think a football team would feel more alive if you mixed in girls. Even if you somehow navigate the thing about major physical differences between the sexes. There's something about the way the locker room culture there can be masculine that's actually part of the bonding. And I think having a girl or two mixed in there could add some rivalry that'd have to be sorted out to be a functional team!

But yeah, if it's a group of programmers, it might actually work better to have mixed sexes. Vague intuition here. Though I notice that I picked this example in part because physical sex can be made way, way less relevant in that context. (E.g., it's possible to have a team of programmers that don't even know one another's sex and interact purely remotely and over text. That's just not gonna work in a football team.)

All of what I'm saying here is spitballing and not very careful. Just playing with ideas. Thanks for pointing out the questions here!

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T20:18:34.671Z · LW · GW

Uh… maybe?

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T20:16:46.350Z · LW · GW

FWIW, my experience on this was… mixed.

My easiest time having female friends was in an implicitly monogamous context, when I was married, and my wife and I were exclusive. It was super easy. Like a switch in my brain could just filter out the attraction question. It's like it was addressed for all women the way it's always addressed for all men.

It became way messier when she & I opened up our marriage. Then the sexual dynamic between me and her felt to me like it depended on whether I could find other female partners. I don't know if she really felt this way! But for me there was a real concern: When we were exclusive, other women not being into me was just expected. But when we were open, I feared other women not being into me was a sign she should focus on mating with other guys.

So there was a sense, for me, of increased pressure that I needed to find more partners even if my wife was the only woman I was interested in!

This increased stress on my female relationships.

Now, in an implicitly poly context, this isn't a huge problem. "Might we fuck?" is a lot more okay a question to explore.

But it became a question we had to explore, basically every time, at least on my end, at least implicitly.

I find it's easier to have friendships with poly women now that I've set poly aside… because their being poly puts them out of the market for me.

And none of this is to dismiss your experience! I bet if I were more sexually confident, and happier being poly, I might feel the same way you do.

I'm just offering some counterpoint.

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T20:05:15.767Z · LW · GW

I think the point is that women are clearly optimizing way harder for female approval of their looks than they are male approval.

This article is pretty wild to read in this context. I think it has some Hell Realm memetic code embedded in it, & LW is kind of awful at navigating Hell Realm memetics, so I kinda hesitate to point at it here… but with that caveat: it's just fascinating that here's an article spelling out how to maximally appeal to the male gaze, focused on some sincere attempts at data, and the apparent female reaction is disgust and eyerolling and attempts to censor?

(It's possible that the female reaction is actually to Hell Realm code, not so much to the optimize-for-male-gaze thing. I bet that's at least a factor. But it's still interesting that the rejection shows up this way!)

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T20:01:04.228Z · LW · GW

Oh yeah, I read this article some time ago! It probably affected my thinking here.

I also heard Louise Perry make comments pointing out something similar recently.

I don't really get to claim a lot of originality here. Maybe my Great Insight™ is that there's an analogy between the way women focus on beauty and the way men focus on getting big.

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T19:56:19.142Z · LW · GW

That's an interesting point. Thank you, I hadn't thought about it before. I'm not sure what it implies but it's nice to have noticed.

Comment by Valentine on Is being sexy for your homies? · 2023-12-14T16:53:45.081Z · LW · GW

Ditto. Gears, I didn't downvote your comments until you deleted them. It's now hard to see why I wrote what I did. I think that's bad form.

That said, I read you (Gears) as being overwhelmed here. I'm guessing you wanted to delete your comments because you're both hurting and feeling unseen/unsupported. Pulling out, including deleting your comments, totally makes sense to me in the context I'm imagining you're in.

In the future, if you have to do that, I think it would be kinder to make some kind of note about that in the deleted comment.

Even better would be to use strikethrough with a comment saying something like "This is beyond my capacity to keep engaging in, I'll leave this here so others can understand the exchange, but I'm checking out and ask not to be pulled in or expected to reply."

But I also recognize this is an intense and painful topic for you. I understand if all of those options are out of emotional range for you.

But in case they're not out of range in the future, that'd at least have avoided my and Said's downvotes!

Comment by Valentine on Is being sexy for your homies? · 2023-12-13T21:33:07.784Z · LW · GW

Hmm. I'm guessing we're talking about slightly different parts of culture here, in a spot that's highly sensitive to you. I don't know if we're going to sort through this. But I'll try a little.

I don't know much about indigenous cultures with gender transitions. You have way more incentive to read up on that than I have. So you probably know about way more cases than I do.

However, I'm quite sure, on evolutionary grounds, that biological sex still has to be a major factor in those cultures. Not just what sex people identified with. The actual biological question of which pairs can produce babies when they have sex, and of those pairs which one can get pregnant.

This point gets warped a lot because the obvious reality of biological sex can be used as an attack on trans folks' needs.

That sucks. I wish that didn't happen.

But also, there's still a biological reality. Even if there are exceptions like XXY or whatever. Humans exist because males mated with females, getting the latter pregnant. That's obviously the main procreative force here!

What I'm saying is "extremely bizarre" is how today there's a push to pretend that this biological question doesn't matter. Or that it's incidental. That what physical sex someone is doesn't play a role in how they fit into the social web. That we can just mix people together in an office without regard for sex, and patch with some formal rules about "sexual harassment" or whatever, and call it good.

As far as I know, that is extremely weird. It's not something human cultures generally do or have done, to the best of my knowledge. Maybe occasionally for specific individuals, like "Joe is now Joanne." But not as part of the social fabric.

None of this is meant to deny your claim to being a woman.

Although it would be to deny a claim that you're a biological female. I'd say there's a meaningful biological difference between you and a cis woman — which is why there's a need to distinguish between "cis" and "trans".

I'm sorry if that scans as offensive. I don't mean it that way.

(But it is true. I'll stand by it even if it's offensive.)

What I'm hearing you say is, many human cultures used to view transitioning with more acceptance than we do today. That seems totally plausible to me.

I'm just guessing that even in those cultures, they didn't try to just assert that men and women are interchangeable and should be treated identically. Transitioning between genders probably meant transitioning social roles. That the men do some things and the women do other things, and you're switching which kinds of gendered things you're doing and with whom.

Whereas today, you can transition while working in some corporate office, and just… change nothing. Your appearance, how you want people to refer to you, sure. But what you do? And with whom? Your role in the social web? There's a weird game of pretend here where we're all kind of supposed to act like it doesn't matter.

That's what I'm saying is historically extremely bizarre. As far as I know.

Comment by Valentine on Is being sexy for your homies? · 2023-12-13T20:59:36.464Z · LW · GW

What did I say that gave you that impression?

[Edited. Originally I asked "Where did I say you're not a woman?" But I'm not looking for litigation here. I in fact don't mean to cause pain here. So I'm wondering what you tripped over in what I wrote.]

Comment by Valentine on AGI Alignment is Absurd · 2023-11-29T21:08:16.608Z · LW · GW

I'm not sure why all the people who think harder than I do about the field aren't testing their "how to get alignment" theories on humans first…

Some of us are!

I mean, I don't know you, so I don't know if I've thought harder about the field than you have.

But FWIW, there's a lot of us chewing on exactly this, and running experiments of various sizes, and we have some tentative conclusions.

It just tends to drift away from LW in social flavor. A lot of this stuff you'll find in places LW-type folk tend to label "post-rationalist".

Comment by Valentine on Social Dark Matter · 2023-11-17T19:37:09.580Z · LW · GW

While your point is technically true, it's not relevant here. Bezzi's point stands even if we just talk about trans folk who most people can readily tell are trans.

Comment by Valentine on Saying the quiet part out loud: trading off x-risk for personal immortality · 2023-11-13T23:32:22.185Z · LW · GW

…I think most decent people would be willing to sacrifice their own life to prevent their civilization from going extinct, and I think it would be a very honorable thing to do.

While I agree, I want to take the opportunity to poke at something I often see in models like this one.

I think if you ask most people about this choice, they'd answer like you predict.

I think if you gave people a choice of buttons to push, one of them this self-sacrifice button and the other… uh… whatever the alternative is, there'd be some more hesitance, but maybe not a ton.

But I suspect most people do not in fact understand what it means to die, or what they're agreeing to when they agree to sacrifice themselves for something.

I think this is precisely what changes when someone gets a terminal diagnosis. They come to understand in a lived way what it means that they, right there, inside that body and seeing from behind those eyes, are going to experience death. Their mortality isn't an abstraction for them anymore. They stop thinking "I'm going to die someday" like they're talking about a video game character and instead get that it means something way, way deeper.

If you adjust the situation so that the person believes the argument that they need to die for the sake of their civilization, and then you hand them a gun with which to shoot themselves…

…I think you'd find the capacity to be "honorable" here dropping dramatically.

But you wouldn't find a drop in people saying they would shoot themselves in this hypothetical scenario! Because for the person thinking through the thought experiment, the thought experiment doesn't embed the thought-experimenter.

I don't think this bears on the discussion of immortality vs. AI risk. The actions here are abstract enough to be more like the button-pushing case.

I just keep seeing this question of embedded agency getting skipped over and I think it's deeply important.

Comment by Valentine on The Gods of Straight Lines · 2023-10-16T13:45:28.175Z · LW · GW

…a world where individual actions don't matter that much should be a predictable world. And ours very much isn't.

Can you say more? My first reaction is "Huh, I didn't think of that, that's interesting." My second thought is "Wait, what about turbulence and the weather?"

RE the latter: the action of individual air molecules doesn't really matter that much, but the net effect is still very hard to predict with much precision. We can say some things about the overall general net effect, but we miss a lot of important details.

(I'm thinking of how my family and I had a plane flight scheduled to Florida last year, and surprise! there was a hurricane that might or might not be about to hit Orlando at the time. The predictions weren't clear even a day before the event. It wasn't clear whether we'd be able to go at all, or when.)

Am I missing your meaning?

Comment by Valentine on What is ontology? · 2023-08-02T03:35:29.190Z · LW · GW

I also agree. I was going to write a similar answer. I'll just add my nuance as a comment to Zach's answer.

I said a bunch about ontologies in my post on fake frameworks. There I give examples and I define reductionism in terms of comparing ontologies. The upshot is what I read Zach emphasizing here: an ontology is a collection of things you consider "real" together with some rules for how to combine them into a coherent thingie (a map, though it often won't feel on the inside like a map).

Maybe the purest example type is an axiomatic system. The undefined terms are ontological primitives, and the axioms are the rules for combining them. We usually combine an axiomatic system with a model to create a sense of being in a space. The classic example of this sort is Euclidean geometry.
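To make that concrete, here's a tiny sketch of the kind of thing I mean. It's just an illustrative fragment I'm making up on the spot (a scrap of incidence geometry, written in Lean, though the language doesn't matter), not anything from Zach's answer: the undefined terms are the primitives, and the axioms are the only rules you get for relating them.

```lean
-- A toy axiomatic system: a scrap of incidence geometry.
-- The undefined terms (Point, Line, LiesOn) are the ontological primitives;
-- the axioms below are the rules for combining them.
axiom Point : Type
axiom Line : Type
axiom LiesOn : Point → Line → Prop

-- Any two distinct points determine exactly one line.
axiom line_through :
  ∀ p q : Point, p ≠ q → ∃! ℓ : Line, LiesOn p ℓ ∧ LiesOn q ℓ

-- Every line passes through at least two distinct points.
axiom two_points :
  ∀ ℓ : Line, ∃ p q : Point, p ≠ q ∧ LiesOn p ℓ ∧ LiesOn q ℓ

-- A model (the Euclidean plane, or even a tiny finite geometry) is whatever
-- interpretation of Point, Line, and LiesOn makes these axioms true.
```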

But in practice most folk use much more fuzzy and informal ontologies, and often switch between seemingly incompatible ones as needed. Your paycheck, the government, cancer, and a sandwich are all "real" in lots of folks' worldviews, but those kinds of "real" don't always clearly relate to one another, because how they relate doesn't usually matter.

I think ontologies are closely related to frames. I wonder if frames are just a special kind of ontology, or maybe the term we give for a particular use of ontologies. Mentioning this in case frames feel more intuitive than ontologies do.

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-08-01T00:58:31.854Z · LW · GW

I really don’t know what you mean by any of this (especially the “anymore” part, but really all of it).

(I don’t think that it’s necessary to “orient to my tone”? In any case, generally speaking, if you assume that I mean just what I say, you won’t go far wrong.)

This is actually really clarifying. Thank you.

I now suspect there's a dimension of communication that's hyper-salient for me but invisible to you.

I won't try to convey that maybe invisible-to-you dimension here. I don't think that'd be helpful.

Instead I'll try to assume you have no idea what you're "saying" on that frequency. Basically that you probably don't mean things the way they implicitly land for me, and that you almost certainly don't consciously hold the tone I read in what you're saying.

That's as close as I can get to assuming that you "mean just what [you] say". Hopefully that'll smooth things out between us!

 

(ETA: I certainly don’t think that your question was absurd. If I did, I’d’ve said so, and not spent effort answering it!)

Okay, cool. Thanks for saying this!

 

> In case it’s not clear, the reason I asked you about the “falling in love” thing was to better understand what kind of thing “cake” as you mean it might even look like.

Indeed, that is also why I asked my counter-question; both to explain, and to understand.

I have to admit, I find this very confusing. I'm trying to understand what you mean by "cake". Maybe you were hoping to go "Here's 'cake' for falling in love. Now you try on this other topic, so I can thumbs-up or thumbs-down that you've understood what I mean by 'cake'." Is that it?

The thing is, I think I could provide a similar analysis, but I don't think it'd help me understand at all what you mean by "cake". That makes me pretty hesitant to spend the time and cognitive effort on producing that kind of matching analysis.

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-08-01T00:34:53.825Z · LW · GW

Okay! Great, thank you.

This confirms I'm very thoroughly confused about what "cake" means to you here!

I thought you were looking for tangible proof of benefits, or something you could concretely try, or something like that. But now I know I have no idea what you're looking for!

I'll give examples to highlight my confusion. In your "cake" for falling in love, you say:

I could say that falling in love is worthwhile for its own sake. Of course, there isn’t any way I could convince you of that, but that’s not unusual; the same applies to the experience of eating ice cream, etc. This boils down to “try it; you’ll like it!”.

I seem to recall saying very similar things about kenshō. That there's something of deep importance, that this "insight" amounts to acknowledging it, that this is something you'd be super grateful for if it were to happen for you, and that there's not really much of a way for me to convince you of any of this. It's just a "Take a look and see for yourself" kind of thing.

That doesn't seem to have satisfied you. You still asked for "cake".

In particular, what you say here sounds to me like what I'd guess “assurances of having cake” would be.

In your second paragraph of "cake" you say:

falling in love is just the first part of a process which (summed across all instances over a person’s lifetime) is likely to account for a significant chunk of the happiness, life satisfaction, joy, pleasure, etc., that one experiences in one’s lifetime.

This lands for me as what I'd guess “allusions to kinds of cake” would be.

I could say something very similar about kenshō. I suspect I did in that monster thread five years ago. That if & when this flash of clarity comes online, there'll be a sense of something like "Oh holy fuck, I've been just living on autopilot! I haven't been alive here! I've been ignoring what actually goddamned matters just to tell myself some stories and live in fantasy! Whoa!!!" And it's very much just a beginning.

(There's a quote that goes something like "You have two lives. Your second life begins when you realize you have only one life." Kenshō is about beginning your second life.)

But again, this doesn't seem to have satisfied your need for "cake".

Your third paragraph includes:

those of my friends who’ve fallen in love, and entered into (and stayed in) long-term relationships on the basis of that love, seem to be much happier than they were previously, and in particular, seem to make one another happy, in ways observable in ordinary, everyday interactions. (Of course one can have experiences and observe things that point to the opposite conclusion—but, again, this is too well-trod a topic to productively re-tread here.)

So, on this point regarding kenshō I've maybe been too vague. I've been very attuned to folks' ability to point at evidence of the opposite conclusion.

But if I can make a similar caveat as you've made here, I think I can point pretty clearly at this.

The people I know who are on the other side of this are alive. Engaged. Interesting. They're themselves much more deeply. More interested in really playing the game of life. Less willing to tolerate bullshit, especially in their for-fun social interactions.

They also almost all have war stories involving the collapse after awakening. A lot of lies people live can't work once they admit to themselves that they're lies. And it's hard not to admit stuff like that in the midst of an explosion of Light, at least in my experience. Breakups, financial collapse, and physical illness are not uncommon. It's usually temporary, and most of them say that they totally wouldn't have it any other way — at least once they're through to the other side. Some do get stuck. And there's a potential survivorship bias here in my account.

It'd be weird to call that a benefit. It's more like an attribute I notice over and over again. I totally had that. Arguably I still do: it feels like a deep existential allergy to all lies and bullshit turned on deep in my core, and now I'm on a lifelong journey for total and ever-perfecting alignment with… well, truth.

But as things go in terms of "What can we see this actually doing in the world?", those are a few of the attributes. The Dark Night stuff can be awful to go through, but it's like the vomiting part of food poisoning. It's not like the point is to be enjoyable, but you still want to have done it.

Now, if my saying all that still doesn't count for you as "cake"… then I have no idea how to proceed. You're going to have to define what you're looking for differently if you want me to have any chance of answering you on this point.

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-31T23:54:49.665Z · LW · GW

Can you give some examples of things that those places have held?

Sure. I'll give just one for now. These take a while to name in writing, at least the way they occur to me.

Here's a recurring one: I'll be talking with someone in a coaching session, and I'll pick up on how they're "adding extra".

This is something that's easy to point out in live conversation or over video but I find tricky in writing. It's a tone thing. If I look at the cup next to me and note "This is a cup", I'm simply noting. There's nothing extra. But I can add extra with an emotional tone of "I keep this cup next to me to hydrate myself, because I take care of myself, which is something GOOD I do." I think this is easy to hear with some practice even if the words are identical ("This is a cup").

Usually when someone is adding extra, they have some unrecognized pain. Most of the time this pain roots down in a universal thing — something I've come to call "the pain of duality". It's a basic split from something core. It's roughly the same in everyone as best I can tell, but each person sort of holds and experiences it in their own way.

So when I see someone adding extra, and they've asked for my guidance, I'll sometimes guide them to awareness of this core pain:

  • "Here it seems to me that you're saying XYZ [like "I need to finish my thesis"], but you're also adding something extra. It occurs to me as a tone of ABC [usually like there's something wrong with them, or that their value is based on something external, or that something is existentially wrong]. There's nothing wrong with that. I just imagine it's uncomfortable. Do you see what I'm pointing at?"
    • We do some calibration, and I adjust based on their feedback (like if I was missing them in some key way). Then if I still see this core pain in them and they agree I'm seeing them clearly, on to the next step:
  • "Okay. Now, for me, in this spot I tend to feel PQR [something like "afraid something will go wrong if I don't take care of this task"]. But that's how the energy feels when it hits my thinking. Underneath that is something wordless. More like a creeping feeling, like reality itself is unsafe or unreliable."
    • The point here is to give an example of what it means to feel the energy behind something in consciousness. If my first example doesn't click for them, I'll give a few others.

Usually they either notice the core pain or adjust my perception of them. More often the former. It tends to result in a direct kind of seeing, the same way you can directly "see" the feeling of your tongue in your mouth: it's somehow more unmediated than thoughts about the thing are.

When I wrote Kenshō I might have called this "Looking at your soul pain". It's about seeing more directly instead of just thinking about mental models of the thing.

I just think that reifying Looking does something odd to this process. It's just noticing what you experience when you look where someone is pointing. Even if that someone is yourself.

Hopefully that's somewhat clear. With more time & effort I might have come up with a simpler example. ("Sorry this letter is so long, I didn't have time to write a shorter one.")

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-27T02:09:27.579Z · LW · GW

Now, all of that having been said, here’s a counter-question, before we get to kenshō: can you provide an analogous sort of answer, for “having a paranoid delusion” in place of “falling in love”? Does it exist? Does it have any value?

How on Earth is this relevant? I'm really not following you here. What do you hope to gain by having me try to grapple with this weird thing?

Maybe you're trying to… I don't know, get even with me for asking something you find absurd? Trying to defeat me in some kind of duel where you think I issued the first challenge? I really don't know.

In case it's not clear, the reason I asked you about the "falling in love" thing was to better understand what kind of thing "cake" as you mean it might even look like. It really does land as a type error to me. But if you could say "Oh, for falling in love, thus-and-such would be 'cake'", then I could go "Oh! Okay. Cool! So I think the analogy for kenshō might be XYZ. Does that work for you?" Then we could communicate.

The feel I get from you here — and I could easily be misreading you — is like intellectual one-upmanship. Mental fencing.

If that's the case, please understand that I'm just not available for that. I will not engage with you at that level anymore.

If I've misread you, then please clarify what you're doing. I don't know how to orient to your tone here. If you meant it collaboratively, then please help me see how. I'd very much like to.

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-27T01:47:10.585Z · LW · GW

What part of this do you consider to be having "given me cake"?

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-26T23:26:12.551Z · LW · GW

I don't think so. Not in terms that would satisfy you, best as I can tell.

Although… I wonder if we can translate a bit. If you were trying to convey this whole "falling in love" thing to me, while I was suspicious about whether it exists, or whether it has any value if it does, and I were pressing you for "cake" about "falling in love", what would you offer?

I mean that sincerely. Those two feel like similar type errors to me. If you can offer a few examples of "cake" for falling in love then I might be able to figure out how to offer you "cake" for kenshō.

(I'm not too particular about "falling in love" per se. It's just the most fitting example that popped into mind.)

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-26T22:18:35.182Z · LW · GW

do you have an example of: "getting better at attending to what matters to you, instead of attending to what you think matters to you, when there's a difference".

Would a central example be someone focusing on trying to become popular, not realising that it is only instrumental towards feeling positive about themselves?

I wouldn't say that's a central example. But I think it's a good one. Simple, clear, easy to access.

A much bigger and more central example to me is death. Most people don't seem to have clearly seen their own mortality. They're thinking about their own deaths like a Cartesian agent (i.e., not embedded — like the Alexei robot in this post). They know death happens to everyone eventually, and that it'll happen to "them", but it's almost like they're talking about a video game character from the outside. Like "Yep, at some point my little Mario figure there will stop when we turn off the gaming console."

Something really big shifts when folk get a terminal diagnosis. It's not just "Oh, I thought I had years, but now I have months/weeks/etc." There's a grappling-with that's hard to convey to non-terminal folk. "The end" becomes subjectively real. Like "Oh shit, I'm going to experience this from behind my eyes, inside this skin." It can feel very lonely and alienating, both because of the existential horror of the situation and because it becomes super clear how others are running strange scripts that just don't make sense.

I include most immortalists and transhumanists in this. I'm not talking just about "Are you taking practical actions to deal with your mortality?" I mean that there's a deep existential thing to grapple with. Even true immortals would have to orient to it: if you never ever die, then that means you're facing eternity. And that's something you'd face from behind your eyes.

(This is something even the literalist Christian mythos doesn't properly address AFAICT: Great, you go to Heaven. Then what? Do you feel time passing? What's that like? Goes on literally forever? Or does it… fade away at some point? Or do you leave time when you enter Heaven — in which case what's the subjective experience of that? This isn't just abstract philosophy. It's a damn meaningful question!)

It turns out that most things don't matter in the face of this. The whole thing with deathbed regrets amounts to "I'm sorry I didn't see this sooner and take it seriously while I still had time."

But it's not something you're likely to really get just by listening to elders and taking their advice. "Work less, connect with family more." That's good, and you'll be grateful for that in the end! But you won't understand why until you've really truly seen your own death clearly.

To the degree you do, it just won't be tempting anymore to work instead of connect (for instance).

But that's a really huge example. Maybe one of the biggest. Maybe the biggest.

I do think the "Oh, I don't actually care about being popular" thing you suggest is pretty good. It has the right quality of "Now that I've seen it, I can't un-see it, and I've been seriously wasting my time on this."

I just wanted to flesh this out a bit, to hint at the scope.

Hopefully that made some sense.

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-26T21:58:30.041Z · LW · GW

Why don't you care for the "looking" framing anymore?

I'm honestly not sure. Something sits wrong about it. Some initial stabs at the intuition are:

  • It feels too binary. Like you're either Looking or you're not. That doesn't seem right.
  • I think it might be several skills/senses folded into one. Or maybe something like a sense generator. (Lots of folk have trouble with conscious interoception, and that can be changed over time via a move I might have called "Looking" before.)
  • In practice, the relevant thing to develop seems to be something like capacity to receive the information rather than forcing oneself to seek it. I think intentionally seeing something beyond the usual ontology ("Looking") emerges naturally from being both willing and capable, once you know what direction to point your attention in.

But the overall feel is something like… the whole thing makes it sound pompous. Like "Oho! Here's this special magical skill!"

I mean, it is. It's quite potent. I'll sometimes mythically refer to this as "Mage Sight", or break it down into subtypes of Sight (like Spirit Sight or Prime Sight).

But I think something screwy happens when there's a social tone along the lines of "I see something you don't! 'Cause I'm bothering to look at all!"

Even when there's some truth to that, it's unkind. It doesn't honor the reason why the person hasn't pointed their attention there yet.

Also, in many cases I can just say "You want to see? Okay. Look here, then here, then here." And something clicks for my listener. Sometimes.

So, no reason to make the act of looking there something special. It's just perception.

Comment by Valentine on Jonathan Claybrough's Shortform · 2023-07-26T15:27:07.145Z · LW · GW

I don't know how to answer the general query. But I can say something maybe helpful about that Kenshō post and "Looking":

The insight was too new. I wrote the post just 4 months after the insight. I think I could answer questions like this way, way more clearly today.

(…although my experience with Said in particular has always been very challenging. I don't know that I could help him any better today than I could in 2018. Maybe? He seems to use a mind type that I've never found a bridge for.)

The issue is that the skill needed to convey an insight or skill is often significantly different from the insight/skill itself. And in many cases, getting quite good at something can make you worse at teaching it because you can come to forget what it's like to be a beginner.

IME it's necessary to reconstruct (literally "re-member") various beginners' minds after gaining expertise. Otherwise you can't meet beginners where they are. This is the essence of PCK.

I hadn't done basically any of that when I wrote Kenshō.

So I don't think the issue is that meditation or whatever borked my ability to talk clearly. It's more that I had a massive shift and then tried to talk about it right away. This challenge is a pretty general one IMO.

I suspect the issue is that the kind of massive insight that can happen from meditation, psychedelics, & related experiences can result in pretty huge shifts, making this "I'm talking gibberish for a while" thing way more common and enduring than with (say) mathematical insights.

A simple example with Kenshō:

One benefit of learning to Look (though I don't care for this framing anymore) is getting better at attending to what matters to you, instead of attending to what you think matters to you, when there's a difference.

Naturally many mind designs will think there's no difference, or that if there is one it's pretty small. But it's often — I daresay usually — massively larger than we tend to notice. The shift can feel a bit like suddenly remembering that you have a beloved child, and realizing you've been neglecting them and even forgetting they exist because you were distracted. It's not pleasant to realize! It might make you "less functional" according to your earlier standards. But you absolutely would not want to reverse the revelation!

One challenge I was grappling with in 2018 was that many minds will insist that (a) there's no reason to trust that such an insight is valid as opposed to being a delusion, and (b) even if it were valid, maybe it's better not to have the revelation in the first place, since it borks metrics you can tell are useful.

In 2018 I didn't know how to orient to that while holding my clarity. All I could do was flail my arms and say "Epistemic puzzle!" Very, very poor PCK.

I wouldn't say my PCK is pristine now. But I could have a much more clear and kind conversation about it today than I could five years ago. It just took a while to develop.

Comment by Valentine on Rationality !== Winning · 2023-07-25T14:49:16.621Z · LW · GW

By the way, I think you should consider rewriting the side note re autistic nerd. I am still a bit confused reading that.

FWIW, I found the comment crystal clear.

CFAR's very first workshops had a section on fashion. LukeProg gave a presentation on why fashion was worth caring about, and then folk were taken to go shopping for upgrades to their wardrobe. Part of the point was to create a visible & tangible upgrade in "awesomeness".

At some point — maybe in those first workshops, I don't quite recall — there was a lot of focus on practicing rejection therapy. Folk were taken out to a place with strangers and given the task of getting rejected for something. This later morphed into Comfort Zone Expansion (CoZE) and, finally, into Comfort Zone Exploration. The point here was to help folk cultivate courage.

By the June 2012 workshop I'd introduced Againstness, which amounted to my martial-arts-derived reinvention of applied polyvagal theory. Part of my intent at the time was to help people get more into their bodies and to notice that yes, your physiological responses actually very much do matter for your thinking.

Each of these interventions, and many, many others, was aimed specifically at helping fill in the autistic blind spots we kept seeing with people in the rationalist social scene. We weren't targeting people with autism per se. It was just clear that autistic traits tended to synergize in the community, and that this led to points of systematic incompetence that mattered for thinking about stuff like AI. Things on par with not noticing how "In theory, theory and practice are the same" is a joke.

CFAR was responsible for quite a lot of people moving to the Bay Area. And by around 2016 it was perfectly normal for folk to show up at a CFAR workshop not having read the Sequences. HPMOR was more common — and at the time HPMOR encouraged people toward CFAR more than the Sequences IIRC.

So I think the "smart person self-help" tone ended up defining a lot of rationalist culture at least for Berkeley/SF/etc.

…which in turn I think kind of gave the impression that rationality is smart person self-help.

I think we did meaningfully help a lot of people this way. I got a lot of private feedback on Againstness, for instance, from participants months later saying that it had changed their lives (turning around depression, resolving burnout, etc.). Rejection therapy was a game-changer for some folk. I think these things were mostly net good.

But I'm with Raemon on this: For good rationality, it's super important to move past that paradigm to something deeper. Living a better life is great. But lots of stuff can do that. Not as many places have the vision of rationality.

Comment by Valentine on Cryonics and Regret · 2023-07-25T05:29:48.751Z · LW · GW

My heart aches to read this. I wish you freedom from self-blame. But I do understand.

I grew up in cryonics. My parents signed me up when I was a child. The ache of "If only I'd brought this up sooner, better, the right way" stayed with me a very long time. For every friend and relative who shrugged these things off. Especially for the handful already in graves now. So, so young.

I used to frequently imagine how, some decades or centuries from now, I'd be standing in a colony on the Moon, looking up at our ancient cradle, the Earth. Standing there with the friends & family who'd made it. And younger folk too who love us and whom we love, hearing our stories of Ancient Earth when the chances were so slim.

And then we would sing the names. Those who could only continue as whispers on our lips. The beloved who did not make it.

I often find myself reciting Eliezer's ode to Terry Pratchett:

Even if the stars should die in heaven,
Our sins can never be undone.
No single death will be forgiven
When fades at last the last lit sun.
Then in the cold and silent black
As light and matter end,
We'll have ourselves a last look back
And toast an absent friend.

I am deeply sorry for your loss.

All the more aching for what might have been, had things been different.

We are all doing the best we can.

Even you, in your failures.

And me, in mine.

And it is heartbreaking, the cost of our learning. That sometimes it comes at the eternal loss of a beloved.

But all we can ever do is our best.

Setting aside our anger and self-blame,

out of deep heartfelt caring

we shall build a kinder world.

Or we shall die trying.

Comment by Valentine on Rationality !== Winning · 2023-07-24T03:50:15.574Z · LW · GW

I find this refreshing. It rings true. It feels like the kind of North Star we were groping toward in early CFAR but never landed on.

This in particular feels clarifying:

Rationality is the study (and applied skill) of finding cognitive algorithms that form better beliefs and make better decisions. Sometimes this is the appropriate tool for the job, and sometimes it's not. 

I find myself breathing with relief reading this. It has the flavor of a definition that could use some iterations. But as it stands, it strikes me as (a) honoring the spirit of the discipline while (b) not overblowing what it can do or is for. Like part of the clarity is in what the Art isn't for.

Thank you for writing this.