Comment by vedrfolnir on Heads I Win, Tails?—Never Heard of Her; Or, Selective Reporting and the Tragedy of the Green Rationalists · 2019-09-29T03:19:00.406Z · LW · GW

(This post is important enough that I'm breaking my commitment not to post until a certain time in the future.)

The model here strikes me as the correct *sort* of model, but deserving of substantial complication. Two complications in particular seem clear and relevant to me.

First, will the smart sincere idealists be simply *misled?* Given that this hypothetical imperfect rationalist space exists within Green territory, deviations from the Overton ratio will be punished by Greens *both inside and outside* the rationalist space; as such, it could (entirely unintentionally, at least at first) serve to *reinforce* Green partisan hegemony, especially if there's a large imbalance between the abilities of Greendom and Bluedom to offer *patronage*.

We already know from history that regimes may become so... self-serving and detached from reality, as one could put it... that they'll feel the need to actively select against smart, sincere idealists, or any permutation thereof. Loyalty to anything but the regime may be seen as an inefficiency and optimized away.

As a result, it could be useful for Green partisans to keep such spaces around, albeit low-prestige and generally reviled. Partisans also have an interest in identifying the sincere and the idealistic, but for precisely the opposite reasons. (Cf. the Hundred Flowers Campaign.)

Second, the neat division of truths into Green, Blue, and Gray seems unconvincing to me. Consider the Greens and Blues as having reality maps: certain things directly benefit their reality maps, certain things directly harm those maps, and certain things are neutral. (To pick on Zoroastrianism: the reality of Ahura Mazda or Angra Mainyu would be in the first category, a genealogical account of Zoroastrian doctrine in the Nietzschean sense would be in the second, and the contents of a randomly selected academic journal in the field of (say) accounting would, I assume, be almost entirely in the third.)

If we multiply the three categories of the Greens by the three categories of the Blues, we get nine options, not three. If we make certain assumptions about Green-Blue conflict, we can reduce this somewhat, and posit that anything that is beneficial to one side but seemingly neutral to the other in fact benefits the first at the expense of the second.

But this leaves five possibilities, not three! In addition to [+Green -Blue], [-Green +Blue], and [0Green 0Blue], we have [+Green +Blue] and [-Green -Blue]. Would Blues and Greens not fear displacement by something outside their union?
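The arithmetic here can be made explicit with a short sketch. The collapse rule below for the [-Green 0Blue] and [0Green -Blue] cases is my extrapolation (harm to one side counts as benefit to the other), since the text only spells out the beneficial-but-neutral case:

```python
from itertools import product

# Payoff of a given truth to each side's reality map:
# +1 beneficial, 0 neutral, -1 harmful.
SIGNS = (+1, 0, -1)

def collapse(green, blue):
    """Zero-sum assumption: a truth that matters to one side but looks
    neutral to the other is treated as benefiting one side at the
    other's expense. The harmful-but-neutral branch is my extrapolation."""
    if green != 0 and blue == 0:
        return (green, -green)   # [+G 0B] -> [+G -B], [-G 0B] -> [-G +B]
    if green == 0 and blue != 0:
        return (-blue, blue)     # [0G +B] -> [-G +B], [0G -B] -> [+G -B]
    return (green, blue)

combos = set(product(SIGNS, repeat=2))           # all 9 raw combinations
survivors = {collapse(g, b) for g, b in combos}  # 5 distinct classes remain

print(len(combos), len(survivors))  # 9 5
print(sorted(survivors))
```

If the zero-sum rule is applied only to the beneficial-but-neutral cases, seven classes survive instead of five, so the exact count depends on how aggressively Green-Blue conflict is assumed.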

Comment by vedrfolnir on Attacking enlightenment · 2018-09-30T19:31:29.456Z · LW · GW

What's the value proposition of enlightenment?

If I have a choice between taking up organized religion and going to church or taking up spirituality and following empirical instructions to scale the mountain of enlightenment, why should I do the latter instead of the former?

What's the common theme in all these books? I don't see it. Impro contains some useful exercises, although I think most of the value in the book would come from people getting together IRL and actually doing them, and I haven't heard of anyone doing this. (I tried to get someone whose social network is much bigger than mine to make this happen, but then she moved to the Bay.) But it's about developing acting skills, not Buddhism...

Comment by vedrfolnir on Open Thread September 2018 · 2018-09-12T06:37:48.144Z · LW · GW

The Wikipedia article on it does.

Comment by vedrfolnir on A Dialogue on Rationalist Activism · 2018-09-12T06:25:39.213Z · LW · GW

These are good questions.

0. Are "we" the sort of thing that can have goals? It looks to me like there are a lot of goals going around, and LW isn't terribly likely to agree on One True Set of Goals, whether ultimate or proximate.

I think one of the neglected possible roles for LW is as a beacon -- a (relatively) highly visible institution that draws in people like-minded enough that semirandom interactions are more likely to be productive than semirandom interactions in the 'hub world', and allows them to find people sufficiently like-minded that they can then go off and do their own thing, while maintaining a link to LW itself, if only to search it for potential new members of their own thing.

My impression of internet communities in general is that they tend to be like this, and I don't see any reason to expect LW to be different. Take Newgrounds, another site formed explicitly around productive endeavors (which has the desirable (for my purposes here) property that I spent my middle school years on it): it spawned all sorts of informal friend groups and formal satellite forums, each with its own sort of productive endeavor it was interested in. There was an entire ecosystem of satellite forums (and AIM/MSN group chats, which sometimes spawned satellite forums), from prolific NG forum posters realizing they had enough clout to start their own forum so why not, to forums for people interested in operating within the mainstream tradition of American animation, to a vast proliferation of forums for 'spammers' who were interested in playing with NG itself as a medium, to forums for people who were interested in making one specific form of movie -- wacky music videos, video game sprite cartoons, whatever. And any given user could be in multiple of these groups, depending on their interests -- I was active on at least one forum in each of the categories I've listed.

(As an aside: I say 'spammers' because that's what they were called, but later on I developed enough interest in the art world to realize that there's really no difference between what we did and what they're doing. (The 'art game' people would do well to recognize this -- they're just trolls, but trolling is a art, so what the hell.) There were also 'anti-spam' forums, but I brought some of them around.)

1. As for classical LW goals, the AI problem does seem to have benefited quite a bit from ethos arguments. I'm not sure if "our goals" is even the type of noun phrase that *can* have semantic content, but cultivating general quality seems like a fairly broad goal. A movement that wants to gain appeal in the ways I've outlined will want its members to be visibly successful at instrumental rationality, and be fine upstanding citizens and so on.

2. I don't think I'm smarter than Ben Franklin, so my advice for now would be to just do what he did. At a higher level: study successful people with well-known biographies and see if there's anything that can be abstracted out. I notice (because Athrelon pointed it out a while ago) that Ben Franklin, C.S. Lewis, Tolkien, Thiel, and Musk have one thing in common: the benefit of a secret society or something like it -- the Junto, the Inklings, or the PayPal Mafia.

Comment by vedrfolnir on Zetetic explanation · 2018-09-12T04:26:09.629Z · LW · GW

I'm not a biologist, but I think it would be pretty difficult to tell whether fruits are intended to encourage animals to eat them or to protect the inner seed. But the energy in an avocado is primarily stored as fats, and it's generally thought that they were eaten by now-extinct Central American megafauna. (And it's common to stick avocado seeds with toothpicks to get them to sprout...)

There's also the chili pepper, but I don't know if anyone's studied digestion of pepper seeds in birds (which aren't sensitive to capsaicin) vs. mammals (which are). It may be that chili peppers evolved to deter mammalian but not avian consumption because the mammalian digestive tract is more likely to digest the seeds, rather than (as the common explanation has it) because birds disperse the seeds more widely.

Comment by vedrfolnir on Open Thread September 2018 · 2018-09-12T04:01:41.132Z · LW · GW

It started as the leftist alternative to Conservapedia.

Comment by vedrfolnir on A Dialogue on Rationalist Activism · 2018-09-12T03:43:15.406Z · LW · GW
> How do we (second) convince others, and (first) establish for ourselves, that we’re different? What can we offer to prospective joiners that cannot be offered by other movements (i.e., what can we offer that constitutes an unfalsifiable signal that we are the “true path” to the “good ending”, so to speak)?

I came to this article having just read one about Donald Trump's response to the 9/11 attacks, which mentioned that Trump saw them from the window of his apartment. The WTC attacks happened at around 9 AM, the start of the standard workday; but he had decided to stay in his apartment later than usual to catch a TV interview with Jack Welch, the former CEO of General Electric.

I thought that was interesting. Welch is well-known in the business world, and at least was once well-regarded. I have one of his books, although I haven't read it yet.

Now, the problem of how to convince people to pay attention to a memeplex is a problem Less Wrong has. Jack Welch, not so much. I saw his book at a thrift store, had some idea of who he was, and figured it'd be worthwhile to buy it. Donald Trump heard that he'd be on TV, knew well (I assume) who he was, and figured it'd be worthwhile to watch the interview. We aren't on TV.

Why not?

Maybe it's because we aren't Jack Welch.

We've all read our Aristotle, right? Our marketers come up with plenty of logos and pathos. Ethos, not so much. But it worked for Jack Welch...

There's an important difference between the alien's initial sales pitch and the problem of recruiting people to Less Wrong. The alien is a representative of an advanced civilization, offering a manual for uplifting the human race -- so there's a solution to widely advertising it that will only work if the manual does: simply distribute the manual to a few hundred people around the world who are highly motivated to do well in life. Once they've learned it, applied its contents, and become wildly successful CEOs of General Electric or whatever, some of them will (almost certainly) make it known that their success is due to their mastery of the contents of a book...

But the book doesn't actually exist, we aren't hot-shit enough to recruit through ethos (why not? could it be that we're failing? could it be that we're failing so badly that our startups try to write their own payroll software?), and our sales pitches are pretty bad. I noticed so many of our quality people leaving, and so much lack of interest in *actually winning*, that I stopped paying attention myself -- I only saw this post because it was linked on Twitter.

Before asking what LW can offer to prospective joiners that can't be offered by other movements, ask if it *has* anything like that. I don't think it does, and I don't think it's in a position to get there.

Comment by vedrfolnir on How many philosophers accept the orthogonality thesis ? Evidence from the PhilPapers survey · 2018-06-21T19:54:13.185Z · LW · GW

I don't think the orthogonality thesis can be defined as ~[moral internalism & moral realism] -- that is, I think there can be and are philosophers who reject moral internalism, moral realism, *and* the orthogonality thesis, making 66% a high estimate.

Nick Land doesn't strike me as a moral internalist-and-realist (although he has a Twitter and I bet your post will make its way to him somehow), but he doesn't accept the orthogonality thesis:

> Even the orthogonalists admit that there are values immanent to advanced intelligence, most importantly, those described by Steve Omohundro as ‘basic AI drives’ — now terminologically fixed as ‘Omohundro drives’. These are sub-goals, instrumentally required by (almost) any terminal goals. They include such general presuppositions for practical achievement as self-preservation, efficiency, resource acquisition, and creativity. At the most simple, and in the grain of the existing debate, the anti-orthogonalist position is therefore that Omohundro drives exhaust the domain of real purposes. Nature has never generated a terminal value except through hypertrophy of an instrumental value.

This is a form of internalism-and-realism, but it's not about morality -- so it wouldn't be inconsistent to reject orthogonality and 'heterogonality'.

I recall someone in the Xenosystems orbit raising the point that humans, continuously since long before our emergence as a distinct species, existed under the maximal possible amount of selection pressure to reproduce, but 1) get weird and 2) frequently don't reproduce. There are counterarguments that can be made here, of course (AIs can be designed with much more rigor than evolution allows, say), but it's another possible line of objection to orthogonality that doesn't involve moral realism.

Comment by vedrfolnir on Monopoly: A Manifesto and Fact Post · 2018-06-20T18:42:05.067Z · LW · GW
> While you can't just try to transfer the effect of Coca-Cola's branding to your new product, I think you can, in fact, try to compete on branding.

La Croix did this. It's just flavored seltzer, the same as the 59c store-brand bottles, but it became wildly successful. What's more, it had been around for a while before becoming successful.

What did they do?

> The first MAI study identified a highly-attractive target segment of prospective sparkling water users not at all interested in the Perrier brand and its “snobbish / expensive / for special occasions” positioning.
>
> Among package designs evaluated, MAI research led to recommendation of the design considered least appealing by the Heileman Marketing Group. The MAI-recommended design:
>
> a. Promoted an “all occasion” image
> b. Offered strong LaCroix name presence
> c. Used elements that were most consistent with water imagery to the newly-identified target segment
>
> Another unexpected research result was the surprising consumer enthusiasm for sparkling water in cans, a packaging idea that had not yet been introduced in this category. LaCroix’s subsequent introduction of sparkling water in cans allowed the brand to capture the lion’s share of new category growth from this innovation.

I don't think that's the whole story. La Croix was originally positioned as an alternative to Perrier, whereas now (maybe as a result of the packaging in cans) it's positioned as an alternative to soda. And the copy on the box is pretty distinctive -- "calorie-free", "innocent" and so on. (It isn't quite grammatical, but that must be intentional. Trying to affect a European accent?)

There's a plausible narrative where La Croix succeeded because no one else had tried packaging seltzer in cans, but there's also a plausible narrative where it succeeded mostly because of its unusual branding.

If pressed, I'd favor the first -- Poland Spring also has a line of expensive brand-name flavored seltzers, but the bottles are a little unwieldy, not the sort of thing you'd pack with a work lunch. But I'm not in the target audience for its branding, so.

Comment by vedrfolnir on Monopoly: A Manifesto and Fact Post · 2018-06-20T18:21:29.576Z · LW · GW
> Once a company reaches a monopoly position, its incentive structure is to suppress all innovation that does not improve its core business.

Once an actor reaches uncontested dominance, its incentive structure is to suppress all change that does not improve its position.

In my more paranoid moments, I suspect there's something like this going on in general: American power actors want stagnation and fear change, because change can be destabilizing and they're what would be destabilized. This is obviously true in the case of cultural power, but I'm not sure how it would extend beyond that.

Comment by vedrfolnir on Duncan Sabien: "In Defense of Punch Bug" · 2018-05-16T19:02:16.950Z · LW · GW
> You're just not going to convince me that playful cover for hitting people out of the blue is OK.

Yes, that's the operative filter.

I have a pretty different class background from most LW posters (think "banlieue"), and "social ownership of the micro" reads to me like the fable of the princess and the pea. The egalitarianism of the lower classes is that, since not everyone can insist that the single pea be removed from under their twenty mattresses, no one is allowed to -- and instead, you're required to become the sort of person who doesn't even notice it.

Another operative factor is that the appearance of being useful if the shit hits the fan is a desirable trait. No one likes a weakling, and squeamishness at the sight of blood is decidedly uncool. But people also like people who are socially useful; and since you can tell she's a princess because she notices the pea, that pattern is countervailing.

Comment by vedrfolnir on Terrorism, Tylenol, and dangerous information · 2018-05-16T18:14:51.922Z · LW · GW
> For instance coca cola was made with actual coca (the plant cocaine alkaloids are derived from) and sold as a cure for headaches.

Coca tea is still in use in parts of South America. I've been told it isn't really comparable to cocaine. Wikipedia is under the impression that there's about 6x as much cocaine in a cup of coca tea as in a line.

I've never had coca tea, but I can buy that doing cocaine is a little like what snorting 600mg of pure caffeine would be like for someone with no prior exposure to caffeine. (I don't recommend either at all.)

How much cocaine was in the original Coca-Cola recipe? Allegedly, the original recipe had 3 drams coca extract to 2.5 gallons of water, whatever that means.
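For what it's worth, the volumes in that recipe can be pinned down even if the potency can't. A rough sketch, assuming "dram" means the apothecaries' fluid dram and an 8-ounce glass (both assumptions mine; the cocaine per glass would further depend on the extract's strength, which the recipe doesn't give):

```python
# Back-of-envelope for "3 drams coca extract to 2.5 gallons of water".
# Assumed units (not given by the recipe): fluid drams, US gallons,
# and an 8 US fluid ounce serving.
FLUID_DRAM_ML = 3.6967    # 1 apothecaries' fluid dram in mL
GALLON_ML = 3785.41       # 1 US gallon in mL
SERVING_ML = 8 * 29.5735  # 8 US fluid ounces in mL

extract_ml = 3 * FLUID_DRAM_ML                       # total extract, ~11 mL
water_ml = 2.5 * GALLON_ML                           # total water, ~9.5 L
per_serving_ml = extract_ml * SERVING_ML / water_ml  # extract per glass

print(round(extract_ml, 1), round(per_serving_ml, 2))  # 11.1 0.28
```

So roughly a quarter-millilitre of extract per glass, whatever that extract contained.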

Comment by vedrfolnir on Duncan Sabien: "In Defense of Punch Bug" · 2018-05-07T10:16:30.904Z · LW · GW
> I'm truly baffled that people would become very self-conscious of all the small unease of everyday life and then choose to elevate them as major inconveniences. It's a bit like discovering who holds your chains and redoubling in bondage and obedience to this silent master.

Nobles can take offense at peasants, but peasants can't take offense at nobles.

Peasants are expected to take care not to offend nobles, but nobles aren't expected to take care not to offend peasants.

Maybe it's a bit like that.

(More generally, we can imagine a sort of "metaperennialist" framework, whereby there are, for whatever reason, common human behavioral modules that can be activated when certain conditions obtain, even if no one involved is thinking in terms of these modules and in fact they all think they're doing something completely different. (Cf. standard perennialism, whereby there are metaphysical truths underlying all religions, which can be mystically experienced even when no one involved is thinking in terms of these truths and in fact they all think they're being visited by the Holy Spirit or talking to Jibril or whatnot.) One advantage of this framework is that it can easily explain why people would choose to pay such attention to the micro -- and why certain people would make this choice, and certain others would not. Frankly, the people who pay the most attention to the micro tend to remind me of Captain Aguilera.)

Comment by vedrfolnir on Predicting Future Morality · 2018-05-07T09:52:29.343Z · LW · GW

Decreasing general propensity for violence, increasing refinement of social control technologies, increasing class stratification, the replacement of liberal with progressive justifications for institutions (e.g. the state), and internet communication technology (most notably, Google and social media) will result in the emergence of an ethic of nobility and peasantry, unless the current sharp correction goes through. The new noble class will not correspond well to any existing economic class, which will be a source of conflict for as long as this remains the case.

As life shifts from rural/frontier communalism (mutual support, barn-raising etc.) to atomized urbanism and Malthusian class competition, Christian forgiveness and the Quaker Inner Light will be replaced with an attitude closer to Zhang Xianzhong than to anything known from the West. The attitude toward local strangers may not necessarily deteriorate -- I don't think we'll see anything more like samurai killing random peasants to test their blades than what we already see in America -- but the nobility will regard the peasantry with disdain and fear, and each other as evil unless useful. On some level, they'll know that the peasantry might rise up collectively and overthrow them (so they must be hated, feared, controlled, and suppressed); that individual peasants might rise to noble status (so they must be hated and kept down); and that all the other nobles are, in a Malthusian sense, making their life worse by existing as nobility, and that their risk of downward mobility is high (so any given noble will hate all the other nobles that aren't directly useful to himself and want them expelled from the nobility).

These are the things I think we're already seeing.

Comment by vedrfolnir on Give praise · 2018-05-02T20:55:45.384Z · LW · GW
> This seems to imply that you think the current amount of “social capital” that people are being “awarded” is inaccurate (in the sense of being incommensurate with their achievements, or… something like that?). Is this, indeed, what you meant? And if so, on what do you base this?

I'm not ialdabaoth, but "social capital isn't awarded commensurately with achievement" seems accurate.

We're more like a social group than a corporation. Corporations have well-defined goals, metrics, and so on that they can take into account when rewarding people, and have an incentive to keep morale high. Social groups have none of that, and instead reward people based on how shiny they are. It seems to me that we're much more willing to reward people for being shiny than for corporation-like achievements.

(Some of this is probably because social groups and corporations have different incentives on tap. You won't get more friends and become more attractive by building things, and you won't get a raise for having a shiny Tumblr brand. Then again, you can get praise for both -- although it'd be a little incongruous to be praised in a corporation for social-group stuff or vice versa.)

From where I'm standing, the incentives point strongly in the direction of social-group stuff rather than corporation stuff. Being shiny rather than building things. If we want more things to be built, the incentives have to change so more people decide they're better off building things. But this might be hard to do, at least in the case of building local things, because local things are less legible outside the locality than internet shininess is. (Probably also than IRL shininess -- gossip travels faster and draws a bigger audience than status reports.)

(Of course, different people have different levels of building ability and different levels of shininess. Maybe we could follow the meat/brains/class/etc. deal and talk about the RPG stats of "grit", "tech", and "shine". If people are just following social incentives, a marginal change in favor of building will move the line on the grit + tech vs. shine plot, but the people who don't build will still tend to be shinier than the people who do. Maybe we need an RPG stat of "care" to normalize against here. Whatever.)

It also seems to me that we're an unusually low-praise group, and that higher-praise subsets tend to be more socially inclined.

Comment by vedrfolnir on Thoughts on the REACH Patreon · 2018-05-02T20:13:27.626Z · LW · GW
> Suppose a group performs some task which is not a one-off, but iterated (or performs tasks sufficiently similar to some previous tasks). Practice makes perfect (in various ways which needn’t be enumerated here), and thus the “amount of leadership” required to complete the task will decrease over time. Will the “amount of leadership” required to apportion rewards also decrease over time? Why or why not?

V'q thrff gung vg qbrf. Vs abguvat ryfr, cerprqrag graqf gb pneel abamreb jrvtug.

Comment by vedrfolnir on Thoughts on the REACH Patreon · 2018-05-02T20:09:33.979Z · LW · GW

My day job is, essentially, "grunt". I work with about 30 other people. I can immediately think of two leader-types among the grunts -- three if I count someone who recently quit. I used to work a different shift, and there were no leader-types among the grunts there. There are a few more people who I'm pretty sure could be leader-types if they wanted to, but don't want to.

Small sample size, I know, but one ought to test these things against daily life, and by that test 1/30 seems to be in the right ballpark.

That said, things like grunt jobs and (I assume; I've never played any) MMORPGs probably lend themselves more easily to leadership opportunities than things like rationality -- there are different sorts of leadership called for.

In the one case, there are concrete and well-defined goals to be met, and there's domain-specific knowledge accumulated mostly through experience that needs to be applied in order to meet those goals, and leadership entails being generally recognized as 1) having a sufficient accumulation of domain-specific knowledge to know what has to be done to meet those goals, know what to do in most situations that will arise, and probably be able to figure something out in most of the rest of the situations, 2) not a prick.

In the other case... I'm not really sure what leadership in the ratsphere calls for, but it's probably not that. For one thing, we don't have concrete and well-defined operational goals; for another thing, we don't even have much general agreement on _strategic_ goals, although there are subsets of the ratsphere that do.

Comment by vedrfolnir on Eight political demands that I hope we can agree on · 2018-05-02T19:39:35.520Z · LW · GW

Not IME.

Incidentally, tobacco products aren't an unqualified vice the way alcohol is sometimes argued to be. (I also disagree with that assessment WRT alcohol, but the benefits are smaller and the harms are larger there than they are with tobacco.) They're better seen as general-purpose OTC psych meds -- they're surprisingly good at ameliorating a wide variety of flavors of things being a bit shit -- that have the unfortunate side-effect of dramatically increasing the likelihood that you get cancer.

Absent alternatives, this is probably a worthwhile tradeoff for many people, most of whom are not upper-middle-class sorts who, if neuroatypical, are so in upper-middle-class ways, because things are much less likely to be a bit shit for said sorts; so those sorts (who, if neuroatypical, are so in etc.) keep failing to pick up on this and deciding cigarettes should be banned.

Comment by vedrfolnir on Open Thread May 2018 · 2018-05-02T19:28:03.187Z · LW · GW
> This seems transparently... failing to notice the entire west coast exists?

It exists, but it's less populous.

The Northeast and South together (by Census Bureau definitions) contain 55% of the population of the US. The West contains 24%, and the Midwest contains 21%.

But the West extends as far east as Colorado, and also contains Alaska and Hawaii, which should be excluded here; and the Midwest contains states like Ohio and Michigan, which aren't all that far out.

Unfortunately, Wolfram Alpha can't tell me how many people live within whatever distance of Berkeley or DC, so I have to ballpark. A thousand miles seems like a reasonable number -- it's about a thousand miles from Boston to Atlanta, and that's about a three-hour flight. While you'd have to sleep on your parents' couch overnight, it wouldn't be a _huge_ excursion the way a cross-country flight is.

For Berkeley: the total population of California, Oregon, Washington, Nevada, Idaho, Utah, and Arizona is about 65 million people. (I can't even use Census _divisions_ here -- Alaska and Hawaii are in the same division as the West Coast.)

And for DC: the total population of the Northeast Census region, the South Atlantic and East South Central divisions of the South Census region, and the East North Central division of the Midwest census region is about 186 million people.

The list of states could be quibbled with -- maybe Colorado should count for the West, maybe Missouri should for the East -- but I doubt it'd make much difference. The underlying factor here is that population density drops off sharply a few meridians before 100° west and doesn't pick up again until you hit the Pacific.
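The tallies above can be reproduced in a few lines; the figures below are my own approximate late-2010s Census numbers (in millions), so the totals are ballpark:

```python
# Rough 1000-mile catchments, populations in millions (approximate
# late-2010s Census figures -- my numbers, close but not exact).
berkeley_states = {"CA": 39.6, "OR": 4.2, "WA": 7.5, "NV": 3.0,
                   "ID": 1.8, "UT": 3.2, "AZ": 7.2}
dc_regions = {"Northeast region": 56.1, "South Atlantic": 65.8,
              "East South Central": 19.1, "East North Central": 46.8}

berkeley_catchment = sum(berkeley_states.values())  # ~66 million
dc_catchment = sum(dc_regions.values())             # ~188 million

print(round(berkeley_catchment), round(dc_catchment),
      round(dc_catchment / berkeley_catchment, 1))
```

Moving Colorado into the Berkeley set, or dropping Ohio from the DC set, still leaves roughly a 3x gap, which is the point.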

Comment by vedrfolnir on Open Thread May 2018 · 2018-05-02T14:07:55.112Z · LW · GW

To be a little cynical, Berkeley has the community-hub advantage of imposing a strong selection effect: it's far from everything that isn't the West Coast, it's hideously expensive, and as a city it isn't all that great -- I know New Yorkers who tried to move out there and came back with a litany of horror stories to rival those of my friends in Baltimore. So only the hardcore (or people competent enough to land a SF tech industry job) move out there.

The East Coast, on the other hand, has a lot of nice cities and, for most Americans, isn't so far away. I moved to Boston from a location that's pretty far away in east-of-the-Appalachian-range terms, but I could still take a day trip (by plane) to visit my parents, which I couldn't do if I lived in Berkeley.

As much as I like Boston, I think there's an important advantage DC has over it: where Boston has students, DC has people who opted to take safe, cushy government jobs and now have a lot of intellectual energy and no channel for it. (It also has the advantage of being Where The Government Is, which might be important at some point, for some purposes or other.) And, while it's a worse city than Boston in many respects, there's more to do -- if I lived in DC, I'd try to put together a group for going to free concerts (of which there are many in DC) and so on, but I don't know of anything like that here.

Comment by vedrfolnir on Open Thread May 2018 · 2018-05-02T13:53:19.960Z · LW · GW

I'd be interested in a dedicated version of Shortform Feeds.

The blogosphere equivalent of this was the main/sideblog setup -- think SSC and Scott's Tumblr. This seemed to work well for a lot of people, myself included: if you have something that isn't quite substantial enough for a 'main' post, a quote from a book that you might want to link to later, or something like that, you just put it on your sideblog.

This might have been what Main vs. Discussion was intended to be on old LW, but it obviously didn't work out like that: insofar as something like the setup existed on LW, it was Discussion vs. Open Thread.

These feeds wouldn't be places for fun casual conversation, but that's what the asteroid belt of largely invite-only chatrooms and satellite forums is for, at least if the Age of Forums is any guide. (These days it'd probably be social media instead of satellite forums.) Communities tend to grow their own places for fun casual conversation -- I don't think this has to be engineered, unless the point is to ensure that at least some of it happens on LW itself. If they're set up right, I think they'd be a lot like what existing sideblogs are like, which would IMO be a Good Thing.

I'm not a UI guy, so take this with a Dead Sea or two of salt, but my guess is that you'd need sideblog feeds to only be reachable from a link on someone's userpage for this to work. If there's a firehose, people might worry about polluting it; if sideblog posts are directly visible from userpages, people might worry about keeping their userpages looking pretty and respectable and so on. (In the traditional setup, main blogs are easily discoverable from sideblogs but not vice versa. That's trivial if you're on Tumblr or, but it probably wouldn't mesh well with how things work here, and I doubt it's strictly necessary. But some degree of distance from the agora probably is.)

Comment by vedrfolnir on Open Thread May 2018 · 2018-05-01T20:32:48.477Z · LW · GW
> Should we be focusing on a second hub that aims to rival Berkeley in it's size and awesomeness?

I take Amazon's HQ2 as evidence that this general class of structure is workable.

I'd like to see one on the East Coast, but people here tend to end up in Berkeley.

Comment by vedrfolnir on Don't Believe Wrong Things · 2018-05-01T20:23:01.395Z · LW · GW
You use an example like the moon-landing where there's no value in believing in it

There's some value in believing in it. If you don't believe in it and it comes up, people might look at you funny.

One interesting difference between what we may as well call "epistemicists" and "ultra-instrumentalists" is that ultra-instrumentalists generally weight social capital as more important, and individual intellectual ability as less important, than epistemicists do. See here: most of the reputed benefits of belief in Mormonism-the-religion are facets of access to Mormonism-the-social-network.

Another interesting feature of ultra-instrumentalists is that their political beliefs are often outside their local Overton windows. Presumably they have some idea of how much social capital these beliefs have cost them.

Comment by vedrfolnir on Is Rhetoric Worth Learning? · 2018-04-08T18:19:48.899Z · LW · GW
Are you sure? I’ve met a lot of people (“average people”, not rationalists) who take the view of “yeah, he can talk real impressively, but it’s all bullshit, no doubt”. Many people like “simple talk”, i.e. speech that simply lays out facts, and are suspicious of impressive/skillful rhetoric.

Then the skilled orator will take that into account, speak simply, and avoid impressive or skillful rhetoric.

Marketers have noticed that some people are suspicious of slick corporate brands, but they haven't conceded those customers to local producers and small businesses and whatnot -- they've rolled out product lines that appeal to those people. Farmer's Garden pickles are a good example of this, although the Vlasic branding is maybe a little too visible.

Comment by vedrfolnir on The Moral Void · 2018-04-08T17:29:28.731Z · LW · GW
There is a courage that goes beyond even an atheist sacrificing their life and their hope of immortality.  It is the courage of a theist who goes against what they believe to be the Will of God, choosing eternal damnation and defying even morality in order to rescue a slave, or speak out against hell, or kill a murderer... 

I'm a little late here, but this sounds a lot like Corneliu Codreanu's line that the truest martyr of all is one who goes to Hell for his country.

Comment by vedrfolnir on [deleted post] 2018-03-22T17:51:44.975Z

Weber called it "charismatic authority", so there's another search term.

Comment by vedrfolnir on A LessWrong Crypto Autopsy · 2018-03-22T01:33:13.158Z · LW · GW

There's a project Scott proposed something like eight years ago that got started last weekend because someone posted a bounty on it.

Even if the bounty is just beer money, being able to profit financially by doing something feels qualitatively different from doing it for free.

A centralized registry of bounties would be useful. And there might even be a startup idea in there -- it's essentially Wesearchr for outsourcing instead of far-right gossip journalism.

Comment by vedrfolnir on What Are Meetups Actually Trying to Accomplish? · 2018-03-22T01:18:25.567Z · LW · GW

Right -- there are reasons why good art is mostly not outsider art. And I've completely dropped hobbies that I had been spending a lot of time on once I didn't have time to keep up with the associated communities.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-22T01:00:57.000Z · LW · GW

The only explanation I caught wind of for the parking lot incident was that it had something to do with tulpamancy gone wrong. And I recall SSC attributing irreversible mental effects to hallucinogens and noting that a lot of the early proponents of hallucinogens ended up somewhat wacky.

But maybe it really does all work out such that the sorts of things that are popular in upper-middle-class urban twenty-something circles just aren't anything to worry about, and the sorts of things that are unpopular in them (or worse, popular elsewhere) just are. What a coincidence!

Comment by vedrfolnir on [deleted post] 2018-03-22T00:46:26.527Z

Microculture and cohesion? Did you go to any particularly cohesive summer camps when you were young? If not, you might want to talk to someone who did.

I went to a few different CTY sites, and found that 1) my ranking of sites by quality matched up almost perfectly with the consensus, and 2) these matched up almost perfectly with the extent to which the traditions (i.e. microculture) existed.

One thing that stands out to me is that I went to one site at which the traditions had almost completely died out. (Siena, if anyone out there remembers better than I do.) The story I heard was that the Catholic college didn't take too kindly to swarms of degenerate atheists watching Rocky Horror and so on on their campus and insisted that camp management do away with a lot of the traditions, the people who were the most into the traditions left for other sites in response, and with those few people gone, the traditions atrophied, and attendance at that site fell off a cliff. It shut down a few years after I went, and it deserved to go.

On the other hand, the site management was incompetent, so there's that too.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-10T23:31:42.672Z · LW · GW
Your line of reasoning re: Aumann feels akin to "X billionaire dropped out of high school / college, erg you can drop out, too". Sure, perhaps some can get by with shoddy beliefs, but why disadvantage yourself?

If people are pretty good at compartmentalization, it's at least not immediately clear that there's a disadvantage here.

It's also not immediately clear that there's a general factor of correctness, or, if there is, what the correctness distribution looks like.

It's at least a defensible position that there is a general factor of correctness but that it isn't useful, because it's just an artifact of most people being pretty dumb, and there's no general factor within the set of people who aren't. I do think there's a general factor of not being pretty dumb, but I'm not sure about a general factor of correctness beyond that.

It seems probable that "ignore the people who are obviously pretty dumb" is a novel and worthwhile message for some people, but not for others. I grew up in a bubble where everyone already knew to do that, so it's not for me, but maybe there are people who draw utility from being informed that they don't have to take seriously genuine believers in astrology or homeopathy or whatever.

Point of clarification: are you claiming that rejecting religion provides no information about someone's rationality, or that it provides insignificant information?

In a purely statistical sense, rejecting religion almost certainly provides information about someone's rationality, because things tend to provide information about other things. Technically, demographics provide information about someone's rationality. But not information that's useful for updating about specific people.

Religious affiliation is a useful source of information about domain-specific rationality in areas that don't lend themselves well to compartmentalization. There was a time when it made sense to discount the opinions of Mormon archaeologists about the New World, although now that their religiously motivated claims have had some time to totally fail to pan out, the subject probably lends itself to compartmentalization alright.

On the other hand, I wouldn't discount the opinions of Mormon historical linguists about Proto-Uralic. But I would discount the opinions of astrologers about Proto-Uralic, unless they have other historical-linguistic work in areas that I can judge the quality of and that work seems reasonable.

If postrationality really did win, I don't know that it should have. I haven't been convinced that knowingly holding false beliefs is instrumentally rational in any realistic world, as I outlined below.

Postrationality isn't about knowingly holding false beliefs. Insofar as postrationality has a consensus that can be distilled to one sentence, it's "you can't kick everything upstairs to the slow system, so you should train the fast system." But that's a simplification.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-10T15:29:17.325Z · LW · GW

I'm not sure how to square "rejecting religion is the preschool entrance exam of rationality" with "people are pretty good at compartmentalizing". Certainly there are parts of the Sequences that imply the insignificance of compartmentalization.

I personally recall, maybe seven years ago, having to break the news to someone that Aumann is an Orthodox Jew. This was a big deal at the time! We tend to forget how different the rationalist consensus is from the contents of the Sequences.

Every once in a while someone asks me or someone I know about what "postrationality" is, and they're never happy with the answer -- "isn't that just rationality?" Sure, to an extent; but to the extent that it is, it's because "postrationality" won. And to tie this into the discussion elsewhere in the thread: postrationality mostly came out of a now-defunct secret society.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-10T14:56:27.372Z · LW · GW

Small groups have a bigger problem: they won't be very well documented. As far as I know, the only major source on the Junto is Ben Franklin's autobiography, which I've already read.

Large groups, of course, have an entirely different problem: if they get an appreciable amount of power, conspiracy theorists will probably find out, and put out reams of garbage on them. I haven't started trying to look into the history of the Freemasons yet because I'm not sure about the difficulty of telling garbage from useful history.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-09T22:19:10.692Z · LW · GW

One thing I'd like to see is more research into the effects of... if not secret societies, then at least societies of some sort.

For example, is it just a coincidence that Thiel and Musk, arguably the two most interesting public figures in the tech scene, are both Paypal Mafia?

Another good example is the Junto.

Comment by vedrfolnir on On the Loss and Preservation of Knowledge · 2018-03-09T21:58:55.027Z · LW · GW
Let’s say you are trying to understand what Aristotle would think about artificial intelligence. Should you spend time reading and trying to understand Aristotle’s works, or can you talk to modern Aristotelian scholars and defer to their opinion?

How much can a tradition tell you about the opinions of its founders?

To what extent do the various traditions of Christianity fit your definition of a living tradition? How much of the content of the most 'living' Christian traditions comes from Jesus, and to what extent does this content reflect Jesus' actual opinions or those of the early Christians, insofar as this can be known?

You can ask the same questions of any other tradition or genus of traditions. Confucianism, for example, or some school or other of Buddhism.

(Presumably, any difference between living traditions reflects some sort of change in at least one of them -- unless 'free variation' existed at an early stage, and different descendants converged on different resolutions of the variation.)

It's possible for traditions to be heavily modified upon contact with other traditions without syncretism taking place, and even with deliberate esoterogeny. I've read that Indonesian pagans, upon contact with Christianity and Islam, set about trying to reform their paganism along Abrahamic lines -- not with the aim of becoming more like Christianity and Islam, but with the aim of preserving their tradition and its distinctiveness from the Abrahamic religions, by reforming paganism into an equal of Christianity and Islam, a "serious religion". Of course, it just so happens that the models for "serious religion" are the Abrahamic religions.

Are there traditions that are unusually good at preserving their contents? How did these traditions preserve them? Of course, this requires having a definition of 'contents'. The Rig Veda had been around for a thousand years before writing came to India, and the earliest known manuscripts of it date to about fifteen hundred years after that (although there were almost certainly earlier manuscripts that decayed or were lost) -- but it was mostly unintelligible until modern historical linguistics, and the proposed interpretations were mostly off the mark. Even now, a lot of it is obscure. But the text is very well preserved.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-08T18:22:14.699Z · LW · GW
Not to sound glib, but what good is LW status if you don't use it to freely express your opinions and engage in discussion on LW?

Getting laid, for one thing.

And, you know, LW is a social group. Status is its own reward. High-status people probably feel better about themselves than low-status people do, and an increase in status will probably make people feel better about themselves than they used to.

Eric Hoffer was a longshoreman who just happened to write wildly popular philosophy books, but I think he'd agree that that's not terribly usual.

Comment by vedrfolnir on The Steampunk Aesthetic · 2018-03-08T18:07:56.472Z · LW · GW

This is at least in the same ballpark as something I've been trying to figure out how to articulate for a while.

There's some relation between people and what I hope we don't end up calling "punk objects". (Punk, of course, was all about DIY. In Communist countries, punks set up their own record-printing shops, and printed records on anything they could get their hands on -- most memorably, used X-ray film.) And there's a quality about the people who cultivate these sorts of relations. This might be what Heinlein was talking about when he said specialization is for insects.

The flip side of that is a sort of alienation. If you can't fix or modify your Apple, and have to struggle endlessly with it to get it to do anything it's technically capable of doing but that Apple doesn't want you to be able to do (or at least to do easily), is the Apple really yours? Do you own the thing?

What this reminds me of is the recent social-media habit of imbuing large media corporations with the power to dictate identitarian dignity -- for example, Elbonians petitioning Disney to make movies about Elbonia with Elbonians in them, so that Elbonians can feel better about themselves. Those Elbonians can't pop open the hood of their own group identity -- it belongs to Hollywood, and only Hollywood can tinker with its internals.

(I think Rod Dreher would take this a lot further, and would say something about Vaclav Benda.)

Now, what the Apple thing reminds me of is preschool. In one of the first preschools my parents sent me to, they had a System. You had to ask the teacher how to play with a toy before you could lay your hands on it, and you could only play with it on a tiny little rug with your name on it, and when you were done you had to roll up your rug and put it in the corner, and outside structured class time that was the only thing you could do. Grade school was more of the same. And now that I have a job and an apartment in a building owned by a faceless corporation, I spend forty hours a week tooting around with a laptop in a tedious, highly structured manner, and when I go home I feel like I'm in storage -- like a dolphin at Sea World after closing time. OK, show's over, go back to your cage.

There are a lot of things missing in the default urban life, but the thing you're talking about here is definitely one of them, and I'm not sure what, if anything, can be done about it. I think it's entirely possible that this is the result of an at least semi-deliberate human domestication process, and that resistance to it is difficult and socially costly.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-07T13:42:13.416Z · LW · GW

What does "deal with Being directly" mean?

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-07T00:46:56.680Z · LW · GW

Assuming that the instrumental utility of religion can be separated from the religious parts is an old misconception. If all you need is a bit of sociological knowledge, shouldn't it be possible to just engineer a cult of reason? Well, as it turns out, people have been trying for centuries, and it's never really stuck. For one thing, there are, in startup terms, network effects. I'm not saying you should think of St. Paul as the Zuckerberg of Rome, but I've been to one of those churches where they dropped all the wacky supernatural stuff and I'd rather go to a meetup for GNU Social power users.

For another thing, it's interesting that Eliezer Yudkowsky, who seems to be primarily interested in intellectual matters that relate to entities that are, while constrained by the rules of the universe, effectively all-knowing and all-powerful, and who cultivated interest in the mundane stuff out of the desire to get more people interested in said intellectual matters, seems to have gotten unusually far with the cult-of-reason project, at least so far.

Of course, if we think of LW as the seed of what could become a new religion (or at least a new philosophical scene, as world-spanning empires sometimes generate when they're coming off a golden age -- and didn't Socrates have a thing or two to say about raising the sanity waterline?), this discussion would have to look a lot different, and ideally would be carried out in a smoke-filled room somewhere. You don't want everyone in your society believing whatever nonsense will help them out with their social climbing, for reasons which I hope are obvious. (On the other hand, if we think of LW as the seed of what could become a new religion, its unusual antipathy to other religions -- I haven't seen anyone deploy the murder-Gandhi argument to explain why people shouldn't do drugs or make tulpas -- is an indisputable adaptive necessity. So there's that.)

If, on the other hand, we think of LW as some people who are interested in instrumental rationality, the case has to be made that there's at least some fruit we can reach without first becoming giraffes by grinding epistemic rationality. But most of us are shut-ins who read textbooks for fun, so how likely should we think it is that our keys are under the streetlight?

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-05T19:23:42.328Z · LW · GW

That does seem to be a popular option for people around here who have the right matrilineage for it.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-05T19:20:14.382Z · LW · GW

Right, that's a possible response: the sacrifice of epistemic rationality for instrumental rationality can't be isolated. If your epistemic process leads to beneficial incorrect conclusions in one area, your epistemic process is broadly incorrect, and will necessarily lead to harmful incorrect conclusions elsewhere.

But people seem to be pretty good at compartmentalizing. Robert Aumann is an Orthodox Jew. (Which is the shoal that some early statements of the general-factor-of-correctness position broke on, IIRC.) And there are plenty of very instrumentally rational Christians in the world.

On the other hand, maybe people who've been exposed to all this epistemic talk won't be so willing to compartmentalize -- or at least to compartmentalize the sorts of things early LW used as examples of flaws in reasoning.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-05T19:08:44.435Z · LW · GW

A correct epistemological process is likely, at some point, to assign very low likelihood to the proposition that Christianity is true. Even if Christianity is true, most Christians don't have good epistemics behind their Christianity; so if there exists an epistemically justifiable argument for 'being a Christian', our hypothetical cradle-Christian rationalist is likely to reach the epistemic skill level needed to see through the Christian apologetics he's inherited before he discovers it.

At which point he starts sleeping in on Sundays; loses the social capital he's accumulated through church; has a much harder time fitting in with Christian social groups; and cascades updates in ways that are, given the social realities of the United States and similar countries, likely to draw him toward other movements and behavior patterns, some of which are even more harmful than most denominations of Christianity, and away from the anthropological accumulations that correlate with Christianity, some of which may be harmful but some of which may be protecting against harms that aren't obvious even to those with good epistemics. Oops! Is our rationalist winning?

To illustrate the general class of problem, let's say you're a space businessman, and your company is making a hundred space shekels every metric tick, and spending eighty space shekels every metric tick. You decide you want to make your company more profitable, and figure out that a good lower-order goal would be to increase its cash incoming. You implement a new plan, and within a few megaticks, your company is making four hundred space shekels every metric tick, and spending a thousand. Oops! You've increased your business's cash incoming, but you've optimized for too low-order a goal, and now your business isn't profitable anymore.
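The space-shekel arithmetic can be put in one place. A toy sketch, using the hypothetical numbers from the example above (the function name is mine, not anything canonical):

```python
# Toy illustration of optimizing a proxy (incoming cash) instead of
# the real goal (profit), with the space-shekel figures from above.

def profit(income, expenses):
    """Profit per metric tick: the thing the business actually cares about."""
    return income - expenses

# Before the new plan: 100 space shekels in, 80 out per metric tick.
before = profit(100, 80)     # 20: profitable

# After optimizing for income alone: 400 in, 1000 out.
after = profit(400, 1000)    # -600: income quadrupled, profit gone

print(before, after)
```

The proxy (income) moved in the right direction while the actual objective (profit) collapsed, which is the whole point of the example.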

Now, as you've correctly pointed out, epistemic rationality is important because it's important for instrumental rationality. But the thing we're interested in is instrumental rationality, not epistemic rationality. If the instrumental benefits of being a Christian outweigh the instrumental harms of being a Christian, it's instrumentally rational to be a Christian. If Christianity is false and it's instrumentally rational to be a Christian, epistemic rationality conflicts with instrumental rationality.

This is the easy-to-summarize scaffolding of what I'll call the conflict argument. It isn't the argument itself -- the proper form of the argument would require convincing examples of such a conflict, which of course this margin is too small to contain. In a sentence, it seems that there are a lot of complaints common in these parts -- especially depression and lack of social ties -- that are the precise opposites of instrumental benefits commonly attributed to religious participation. In more than a sentence, lambdaphagy's Tumblr is probably the best place to start reading.

(I don't mean to position this as the last word on the subject, of course -- it's just a summary of a post-Sequences development in parts of the rationalist world. It's possible to either take this one step further and develop a new counterargument to the conflict argument or come up with an orthodox Sequencist response to it.)

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-05T13:50:07.302Z · LW · GW

Have you been following the arguments about the Sequences? This issue has been covered fairly thoroughly over the last few years.

The problem, of course, is that the Sequences have been compiled in one place and heavily advertised as The Core of Rationality, whereas the arguments people have had about their contents, the developments built on top of them, the additions to the conceptual splinter canons that spun off of LW in the diaspora period, and so on aren't terribly legible. So the null hypothesis is the contents of the Sequences, and until the years of argumentation since they were posted are written up into new sequences, it's necessary to keep coming up with ad-hoc restatements of them -- which is not a terribly heartening prospect.

Of course, the interpretations of the sacred texts will change over the years, even as the texts themselves remain the same. So: why does it matter if the map isn't right in many areas? Is there a general factor of correctness, such that a map that's wrong in one area can't be trusted anywhere? Will benefits gained from errors in the map be more than balanced out by losses caused by the same errors? Or is it impossible to benefit from errors in the map at all?

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-05T07:22:20.302Z · LW · GW

I wouldn't use rejection of religion as a signal -- my guess is that most people who become atheists do so for social reasons. Church is boring, or upper-middle-class circles don't take too kindly to religiosity, or whatever.

And is our community about epistemic rigor, or is it about instrumental rationality? If, as they say, rationality is about winning, the real test of rationality is whether you can, after rejecting Christianity, unreject it.

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-04T20:04:22.244Z · LW · GW

Jordan Peterson is controversial, but "controversial" is an interesting word. Is Paul Krugman controversial?

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-04T19:50:08.355Z · LW · GW

Oh, crypto-Discordianism. I haven't read Unsong, but does the Law of Fives show up anywhere?

Comment by vedrfolnir on The Jordan Peterson Mask · 2018-03-04T13:31:29.767Z · LW · GW

I've gotten a much more negative reception to fuzzy System 1 stuff at IRL LW meetups than online -- that could be what's going on there.

And it's possible for negative reception to be more psychologically impactful and less visible to outsiders than positive reception. This seems especially likely for culture war-adjacent topics like Jordan Peterson. Even if the reception is broadly positive, there might still be a few people who have very negative reactions.

(This is why I'm reluctant to participate in the public-facing community nowadays -- there were a few people in the rationalist community who had very negative reactions to things I said, and did things like track me down on Facebook and leave me profanity-laden messages, or try to hound me out of all the circles they had access to. With a year or two of hindsight, I can see that those people were a small minority and this wasn't a generally negative reaction. But it sure felt like one at the time.)

Comment by vedrfolnir on Boiling the Crab: Slow Changes (beneath Sensory Threshold) add up · 2018-01-16T08:18:27.325Z · LW · GW

Abstract instances of the crab-boiling effect can arise from changes that are above the sensory threshold. All that's necessary is for acclimatization to outpace innovation. Each individual step (and these abstract instances are often stepwise rather than continuous) can still register as weird, but if you get used to it before the next step happens, your 'weirdness threshold' doesn't continuously increase.

We've all seen Back to the Future, right? In 1955, Ronald Reagan was a supporting actor who'd just become the host of something called "General Electric Theater". His only political experience was as a former president of the Screen Actors Guild, and he wasn't even a Republican yet. The progression from there wasn't terribly weird: he quit working for GE to become more politically involved, campaigned against Medicare in 1961 and for Goldwater in 1964, boosted his profile enough on the Goldwater campaign that he ran for governor of California and won, tried to primary Gerald Ford and lost, and won against Carter in 1980.

Reagan's rise to the presidency was stepwise, and each step was above the sensory threshold, but you still get the crab-boiling effect in the end.

Comment by vedrfolnir on Melting Gold, and Organizational Capacity · 2017-12-22T08:41:42.385Z · LW · GW

What do you think the barriers to scale are? Just the difficulty of finding people comfortable with improvising within the framework in front of that many people, or something else?

Comment by vedrfolnir on Mana · 2017-12-22T05:14:31.374Z · LW · GW

Oh, *that*.

When I was in college, I intuitively realized that severing myself from the Khala would have intellectual benefits, so I did. The problem was that those intellectual benefits paid off socially, such that staying severed from the Khala seemed less and less attractive. Eventually, although I didn't realize it at the time, I started to cash out. (I live with my partner, whom I met on the basis of my intellectual reputation. The most recent one, that is. There were a few.) My depression mostly disappeared, and I'd gotten myself an alright social life, but... I just couldn't write anymore. And eventually I started to believe that my investment in high-variance strategies was misguided, and decided to 'become a normie'.

It was only after I realized I'd succeeded that it dawned on me that I'd made a mistake.

But "severing myself from the Khala" isn't quite the framing I'd use. It felt more like adopting a very high-variance strategy: there were a few people who greatly respected me and a few people who wanted to kill me, and to most people I was just illegible -- not that I cared about any of that. In a low-variance strategy, no one wants to kill you, but no one (or at least very few people) respect you, until you're old enough to have become respectable, to have gotten somewhere with the slow grind of life; and all your energy is consumed by the grind.

Most people who aren't currently drawing juju from their respectability could stand to be higher-variance. If you're intentionally sacrificing your own standing to make a point (and not to trade it in with more valuable standing from different people), you could stand to be lower-variance. There's a golden mean there, but undershooting variance seems to be a much more common error than overshooting it.