Open Thread, Feb. 20 - Feb 26, 2017

post by Elo · 2017-02-20T04:51:06.102Z · LW · GW · Legacy · 163 comments


If it's worth saying, but not worth its own post, then it goes here.


Notes for future OT posters:

1. Please add the 'open_thread' tag.

2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)

3. Open Threads should start on Monday and end on Sunday.

4. Unflag the two options "Notify me of new top level comments on this article" and "

163 comments

Comments sorted by top scores.

comment by tristanm · 2017-02-21T00:54:59.118Z · LW(p) · GW(p)

Hi LW, first time commenting on here, but I have been a reader / lurker of the site for quite some time. Anyway, I hope to bring a question to the community that has been on my mind recently.

I have noticed an odd transformation of my social circle, in particular of the people whom I have basically known since I was young and who are about the same age as me. I'm wondering if this is something that most people have observed in other people as they moved into adulthood and out into the world.

I would say that ever since I was a teenager I considered myself a "rationalist". What that has meant exactly has of course been updated over the years, but I would say that my approach to knowledge hasn't fundamentally changed (like I didn't suddenly become a postmodernist or anything). As soon as I understood what science and empiricism were about, I knew that my life would revolve around it in some way. And what made me very close to the people who would be my best friends throughout high school and college is that they felt pretty much the same way I did. At least I very much believed they did. My happiest moments with them, when I was about 16 to 18, involved lengthy, deep, and enjoyable discussions about philosophy, science, politics, and current events. I was convinced we were all rationalists, that we were fairly agnostic about most things until we felt that we had come to well-argued conclusions about them, and were always willing to entertain new hypotheses or conjectures about any topic that we cared about.

Fast-forward about ten years, and it seems like most of those people have "grown out of" that, like it was some kind of phase most people go through when they're young. All important questions have been settled; the only things that seem to matter now are careers, relationships, and hobbies. That's the impression I get from my various social media interactions with them, anyway. There are no debates or discussions except angry political ones, which mostly just consist of scolding people, or snarky comments and jokes. Politically, most people I know have gone either hard-left or hard-right (mostly hard-left, since everyone I know grew up on the west coast). But what's striking to me is how hivemind-ish a lot of them have become. It's really impossible to have a good discussion with any of my old friends anymore. I realize that sounds a little complain-y, but what I want to emphasize is that this is a particular observation about the people I grew up with, not the older people I've known like family members, and not the people in my current social circle.

Ok, sure, it's possible that I just picked bad friends back then. But I think this is a little bit unlikely, since the reason we were drawn together in the first place is our shared interests and similar way of thinking. But I feel like I have basically stuck to the same principles that I had even back then. I've tried to avoid becoming too deeply attached to any one subculture or "tribe" - and there have been many opportunities to do so. What makes me believe my observation might be a more common phenomenon is that it seems to be shared by the people I'm close to now. It appears to me that there is something that alters a person's psychology as they move into adulthood, and through college in particular. And that this alteration makes people less "rational" in a way. And whatever causes that is traumatic enough that it encourages people to cluster into groups of very like-minded individuals, where their beliefs and way of life feel extremely safe.

I'd also like to emphasize that I'm not saying that our views and beliefs have simply diverged. This has mostly to do with the way that people think, and the way that they communicate ideas.

I wonder if anyone else has had this observation, and if so, what the possible explanations might be. On the other hand, maybe I have gone through the same change in my psychology, but simply fail to notice it in myself.

Replies from: dogiv, Viliam, username2, satt, Strangeattractor, MrMind, ingive
comment by dogiv · 2017-02-21T17:34:25.719Z · LW(p) · GW(p)

I agree there's something to the exploration-exploitation view of people becoming more closed-minded. But don't be too quick to write it off as "people don't think carefully anymore", or simple tribalism. Some important questions really do get settled by all those late-night college debates, though often the answer is "I don't think it's possible to know this" or "It's not worth the years of effort it would take to understand at a more-than-amateur level."

People are recognizing their limitations and zeroing in on the areas where they can get the highest return on investment for their thoughts. That's a difficult thing to do when you're younger, because you don't have much to compare yourself to. If you've never met a physicist more knowledgeable than your 9th-grade science teacher, you might well think you can make big contributions to the theory of relativity in the space of a few weeks' discussion with your friends.

Similarly, when it comes to politics, the idea of considering every idea with an open mind can fall victim to the pressures of reality--some ideas are superficially appealing but actually harmful; some are nice in theory but are so far from what could reasonably be implemented that their return on investment is low. And because politics is so adversarial, many ideas that are promoted as novel and non-partisan are actually trying to sneak in a not-so-novel agenda through the back door.

Replies from: tristanm
comment by tristanm · 2017-02-21T19:43:21.298Z · LW(p) · GW(p)

That's an interesting thought. However, I tend to observe that most people do not take strictly agnostic positions on most things. In fact, it seems that people tend towards certainty rather than uncertainty. So I'm not sure that I'm seeing people tend to give up on questions they think are too difficult or that they don't have the expertise or time to really come to a conclusion on. From my perspective it seems that people really do fall into ideological camps where they believe a lot of matters have been completely settled and do not need further discussion.

An interesting sort-of reverse phenomenon that I've noticed is that on matters where people really have more expertise, they actually tend to be a little more agnostic and open to debate. So for example you might notice people having an in-depth discussion on some aspect of software engineering, like a library or a framework, weighing the pros and cons of each and citing expert opinion - but on politics, which we understand even less about, you really don't see this at all.

comment by Viliam · 2017-02-21T11:50:52.302Z · LW(p) · GW(p)

I can imagine a few possible things that could have contributed.

First, being more open-minded when young and getting more closed-minded when older is the usual way, not just for humans, but also for many animals. Kittens are more playful than adult cats. And "philosophy" is a way of playing with words and ideas, so naturally young people would play with different ideas (the smart ones with different smart-sounding ideas; the stupid ones with different simplistic ideas), and gradually settle on the One True Way of looking at things, as they stop being able to consider new ideas, and choose one of the known ones which seems to work best for them.

It makes sense from the "exploration / exploitation" point of view: When you have a lot of time ahead of you, and your opinions matter relatively little because you are in a relatively safe environment, it is good to explore and get new data. When time becomes scarce and potential mistakes costly, stick with the best of what you already know. Also, there is a trade-off between exploration and productivity; the time you spend playing with new ideas is the time you don't spend earning money or working on your dreams, which is okay for a teenager with low value on the job market. For an adult, job and/or children reduce the time and mental energy they can spend on thinking about things unrelated to immediate survival.

Second, people try to fit into their environment. I am sorry if this sounds too cynical, but the change in your friends probably reflects the change in the environment they are currently trying to fit into, where "scolding people, or snarky comments" are the standards of communication. (Trying to do anything else in such an environment would probably gain you some snarky comments, and trying to reflect on the situation would get you scolded, plus some extra snarky comments. Not being sufficiently far-left or far-right would gain you low status as insufficiently "woke" or whatever is the right-wing equivalent.) Consider yourself lucky that you are not living in such an environment.

Third, there is a possibility that a part of what you observe is simply you growing up. That not only the other people are changing, but also you start observing things that you didn't notice before. I may be generalizing too much from my own example, but it was the case for me that the people whom I considered smart when I was a teenager suddenly seem pretty stupid now. (Of course, the scary alternative is that this is just me achieving the One True Way of looking at things, unable to tolerate other views anymore.) For example, I used to be impressed by people who had "their own opinion" on the theory of relativity or quantum physics. Then I learned something about these topics; and then I realized that most of what these people say is pure bullshit, probably learned from a random pseudoscientific YouTube video. They still use the same strategy, and extend it to other topics; I am just not impressed by it anymore. Now that I have more information than I had as a teenager, I can see more ways in which people can be wrong.

Also, if you formerly interacted with your friends in person, and now it's mostly online, that too makes things worse.

comment by username2 · 2017-02-21T01:43:11.296Z · LW(p) · GW(p)

Yes, this is an absolutely normal, common experience. People get "set in their ways" at some point in their lives and it becomes easier to move a mountain than to have them change their mind. This is exactly why one of the very first parts of the EY sequences is How To Actually Change Your Mind. It is the foundational skill of rationalism, and something which most people, even self-described rationalists, lack. Really, truly changing your mind goes against some sort of in-built human instinct, itself the amalgamation of various described heuristics and biases with names like 'the availability heuristic' and '(dis)confirmation bias.'

comment by satt · 2017-02-21T23:02:17.198Z · LW(p) · GW(p)

The (speculative) explanation my mind immediately goes to: a combination of the you-are-the-average-of-your-5-best-friends heuristic, and the dilution of a selected social group when its members move into new environments.

Universities and workplaces, with unusual exceptions, are probably not going to select as aggressively for high rationality (however you define "rational" & "rationality") as your in-school social selection did. So (I suspect) when the people in your circle started expanding their own social networks during university and then at work, the average rationality of their friends & acquaintances went down. And because (insofar as a person and their behaviour are malleable) a person's influenced by the people they hang out with, that probably made the people you know/knew less rational, or at least less likely to behave rationally.

Replies from: Viliam
comment by Viliam · 2017-02-22T13:12:30.722Z · LW(p) · GW(p)

Something in your comment changed my... not exactly opinion, more like feeling... about comparing social life at school and at job.

Until now, I was thinking like this: At school you are thrown together with random kids from your neighborhood. But when you grow up, you choose your career, sometimes you even choose a different city or country, and then you are surrounded by people who made a similar choice. Therefore... not sure how to put this into words... your social environment at work is the result of more "optimization freedom" than your social environment at school.

But suddenly it seems completely the other way round: Sure, the job is filtering for people somehow, but maybe it doesn't filter exactly by the criteria you care about the most. For example, you may care about people being nice and rational, but your career choice only allowed you to filter by education and social class. So, more optimization, but not necessarily in the direction you care about. And then at the job you are stuck with the colleagues you get on your project. However, at school, you had the freedom to pick a few people among dozens, and hang out with them.

I guess what I am trying to say is that if your criteria for people you want to associate with have a large component of education and social class, you will probably find the job better than school, socially; but if your criteria are about something else, you will probably find the job worse than school. (And university probably gives you the best of both worlds: a preselection of people, among whom you can further select.)

Replies from: 9eB1, satt
comment by 9eB1 · 2017-02-23T17:14:53.880Z · LW(p) · GW(p)

That is true for people you are going to become friends with, but the difference in negative environments is much bigger. If your job has a toxic social environment, you are free to find a new one at any time. You also have many bona fide levers to adjust the environment, by, for example, complaining to your boss, suing the company, etc.

When your high school has a toxic social environment, you have limited ability to switch out of that environment. Complaints about other students face an extremely high bar before being taken into account, because it's mandatory for students to be there and acting on complaints isn't in the administrators' best interests. If someone isn't doing something plainly illegal, it's unlikely you will get much help.

comment by satt · 2017-02-22T20:08:49.596Z · LW(p) · GW(p)

Yep.

The school → university transition might be the most interesting one WRT tristanm's question, because although it theoretically offers the best opportunity to select for rationality, in practice a lot of people can't or won't exploit the opportunity. I imagine even quite nerdy students, when deciding where to apply to university, didn't spend long asking themselves, "how can I make sure I wind up at a campus with lots of rationalists?" (I sure didn't!)

Replies from: Lumifer
comment by Lumifer · 2017-02-23T18:06:41.433Z · LW(p) · GW(p)

I don't know about rationalists but one big advantage of going to what's called a "highly selective college" is that your peers there are mostly smart. The same principle works for schools, except that the results are not as pronounced because the schools effectively use the wealth of the parents as a proxy.

comment by Strangeattractor · 2017-02-26T08:03:36.599Z · LW(p) · GW(p)

I think the impression you have of the people may have been influenced by seeing them primarily through social media. Have you talked to them in person? It might be different. The format of social media makes having nuanced discussions difficult, and emphasizes the more tribal posts.

Another thing to consider is that their priorities may have changed more than their approach to life. They may be applying empiricism to how to advance in a career, or how to be a good parent. There is a limited amount of time in a day, and they may have enough time to do only a few things well. Also, sleep deprivation, common among new parents, can make thinking clearly more difficult. Once children get older, parents get a bit of their balance back.

Replies from: tristanm
comment by tristanm · 2017-02-27T04:26:19.353Z · LW(p) · GW(p)

Interestingly, out of my original friend group, I am the only one who has gotten married and had a child. If anything, I have been forced to become more rational in order to cope with the added anxieties, lack of sleep, and stress.

comment by MrMind · 2017-02-21T09:14:43.356Z · LW(p) · GW(p)

I can offer a possible explanation (just one model though, you'll have to verify it for yourself).
Humans are by design far from rationality as we intend it: we evolved to function in a social environment of peers, and to make life and death decisions in the blink of an eye. The structure of our brain is such that we first make instinctive decisions, and then we justify them post-hoc. The aim of rationality, where we try to first deliberate the truth from first principles and then conform our behaviour to those conclusions, is totally alien to the way human beings usually work.
It is possible to change our mind by self-deliberation, but it is very difficult, with our own nature as an obstacle, and thus can be done only for a limited array of subjects and with enough resources at your disposal (such as a lot of time and a safe environment).
This might have been what happened to your friends: by concentrating on thriving in the social environment, they came to rely more and more on system 1 (the heuristic, quick-firing decision system) first, to the point of forgetting to exercise system 2 (slow and deliberate) first, as you did when debating years ago.
The more interesting question, I would say, is this: why did you never forget?

Replies from: tristanm
comment by tristanm · 2017-02-21T19:12:25.702Z · LW(p) · GW(p)

That's a good question. I think what separates me from a lot of the people I surrounded myself with is that I tend to have always relied far more on system 2 than on system 1. The exact reason for this I'm not sure about, except that I've always felt that my system 1 has always lagged behind or has been deficient in some way relative to most of my peers. I've always felt very uncomfortable in social situations, high-stress or fast decision-making environments, or when the demands to react quickly are quite high. I've always been a lot more comfortable in environments that allow me to think and work on a problem as long as I need to before I feel ready to commit to something. For that reason, I've come to rely on system 2-like reasoning for a lot of tasks that would normally be done by system 1.

I think many people, once they transition to the environment in which navigating complex social structures becomes necessary, learn to rely mostly on system 1. This probably happens around the early adulthood phase, through college and into early career, when networking becomes very important. For various reasons, I found I didn't need to network very hard or build up a lot of social capital to find a career and a comfortable livelihood. I realize that this probably makes me very lucky - I am basically able to hold this outside-view position that allows me, in a way, to be a little more protected from certain biases that could have potentially been learned from trying to thrive in highly social environments.

comment by ingive · 2017-02-26T10:17:51.443Z · LW(p) · GW(p)

I've had the overall impression that the older you become, the more strongly you hold your beliefs; a metaphor could be the hardening of neural networks. I am drawing a connection between that and the part of the personality known as 'openness', which according to Roland R. Griffiths decreases as people become older.

“Normally, if anything, openness tends to decrease as people get older,” says study leader Roland R. Griffiths, a professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.

http://www.hopkinsmedicine.org/news/media/releases/single_dose_of_hallucinogen_may_create_lasting_personality_change

Which is discussed here: http://lesswrong.com/lw/7wh/rationality_drugs/4xmw to http://lesswrong.com/lw/82g/on_the_openness_personality_trait_rationality/

So drop acid with your friends, or rather have an underground psychedelic therapy group with blindfolds, music and emotional support. You gotta do your research though on how to facilitate these kinds of experiences. This is only for educational purposes and in theory.

comment by Viliam_Bur · 2017-02-21T10:01:34.500Z · LW(p) · GW(p)

FYI, I just banned an account "kings11me" who didn't participate in the forum, but was sending the following private message to multiple users:

God bless you and thanks, how are you? Happy to meet you. I got your contact via this site, I seriously have interest to invest on a profitable business in your country, the money I want to invest was acquired from my church member, and then I was his financial adviser. The amount to invest is ($14.5 million US dollars) presently, but I’m the present Catholic Church leader in my parish, if you will like to assist me as a partner, you must have the fear of God? kindly indicate your interest, and all other details relating to the funds will be revealed to you as we progress on. Confidentiality contact my direct e-mail address (REDACTED@yahoo com or REDACTED@gmail com) also indicate your direct telephone number, when replying this mail, God will guide us and with good health Amen, God bless you and your family, Rev, Chris Madurai Okon.

(In other words, I am an evil villain who has just deprived MIRI of a possible $14.500.000 donation from a secret rationality benefactor masquerading as an ordinary spammer. Business as usual. Mwa-ha-ha-ha-ha!)

comment by skeptical_lurker · 2017-02-20T22:08:35.674Z · LW(p) · GW(p)

I think about politics far too much. It's depressing, both in terms of outcomes and in terms of how bad the average political argument is. It makes me paranoid and alienated if people I know join Facebook groups that advocate political violence/murder/killing all the kulaks, although to be fair it's possible that those people have only read one or two posts and missed the violent ones. But most of all it's fundamentally pretty pointless because I have no desire to get involved in politics and I'm sure that wrt any advantages in terms of helping me to better understand human nature, I've already picked all the low-hanging fruit.

So anyway, I'm starting by committing to ignore all politics for a week (unless something really earth-shattering happens). I'll post again in a week to say whether I stuck to it, and if I didn't, please downvote me to oblivion.

Oh, and replying to replies to this post is excepted from this rule.

Replies from: username2, satt, Gram_Stone
comment by username2 · 2017-02-21T01:27:31.824Z · LW(p) · GW(p)

although to be fair it's possible that those people have only read one or two posts and missed the violent ones

Or they agree with some aspects of a group but not others. Surely you don't agree with every opinion voiced on LessWrong, do you? Not even all of the generally accepted orthodoxy either, I'm sure. If you claimed you did, I'm sure I could come up with some post by EY (picked for representing LW views, no other reason) that you would be insulted to think others ascribed to you. Worth thinking about.

Even in cases that appear to be clear cut fear or violence mongering it may be that they joined the group to have its messages in their news feed for awareness, because they refuse to flinch from the problem. How others choose to engage in social circles should be treated like browsing data from a library -- confidential, respected, and interpreted charitably. We wouldn't want to be making thought crime a real thing by adding social repercussions to how they choose to engage in the world around them.

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-21T01:54:16.522Z · LW(p) · GW(p)

All good points, in the general case - I myself frequently read about things I disagree with. However...

Even in cases that appear to be clear cut fear or violence mongering it may be that they joined the group to have its messages in their news feed for awareness, because they refuse to flinch from the problem.

That is more of a LW thing. Most normal people don't act like this, and the person I was thinking of certainly doesn't. Politics is about waving the flag for your tribe, and trying to actually understand the other tribe's point of view is like waving the enemy flag - treason! To show that they are loyal, many people seem to be adopting the maximally uncharitable point of view, or at least they have been in the last few years.

Of course, it's also possible that that is why some people are advocating violence - they wouldn't really want violence, and they certainly wouldn't personally assault someone, but they advocate violence because it shows more tribal loyalty than just advocating peaceful protest.

Replies from: username2
comment by username2 · 2017-02-21T02:16:26.061Z · LW(p) · GW(p)

I was going to remind you of the fundamental attribution error, but that isn't exactly what's going on here. Is there a name for the error of assuming the simplest possible explanation given the information available is correct, when it comes to human behaviour? Popsci aside, the simplest explanation you can come up with is usually not the case, because the other person is acting as a result of a lifetime of experiences that you have had at best only a small glimpse into. It's hard to evaluate exactly why they do what they do without sitting them down on the couch for a few hours. If anyone knows what this error in analysis is called, I'm genuinely curious.

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-21T03:26:16.581Z · LW(p) · GW(p)

'Overuse of Occam's Razor?'

Anyway, I know that psychology is complex and the explanations I come up with are only my best hypothesis, not one that I would necessarily have >50% confidence in - I should have made that clear. Still, I have trouble thinking of other explanations for why intelligent, educated, friendly people claim to believe that about 50% - 95% of the population are evil?

Or that most old people deliberately vote for bad things because why should they care if they are going to die soon anyway?

Or that there is >50% probability that Brexit will literally lead to a neo nazi state in the UK within 10 years?

Or that the best way to defend democracy in the USA is to assassinate Trump and Pence... despite the fact that they were democratically elected and that the sympathy vote would push the USA far more to the right?

Or that someone would attempt to prove to me that a political party are evil, by showing me a meme saying that they are evil, as if messing about with photoshop confers truth, and then be unable to provide a single non-meme-based argument to support this assertion?

I mean, these beliefs are so crazy that if only one person were expressing these opinions I might worry that they are showing the early warning signs of some form of clinical paranoia. But it's widespread, among people who otherwise seem functional.

Replies from: username2, Lumifer, math, gjm
comment by username2 · 2017-02-21T07:20:55.328Z · LW(p) · GW(p)

Well, you got me. I thought perhaps you were seeing things like "Islam is a violent religion" and reading too much into it. But most if not all of those examples seem inexcusable if genuinely held. Although the original point stands that the person subscribing to the group might be doing so in response to more mundane writings, and they are not endorsing the more extreme writing, which may even have been done for shock value. I don't know.

Regarding the other point, it's not quite that Occam's razor is wrong, but rather having to do with ignorance of a complex system. "The simplest explanation is probably correct" is true when we have a sufficient number of facts in front of us to make inference. In most things in life this is the case, but human behaviour is complex enough to make that not generally true. I can make Occam's razor predictions about the underlying reason for my wife doing something, and maybe my closest friends or siblings. But not others -- their mental states are too complex, too dependent on things I don't have information on.

Anyway sorry to distract from your original question. I just wish there was a name and some literature regarding this bias because it seems relevant and important.

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-22T13:31:35.224Z · LW(p) · GW(p)

"The simplest explanation is probably correct" is true when we have a sufficient number of facts in front of us to make inference. In most things in life this is the case, but human behaviour is complex enough to make that not generally true.

However, I would say that even when dealing with high complexity and uncertainty, the simplest explanation is still usually the most probable hypothesis, even if it has <50% probability.

comment by Lumifer · 2017-02-23T01:12:35.158Z · LW(p) · GW(p)

Well, there is one simple explanation called bullshit.

A lot of people are willing to pronounce positions and statements that they will not be able to execute in reality.

Hopefully, most of them.

comment by math · 2017-02-22T06:13:47.457Z · LW(p) · GW(p)

Or that someone would attempt to prove to me that a political party are evil, by showing me a meme saying that they are evil, as if messing about with photoshop confers truth, and then be unable to provide a single non-meme-based argument to support this assertion?

If you want non-meme based arguments, try visiting fora that cater to people capable of engaging in non-meme based arguments.

comment by gjm · 2017-02-21T13:56:01.419Z · LW(p) · GW(p)

I don't know exactly what you've seen and therefore it's possible that the following fails to address it. But on the face of it username2's diagnosis seems very plausible. Not the bit about choosing to see violence-mongering stuff to keep one's awareness and opposition keen; that's taking steelmanning too far.

But: put yourself in the shoes of someone who is, as you put it, intelligent, educated and friendly, whose political opinions are generally leftish, and who is horrified by the rise of right-wing populism as exemplified by Donald Trump and Brexit and Marine Le Pen and so forth. These things alarm them and they want to surround themselves with ideas that point the other way, to reassure themselves that the world isn't entirely against them, etc. So they find a Down With Donald Trump And Brexit group on Facebook and join it. Some of the things it posts are extreme and violent; our hypothetical intelligent leftie deplores that, and would be happier affiliating with a large anti-rightist community that doesn't do that sort of thing -- but all the large anti-rightist communities have people in them who do that sort of thing, so they don't have much choice. Joining the group doesn't mean endorsing everything its members post.

(It's not as if the rhetoric of the less-pleasant parts of the political right is any nicer or more sensible than that of the less-pleasant parts of the political left. Intelligent educated friendly right-leaning folk can find themselves with some regrettable -- dare I say deplorable? -- bedfellows too.)

Replies from: Lumifer, skeptical_lurker
comment by Lumifer · 2017-02-23T01:16:54.429Z · LW(p) · GW(p)

intelligent, educated and friendly, whose political opinions are generally leftish, and who is horrified by the rise of right-wing populism as exemplified by Donald Trump and Brexit and Marine Le Pen and so forth.

So the first thing that someone should do is untwist her underwear.

The second thing is probably contemplate the meaning of the word "horrify". Horrified by Donald Trump? Really?

I assume that someone is not horrified by Putin. Or, say, Erdogan who is, right now, doing a classic 1937-type purge in his country. Or is she?

so they don't have much choice

Erm. I'm not going to believe that.

Replies from: gjm
comment by gjm · 2017-02-23T02:48:52.984Z · LW(p) · GW(p)

I assume that someone is not horrified by Putin. Or Erdogan [...]

It's a purely hypothetical someone, so who knows? FWIW the people I know who are horrified by the rise of right-wing populism (which is what I actually said, and which is not exactly the same thing as being horrified by Donald Trump as such) are no fonder of Putin and Erdogan than of Trump, so far as I can tell. (FWIW I am not exactly horrified by the rise of right-wing populism, but I don't like it at all and think it likely to do much harm; I think Putin and Erdogan are both substantially worse than Trump, but they're less surprising than Trump because Russia and Turkey have much stronger track records of awful awful leaders.)

I'm not going to believe that.

OK. (It turns out that what skeptical_lurker was describing was distinctly more extreme than in my scenario, so it's not terribly relevant how well that scenario actually matches reality. I'm not much of an expert on leftist Facebook groups; is there a good supply of such groups that don't have any extremist stuff in?)

Replies from: Lumifer
comment by Lumifer · 2017-02-23T16:47:19.583Z · LW(p) · GW(p)

the people I know who are horrified by the rise of right-wing populism

So, is this really a thing? I've heard a lot about that "rise of right-wing populism", but remain unconvinced that it actually exists. What I observe is that some left-wing ideas became less popular than the left wing expected and wanted them to be (which caused a massive hissy fit on the left).

In general, "populism" is one of those irregular nouns/adjectives: my ideology is (rightfully) popular but yours is populist. Since democracy is essentially a popularity contest, I tend to treat the label "populist" as a sour-grapes insult with little content.

So are we just talking about some general re-assertion of the right wing (which, despite the left's best efforts, continues to insist it's not dead yet) or do you think the "populist" moniker has some meaning?

Replies from: gjm, ChristianKl
comment by gjm · 2017-02-23T19:39:22.806Z · LW(p) · GW(p)

So, is this really a thing?

Dunno. There are quite a lot of politicians and political movements at the moment with the following characteristics:

  • They are more openly nationalist than it has generally been fashionable to be until recently. ("Right-wing".)
  • They are more appealing to "ordinary people" and less to "intellectual elites" than political groups previously in the ascendant. ("Populist".)
  • In particular, their signature policies are often ones "experts" would be very sniffy about but that sound good to a lot of people. ("Populist".)
  • They advocate economic policies whose consequences are probably better on the whole for the wealthy than for the poor. ("Right-wing".)
  • They get some of their support from being seen to be happy saying things that are "politically incorrect". ("Right-wing", "populist").

In the US, we have Donald Trump. "Make America Great Again", border walls, keeping Muslims out of the US; hugely more support among the less-educated than among the more-educated; advocates trade barriers and fierce border controls; favours large-scale tax cuts and abolishing Obamacare; grabs 'em by the pussy.

In the UK, we have UKIP and the rest of the anti-EU movement. Keeping scary foreigners out; support, again, strongly concentrated among the less-educated; leaving the EU has been widely predicted by economic experts to be a terrible idea; UKIP at least proposed tax and benefit changes whose first-order effect would have been a large transfer from poorer people to richer people; UKIP seems to attract an awful lot of people who are upset at, e.g., anti-black prejudice no longer being widely socially acceptable.

In continental Europe we have a whole lot of politicians and movements that are widely described as right-wing populist: the Front National in France, Vlaams Belang in Belgium, Golden Dawn in Greece, etc. I don't know enough about them to evaluate those claims, but they seem plausible. And they seem to be gaining rather than losing members over the years.

do you think the "populist" moniker has some meaning?

I do. I think it means something like "getting a substantial amount of their support from clear, simple ideas that a large fraction of the population find attractive but that those generally regarded as experts and intellectual elites are skeptical of, even those with broadly similar political leanings".

So, for instance, opposition to immigration tends to be a rightish thing and opposition to free trade tends to be a leftish thing; lots of people like these; but most experts on both "sides" tend to think they're a bad idea and will actually make almost everyone worse off.

some left-wing ideas became less popular than the left wing expected and wanted them to be

Sure, that happens sometimes. It doesn't look to me as if it always gets described as "populism". Cutting taxes on the rich and state benefits for the poor are things that the right tends to like and the left tends not to like; sometimes they are popular; but I don't generally see them described as populist. (Though of course sometimes people who are called populist for other reasons advocate them.)

And not everything people get called populist for is right-wing. One of the reasons I call Trump a populist is his advocacy for trade barriers and his opposition to treaties like the Trans-Pacific Partnership. On the whole, the TPP was more popular on the right than on the left; opposing it isn't particularly a rightist view (there are other things that make Trump a rightist rather than a leftist); but it's certainly populist.

So no, I reject the suggestion that "populist" just means "popular but I don't like it" or anything of the sort. (I am inclined to agree that the TPP was a bad idea, though I don't know enough about it for my opinion to be worth very much.)

Replies from: Lumifer
comment by Lumifer · 2017-02-24T16:53:17.952Z · LW(p) · GW(p)

Let's not get into the weeds of specific European or US policies. But the notion of "populism" is an interesting one (by the way, there's certainly left-wing populism as well, see e.g. Chavez). You defined it as:

getting a substantial amount of their support from clear, simple ideas that a large fraction of the population find attractive but that those generally regarded as experts and intellectual elites are skeptical of, even those with broadly similar political leanings

There are implications, or, perhaps, prerequisites to this definition. A big one is the distinction between "a large fraction of the population" -- basically, the masses -- and "experts and intellectual elites". Journalists love these terms, but they look a bit too jiggly-wiggly to me.

So there is some axis on which we can arrange people with one end being "the masses" and the other end being "the elites". By which criteria do we do that?

One obvious one is money. Masses are poor and elites are rich. There are problems here, though -- for example, successful Kansas farmers are not poor at all and yet people rarely call them elites. But a negative-wealth grad student in San Francisco is a bona fide member of the elites, isn't he?

Social status is not a good criterion because it's basically a synonym for the axis we're trying to define, and I don't know how one would go about assigning "objective" social status levels to people or communities.

The traditional class distinction is still in play in Europe, to some degree, but is by and large absent in the US.

The city/country division is a good proxy in many ways, but by itself it's a different axis.

IQ, maybe? Masses are dumb and elites are smart? That's an interesting approach but it has scary consequences for the idea of democracy and the whole égalité thing.

All in all, I'm not happy with the fuzziness of the masses/elites concept which is thrown around with wild abandon nowadays.

Moreover, your definition of populism assumes that the masses/elites distinction matters -- they like different things and, presumably, make different choices. And there is the implication that in this divergence the masses are "wrong" and the elites are "right". Again, there are interesting consequences to this idea.

Taking a look at this from a different side, populism -- espousing ideas appealing to the numerically dominant masses -- is a necessary and unavoidable part of democracy, specifically of getting elected. Every (serious) candidate is a populist. Granted, some are more so and some are less so, but the position of the "less so" ones is not very enviable -- since their ideas are less popular by definition, they are reduced to asking the masses to, basically, trust them to do the right thing even though the masses don't think it is the right thing to do. In practice this means you spend your campaign time kissing babies and proclaiming your love for, to use an American expression, motherhood and apple pie, while carefully tiptoeing around actual policy issues. Not really a great solution.

Replies from: gjm
comment by gjm · 2017-02-24T17:41:13.831Z · LW(p) · GW(p)

by the way, there's certainly left-wing populism as well

Of course I agree (perhaps Bernie Sanders is a more prominent, though less extreme, example than Chavez) but I find it interesting that you've leapt so readily from "right-wing populism: is it even a thing at all?" to "of course there's left-wing populism as well as right-wing populism".

Journalists love these terms, but they look a bit too jiggly-wiggly to me.

Non-technical terms are always a bit jiggly-wiggly. Possibly also wibbly-wobbly and maybe even timey-wimey on occasion. There are, as you say, lots of ways to distinguish "masses" from "elites", many of them correlate strongly with one another, and if you look at people who appeal to "masses" much more than to "elites" then I think you tend to get roughly the same people for a wide range of ways-to-distinguish.

If you want a specific proposal, here is one, but of course it's ad hoc and somewhat arbitrary. Imagine asking for each person in a country (1) how wealthy they are, (2) how smart they are, and (3) where a panel of 100 random people from that country would put them on a "masses/elite" scale, according to whatever principles they happen to prefer. Measure #1 and #2 as percentiles and #3 with a 0..100 scale. Take the average. There's your measure of eliteness.

(Wealth should be measured as NPV of assets plus expected income stream. You may notice that this handily puts that SF grad student somewhat higher than their negative net wealth might lead one to think.)
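For concreteness, here is a minimal sketch of how that ad hoc score could be computed. The function name, input names, and example numbers are purely illustrative assumptions, not part of the proposal beyond the "average of three 0..100 values" idea:

```python
# Hypothetical illustration of the proposed "eliteness" score: the average of
# three values on a 0-100 scale (wealth percentile, smarts percentile, and the
# mean rating from a panel of 100 random compatriots).
def eliteness_score(wealth_percentile: float, smarts_percentile: float,
                    panel_rating: float) -> float:
    """Wealth is meant as the percentile of (NPV of assets + expected income stream)."""
    return (wealth_percentile + smarts_percentile + panel_rating) / 3

# Example (invented numbers): a grad student with little current wealth but a high
# expected income stream, high measured smarts, and a middling panel rating.
print(eliteness_score(wealth_percentile=70, smarts_percentile=95, panel_rating=55))  # ~73.3
```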

your definition of populism assumes that the masses/elites distinction matters

Nope. I can apply that definition just as well if it turns out that the distinction doesn't matter at all. But, as it happens, I think it's clear that the masses/elites distinction, fuzzy and ambiguous as it is, does matter -- not least because there are a lot of influential politicians doing well for themselves by explicitly telling the "masses" how they're being screwed over by the "elites".

the implication that in this divergence the masses are "wrong" and the elites are "right"

There is no such implication. (Though there are some reasons to suspect that on questions that admit of actually determinable right and wrong answers, the "elites" will be right more often than the masses.)

populism -- espousing ideas appealing to the numerically dominant masses

But that is explicitly not how I define populism. Some ideas appeal to (many of) the numerically dominant masses and also to (many of) the elites, and those are not populist ideas. What makes an idea populist is its differential appeal to the masses, by comparison with the elites.

Every serious candidate espouses policies that are meant to appeal to a large fraction of the population. What makes a candidate a populist is that they don't mind when those policies look terrible to the elites. It is not necessary to be a populist in this sense to be a serious candidate.

Replies from: Lumifer
comment by Lumifer · 2017-02-25T01:29:07.119Z · LW(p) · GW(p)

I find it interesting that you've leapt so readily from "right-wing populism: is it even a thing at all?" to "of course there's left-wing populism as well as right-wing populism".

I started with right-wing populism because that was the subject mentioned in the preceding comment, but the accent here was on populism, not on right-wing. I'm not convinced that the left-wing populism is a thing in exactly the same way.

if you look at people who appeal to "masses" much more than to "elites" then I think you tend to get roughly the same people for a wide range of ways-to-distinguish.

I don't know about that. How about Jeremy Corbyn and UKIP? Same "masses"?

this handily puts that SF grad student somewhat higher than their negative net wealth might leave one to think

That actually depends on his profession. If he's going to get a degree in underwater basket weaving, I don't think making coffee at Starbucks pays that much.

I can apply that definition just as well if it turns out that the distinction doesn't matter at all.

Can you? If the distinction doesn't matter, there's no way to appeal to the masses and annoy the elites at the same time, since there are no distinctions that matter.

There is no such implication.

If there were no such implication, the word "populist" would not have derogatory and condescending overtones. And yet it does.

What makes an idea populist is its differential appeal to the masses, by comparison with the elites.

Didn't you say above that you can apply the definition even if the difference doesn't matter?

But let's see. Any sane politician-to-be would espouse popular ideas. So what makes positions populist is not that the masses like them, but that the elites dislike them?

Does this mean that "populism" is a label for views elites dislike?

As an aside, I find it hilariously ironic how the left wing nowadays defends the elites and denigrates the stupid masses :-D

Replies from: gjm
comment by gjm · 2017-02-26T02:23:01.842Z · LW(p) · GW(p)

I'm not convinced that the left-wing populism is a thing in exactly the same way.

I am not sure it's consistent to say both "of course there's left-wing populism" and "I'm not convinced that the left-wing populism is a thing", but never mind.

Jeremy Corbyn and UKIP

I think you misinterpreted me. Of course there are populists of different sorts. But who they are doesn't depend much on exactly how you define "populist". Corbyn and UKIP both appeal more to the less-educated than the more-educated, appeal more to the poor than to the rich, etc.

That actually depends on his profession.

For sure. I think the typical underwater basket-weaving student is less "elite" than the typical computer science student, precisely because the latter is likely to be well off in 10 years and the former isn't. (Well ... for that exact same reason, maybe it's the more-elite-to-start-with people who can afford to go into underwater basket weaving; I'm not sure.) But I think all you're pointing out here is that "elite"-ness isn't the exact same thing as wealth, which I already agreed with.

If the distinction doesn't matter, there's no way to appeal to the masses and annoy the elites at the same time

Either you're equivocating between two notions of "mattering" or I misunderstood you the first time. With the notion you're apparently using now, I think it's obvious that the masses/elites distinction "matters", and -- one man's modus ponens is another's modus tollens -- the fact that it's possible to identify "populists" and distinguish them pretty well from other politicians is good evidence for that.

So what makes positions populist is not that the masses like them, but that the elites dislike them?

No, it's both. A position disliked by both the masses and the elites is not populist. (A politician who espouses too many of those won't last long, but I expect it's possible to survive with one or two, and having them doesn't constitute populism.)

the left wing nowadays defends the elites and denigrates the stupid masses

Not all of it. You yourself drew attention to Jeremy Corbyn not so long ago, and in the US there's Bernie Sanders. And by and large all that's happening is the same thing that always happens whoever's in power: their opponents attack them with whatever stick is ready to hand. If it happens that the people in power are on the right, their opponents will mostly be on the left; if it happens that the people in power can credibly be accused of stupidity and pandering and whatnot, their opponents will make those accusations.

If you mean there's some particular inconsistency when lefties denigrate "the masses", I'm not really seeing it. Sure, left-wing politicians tend to claim to be the defenders of the Ordinary People. But so do right-wing politicians; as you've pointed out more than once in this discussion, politicians of all stripes want to be popular. So I'm not sure why complaining about "the stupid masses" is any more inconsistent with leftists' avowed principles than with rightists'.

Replies from: Lumifer
comment by Lumifer · 2017-02-27T15:42:57.568Z · LW(p) · GW(p)

I am not sure it's consistent

The concept of populism certainly exists in the map. People talk about it, they point fingers at things and say "this is {right|left}-wing populism". My issue is whether this concept found a good joint in the territory to carve along.

For example, consider how you can treat a shadow as a thing in itself, or you can treat it merely as absence of light.

I think the typical underwater basket-weaving student is less "elite" than the typical computer science student

Well, yes, you do. I suspect the underwater basket-weaving student doesn't. Especially if his basket-weaving is called something like "How Picking Your Nose Is A Transformative Genderqueer Activity That Subverts Patriarchy-Imposed Rules".

the fact that it's possible to identify "populists" and distinguish them pretty well from other politicians

But that is exactly what I contest. See the map/territory distinction above.

I'm not sure why complaining about "the stupid masses" is any more inconsistent with leftists' avowed principles than with rightists'.

Because liberté, égalité, fraternité is a left-wing, not a right-wing motto.

Replies from: gjm
comment by gjm · 2017-02-28T12:24:00.642Z · LW(p) · GW(p)

My issue is whether this concept found a good joint in the territory [...] consider how you can treat a shadow [...]

Ah. That would have been clearer to me if you'd found a different way of expressing your concern than by saying 'I've heard a lot about that "rise of right-wing populism", but remain unconvinced that it actually exists.' I mean, if you want to say -- as you should! -- that shadows are better understood as regions where there's less light because of occlusion rather than as separate objects in their own right, I don't think "Shadows don't exist" is the best way to say it. (But it probably works better than doing the same for "right-wing populism" because it's more obvious that shadows at-least-kinda-exist, and therefore your audience is more likely to think "hmm, he probably means something cleverer than what he seems to".)

Well, yes, you do. I suspect the underwater basket-weaving student doesn't.

Quite possibly. Again, I'm not claiming (and so far as I can tell no one is claiming) that there is a single metric for "eliteness" that everyone completely agrees on, any more than there is for right-wing-ness or intelligence or good musical taste -- so I'm not sure how what you say is meant to be a counterargument to anything I've been saying. (It is meant to be a counterargument, right?)

But that is exactly what I contest.

Thought experiment: We pick 100 people from (let's say) the population of adult Americans and anglophone[1] Europeans with an IQ of at least 100 as measured by some arbitrarily chosen standard test. We give them all a brief description of populism, but attach some other name to it in the hope of avoiding pre-existing associations. The description does not name specific politicians or movements. Then we show each of these people a list of the 10 most prominent politicians in several Western countries, and give them a summary of their policy positions and a sample of their speeches (with translations as appropriate). We ask each person to say for each politician how well they fit the profile of Sneetchism or whatever we say instead of "populism".

[1] Anglophone so that we can give them all the same description. We could equally well pick just francophones or something, but there are more anglophones and I wanted to be able to include reasonably representative Americans too.

I claim that if we did that, we would find (1) good agreement between experimental subjects about how to score each politician and (2) good agreement between typical scores given by our subjects and actual general perceptions of politicians as "populists". Our subjects will rate Donald Trump and Bernie Sanders as more Sneetchy than Hillary Clinton and Jeb Bush. They will rate Nigel Farage and Jeremy Corbyn as more Sneetchy than Theresa May and Gordon Brown.

(These two are what I mean by saying that it's possible to identify "populists" and distinguish them from everyone else. Note that that's not the same as saying that doing so is actually a good idea; as I said before, one can apply the distinction even if it doesn't "matter".)

I claim, further, that our subjects' "populism" scores will not be the same as their own approval scores (e.g., both Trump-lovers and Trump-haters will agree that Trump is highly "populist") nor the same as the politicians' actual popularity (e.g., Trump is one of the most "populist" recent US presidents but one of the least popular as measured by vote fraction or approval ratings).

Which of these things, if any, do you disagree with?

Because liberté, égalité, fraternité is a left-wing, not a right-wing motto.

I'm not sure it's exactly either left-wing or right-wing. In any case, what's relevant is surely not whether one particular slogan that mentions equality is a left-wing one, but whether the left makes much more claim than the right does to identify with The Ordinary People. I don't think it does, I've said why I don't think it does, and quoting a single allegedly leftist slogan that talks about equality is in no way responsive to my reasons.

Replies from: Lumifer
comment by Lumifer · 2017-02-28T19:51:40.712Z · LW(p) · GW(p)

Thought experiment: ... Which of these things, if any, do you disagree with?

I don't know. First, the setup isn't specific enough (e.g. an awful lot depends on the formulation of your "brief description of populism"), and second I don't know. I think the outcome of this poll can go this way or that way or sideways or make a nice pirouette or something else. I am surprised you feel so certain about the outcome.

whether the left makes much more claim than the right does to identify with The Ordinary People. I don't think it does

I think it does. I think the history of the two movements makes it very clear. And notice how the left has a habit of accusing the right of being in the pocket of the rich and the powerful (in other words, elites), but the right does not accuse the left of this.

Replies from: gjm
comment by gjm · 2017-02-28T20:43:03.717Z · LW(p) · GW(p)

the setup isn't specific enough (e.g., an awful lot depends on the formulation of your "brief description of populism")

I'd have thought the rest of the thread would give a pretty good idea of the sort of thing I would put in a brief description of populism.

I think the outcome of this poll can go this way or that way

OK, so that in fact is a thing we disagree about: I would be extremely surprised to see much disagreement, and you wouldn't. I wonder why.

I think the history of the two movements makes it very clear.

I'd probably agree with you about the left and right of, say, 20 years ago. But political movements change.

the right does not accuse the left of this

What, never? Well, hardly ever.

Replies from: Lumifer
comment by Lumifer · 2017-02-28T22:00:34.149Z · LW(p) · GW(p)

a pretty good idea of the sort of thing

So you want me to guess how you would set up this hypothetical and then guess again what would be the outcome?

Is the idea that we can argue afterwards about what the outcome of that imaginary situation could be, using words like "realistic"? X-D

What, never? Well, hardly ever.

Individual politicians, sure. That's just the accusation of being a traitor to the "masses" or, alternatively, of being a LINO (Left In Name Only, see RINO and DINO). Plus, of course, you just throw all the mud you have and see what sticks :-/

But I don't recall many accusations of Democrats (or Labour) as a political movement of being just a front for the elites, other than from the certifiably extreme left.

Replies from: gjm
comment by gjm · 2017-03-01T02:19:29.464Z · LW(p) · GW(p)

So you want me to guess how you would set up this hypothetical and then guess again what would be the outcome?

Not if you think the answer is highly sensitive to the details of how you guess. I don't think it is, but evidently you do.

(Which is, as I said two comments upthread, an answer to my question of which of my guesses about the outcome of this hypothetical experiment you disagreed with.)

Individual politicians, sure. [...] But I don't recall many accusations of Democrats (or Labour) as a political movement of being just a front for the elites

Ah, I see. Yup, I'll agree that that accusation is sometimes made by lefties about the Right as a whole (or at least big chunks of it) and very rarely if at all by righties about the Left as a whole (or big chunks of it). I'm not sure this makes for an actual compelling argument -- the context, recall, was whether it's more unreasonable for lefties than for righties to complain that their political opponents are pandering to the masses instead of listening to the wisdom of the elites. Remember (not that I expect you need reminding) that "the elites" and "the rich" are not the same thing. The people some lefties are accusing some righties of not listening to are not really the same people as those some lefties are accusing some righties of being in the pockets of.

(Note, by the way, that if someone on the right accuses someone on the left -- or indeed anyone at all -- of being 'a traitor to the "masses"', then unless they're just saying "of course I have no problem with that, but you guys should" they are in fact claiming to speak for those "masses".)

Replies from: Lumifer
comment by Lumifer · 2017-03-01T16:08:26.219Z · LW(p) · GW(p)

By the way, you said that the left and the right changed over the past 20 years so that there is no "by default" association of the left with the masses any more. Why do you think so? It sounds like an unusual position to me.

Replies from: gjm
comment by gjm · 2017-03-01T16:44:23.214Z · LW(p) · GW(p)

I think the political right tries harder than it used to to appeal to those masses. The left says to the masses "Unlike those bastards on the right, we are going to look out for your economic interests". The right says to the masses "Unlike those perverts on the left, we are going to respect your values". (Of course I am caricaturing in both cases, and of course both sides say a little of both those things, and of course the Left/Right dichotomy is a simplification, yadda yadda yadda.)

So, if Team Blue claims to act in the interests of the masses and Team Red claims to share the values of the masses, which one is acting more hilariously/hypocritically if it criticizes its opponents for ignoring the opinions of the elites?

Replies from: Lumifer
comment by Lumifer · 2017-03-01T17:08:29.604Z · LW(p) · GW(p)

Well, people like Thatcher or Reagan were popular -- notably, with the masses -- and they predate the shift that you are talking about.

In the US context that would imply that during the Clinton years the Republicans decided they needed to "appeal to those masses" and the result was the success of Bush Jr. That doesn't look terribly persuasive to me -- Bush wasn't that appealing to the lower classes. The rightist rants for family values and against the degeneracy of the left were also pretty standard fare for more than a couple of decades.

In the UK context this means that after Tony Blair came to power the Tories decided they need more mass appeal and again, I don't see much evidence for this suggestion. Just like Bush, Cameron was a fairly standard conservative leader.

comment by ChristianKl · 2017-02-23T21:45:21.584Z · LW(p) · GW(p)

I do prefer Bernie Sanders over Hillary Clinton, and at the same time I would label Bernie a populist while I wouldn't label Clinton a populist.

The opposite of being populist is being elitist.

comment by skeptical_lurker · 2017-02-22T13:28:26.291Z · LW(p) · GW(p)

The things I previously mentioned such as "Or that there is >50% probability that Brexit will literally lead to a neo nazi state in the UK within 10 years?" are mostly positions expressed by friends. The group this person joined was advocating violent communist revolution and the murder of enemies of the people (as in, it was an explicitly communist group, not an anti-Trump group that had been hijacked by communists), and so cannot be seen as a reaction to Trump or Brexit.

But, in the more general case, there are a lot of people, a lot of centrists, who are opposed to Trump/Brexit. So people do not need to join forces with extremists to fight them.

(It's not as if the rhetoric of the less-pleasant parts of the political right is any nicer or more sensible than that of the less-pleasant parts of the political left. Intelligent educated friendly right-leaning folk can find themselves with some regrettable -- dare I say deplorable? -- bedfellows too.)

I agree with that, but I think that there is a difference in behaviour due to the fact that the left has been winning in all areas with the possible exception of economics for the last 50 years or more, but suddenly there have been some unexpected rightist victories. Firstly, this means that the left expects to be pushing back the right, and there is a general assumption that, for instance, rightists must disavow and sever all ties with white nationalists but the left can freely associate with extremists.

Secondly, given that the right has suddenly managed to win some victories, might the previous constant leftward march of history change, at least in some areas? In the same way that feminism and gay rights have made constant progress for the last 50 years, might nationalism make constant progress for the next 50 years?

I don't know how much of the left are considering that as a possibility, but I can understand that they might be terrified and lashing out while they still have the ability to.

So yes, the right are not more sensible or nicer in general, it's just that right now the left have a greater ability to justify violence. If that changes, then we might live in interesting times.

Replies from: Viliam, gjm, Lumifer
comment by Viliam · 2017-02-22T17:14:36.142Z · LW(p) · GW(p)

Does it make a difference if instead of talking about "left" and "right" we focus on specific agendas?

For example, if "left" includes both "gay rights" and "killing the kulaks", then it may sound scary for a left-leaning person to say "we had 50 years of the left progress, but now we will have 50 years of the right progress", but less scary if you translate it to e.g. "we have 50 years of gay rights, but kulaks are not going to be killed at least during the next 50 years".

Yeah, this is too optimistic; I am just saying that perhaps focusing on the details may change the perspective. Maybe the historically most important outcome of the "50 years of right progress" will be e.g. banning child genital mutilation, honor killings, and similar issues which the current left is not going to touch with a ten-foot pole (because they would involve criticizing the cultural habits of other cultures, which is a taboo for the left, but the right would enjoy doing this).

I guess my point is that imagining the "right" only clicking the Undo button during the following 50 years is unnecessarily narrowing their scope of possible action. (Just like the "left" also had other things to do, besides killing the kulaks.)

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-22T18:07:18.749Z · LW(p) · GW(p)

I think people cluster into left and right because those are the tribes. However, it can be overly simplistic, and I agree that there are many potential directions left and right progress can take - indeed, if a few more Islamic terrorists shoot up gay bars there could be a lot of LGBT people defecting to right-nationalism.

Replies from: Viliam
comment by Viliam · 2017-02-23T09:55:13.657Z · LW(p) · GW(p)

I think people cluster into left and right because those are the tribes.

Some people join the tribes because they are connected with the causes they support, but I think most people are there simply because of the other people who are there. When all your friends are X, there is a strong pressure on you to become X, too. And when people who enjoy hurting you are X, you are likely to become Y, if Y seems like the only force able to oppose X. It's like having a monkey tribe split into two subgroups; of course it makes sense to join the subgroup with your friends rather than the subgroup with your enemies. And the next step is making up the story why all good people are in your team, and all bad people are in the other team -- this signals that you have no significant conflicts in your team, and no significant friends in the other team, so you are a loyal member.

But then also words have consequences, so if your team's banner says e.g. that you should burn the witches, then sooner or later some witches are likely to get burned. Even if most people in the team are actually not happy about burning the witches, and joined merely because their friends are there. Sometimes people agree that those words about "burning witches" were meant metaphorically, not literally; but there is a certain fragility about that, because someone is likely to decide that literally burning a witch will make an even stronger signal of their loyalty to the tribe.

if a few more Islamic terrorists shoot up gay bars there could be a lot of LGBT people defecting to right-nationalism

It makes me sad that the popular political positions seem to be either nationalism or cultural relativism. Is there these days even a significant pro-"Western civilization" side? I mean a side that would say that as long as you follow the rules of civilized life, your language and color of skin don't matter, but if you as much as publicly talk positively about genital mutilation or "honor" killing, no one is going to give a fuck about your cultural or religious sensitivity, you are going to be called evil.

comment by gjm · 2017-02-22T15:26:41.939Z · LW(p) · GW(p)

Well, if this person is joining an explicitly and specifically violent communist group, then I guess that indicates that this particular person is sympathetic with violent communism. That's too bad, but it's also pretty unusual and I'd classify it as "this person is broken" rather than "politics is broken" unless what you're seeing is lots of otherwise sensible people joining explicitly violent explicitly communist groups. In that case, either we've got a general resurgence of violent communism (which would be alarming) or there's something unusual about your friends (which would be interesting but not necessarily alarming).

I think you're right that the last several decades have been pretty good for progressive social causes, and that this seems like it might be changing, and that this might lead to more violence from leftists. My guess is that serious politically-motivated violence will remain rare enough that you don't actually need to worry about it unless for some reason you're a specific target, and ineffectual enough that you don't need to worry that it will have much impact beyond the violence itself.

What's there been historically? Occasional riots (usually left) and demonstrations-turned-violent (usually left, though arguably when there's been violence it's been as much due to provocation from the police as to actual violent intent by the protestors). Occasional acts of terrorism (usually right, but occasionally kinda-left as with Kaczynski). All these things are really rare, which is why they make the news, which is why it's easy to get worried about them :-). And they very rarely have any actual influence on what anyone else does.

The single most worrying political-violence-related outcome (to me) is that someone commits some act of violence and the administration uses that as a pretext for major gutting of civil liberties or something of the kind. The historical precedent I'd rather not be using explicitly is of course the Reichstag fire. [EDITED to add:] I mean specifically in the US; elsewhere in the allegedly civilized world I don't think that sort of thing is so likely.

Replies from: Lumifer, skeptical_lurker
comment by Lumifer · 2017-02-23T01:28:06.339Z · LW(p) · GW(p)

My guess is that serious politically-motivated violence will remain rare enough that you don't actually need to worry about it

That's the hope, right? We are living in a civilized society, etc. etc. There isn't going to be a repeat of The Troubles, is there? No empire will collapse with a big bang, no mobs will torch the neighbourhoods, no lists of undesirables will be circulated...

What's there been historically?

Historically? During the XX century? Being on the wrong side you stood a good chance of being killed. Sent to a gulag or a concentration camp, maybe.

Replies from: gjm
comment by gjm · 2017-02-23T03:05:25.454Z · LW(p) · GW(p)

That's the hope, right?

It is something I hope, and also something I guess is probably true. Of course I could turn out to be wrong.

Being on the wrong side you stood a good chance of being killed. Sent to a gulag or a concentration camp, maybe.

Oh, for sure. But that's an entirely different failure mode from the ones skeptical_lurker appears to be concerned with.

comment by skeptical_lurker · 2017-02-22T17:47:41.893Z · LW(p) · GW(p)

I think communist beliefs, violent or not, are on the rise largely due to young angry people being too young to remember the cold war. Some friends and acquaintances from multiple disconnected friendship groups are communists, and too many of these advocate violence, although I think that they are still a tiny minority overall. I think the situation is, as you put it, "this person is broken".

I'm not at all worried about actually being the victim of politically-motivated physical violence or of riots/revolutions etc in the near future. What worries me is general political polarisation leading to a situation where blue and red tribes hate each other and cannot interact, where politics is reduced to seeing who can shout 'racist' or 'cuck' loudest. My political beliefs have become increasingly right-wing, in a classically liberal sense as opposed to fascist, and it alienates me when friends advocate burning someone's house down because they hold beliefs which are actually similar to, perhaps even left of, mine. I'm not worried about them actually burning my house down; it's just alienating on principle, and for fear of social exclusion.

WRT historical periods of political instability, I agree that such periods are infrequent, and given that we have seen the results of both Nazism and communism, I think it unlikely that those ideologies will gain power. But OTOH we are going to see certain events that are totally unprecedented in history, largely because of technology. We are already seeing levels of migration that I think exceed anything in the past (due to better transport), which is leading to a rise in nationalism, and soon it is possible that we will see far more disruptive technologies such as human genetic engineering, large numbers of jobs being automated away, mass automated surveillance, and finally FAI. If safely navigating the problems these technologies pose requires a partially political solution, then we need sane politics. And yet political discourse has sunk to the point where political candidates are debating the size of their 'hands' and whether frogs are racist. Obama's advisor seemed to think that the danger of AGI is that it might be programmed by white male autists.

We do not have the level of political sanity necessary to deal with disruptive technologies and it's getting worse. Nick Bostrom thinks that genetically engineered IQ boosts of 100 points+ in a single generation might be possible, and soon. Nazism and communism are unlikely now, but how would society react to human genetic engineering? Many would try to ban it. Some would try to tax it. Countries where it was illegal might suffer massively reduced economic growth compared to those where it was allowed. Inequality might skyrocket. I'm not trying to suggest that we will specifically end up with 'Gattaca' or 'Deus Ex: Mankind Divided' or any of the other specific science fiction explorations of these possibilities; I'm saying that I don't know what will happen, and political extremism/violence is certainly a possibility, and it doesn't help if extremism is increasing anyway!

Replies from: Lumifer, gjm
comment by Lumifer · 2017-02-23T01:29:50.866Z · LW(p) · GW(p)

We do not have the level of political sanity necessary to deal with disruptive technologies

We never had it, and yet here we all are.

Replies from: satt
comment by satt · 2017-02-25T15:51:15.856Z · LW(p) · GW(p)

Dat anthropic bias tho!

Replies from: Lumifer
comment by Lumifer · 2017-02-25T17:15:52.454Z · LW(p) · GW(p)

Good point.

comment by gjm · 2017-02-22T22:47:33.484Z · LW(p) · GW(p)

My political beliefs have become increasingly right-wing, in a classically liberal sense as opposed to fascist, and it alienates me when friends advocate burning someone's house down because they hold beliefs which are actually similar, perhaps even left of, mine.

The impression I have -- though of course I don't know what your friends have been saying -- is that the burn-their-houses-down brigade are much more upset about the kinda-fascist sort of right than the kinda-libertarian sort of right. Of course even if I'm right about that that doesn't necessarily reduce the sense of alienation; your aliefs needn't match your beliefs.

We do not have the level of political sanity necessary to deal with disruptive technologies and it's getting worse.

Agree about first half; not fully convinced about second half. As you pointed out yourself, it's not that long ago that we had actual Nazis and Stalinists in power in Europe, and bad though early-21st-century politics is it doesn't seem like it's got there just yet. People have said horrible things about Donald Trump and Hillary Clinton, but ten years ago they were saying similarly horrible things about George W Bush and, er, Hillary Clinton. But yeah, I can't see our existing political institutions coping very well with immortality or super-effective genetic engineering or superintelligent AI, should those happen to come along.

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-23T00:25:25.871Z · LW(p) · GW(p)

The impression I have -- though of course I don't know what your friends have been saying -- is that the burn-their-houses-down brigade are much more upset about the kinda-fascist sort of right than the kinda-libertarian sort of right. Of course even if I'm right about that that doesn't necessarily reduce the sense of alienation; your aliefs needn't match your beliefs.

Except that I don't think libertarian is incompatible with border controls - indeed, libertarians are generally enthusiastic about property rights, and controlling immigration is no different to locking your front door and vetting potential housemates.

I'm not saying that the border controls should be based around skin colour, but the definition of 'Nazi' seems to have expanded to anyone who believes in any form of border control.

Agree about first half; not fully convinced about second half. As you pointed out yourself, it's not that long ago that we had actual Nazis and Stalinists in power in Europe, and bad though early-21st-century politics is it doesn't seem like it's got there just yet.

I certainly agree that globally it's not as bad as 1930-1990. Nevertheless, things seem to have got dramatically worse in the last decade - in my personal experience it used to be that people could agree to disagree, but now most political opinions seem to be in lockstep, almost like a cult. More generally, I remember people criticising Bush, but now there are very intelligent people, even the head of CFAR, saying that Trump could be the end of democracy. Either they are correct, in which case that is obviously a cause for concern, or they are wrong and a lot of very smart people, including rationalists, are utterly mindkilled.

Replies from: gjm, metatroll, bogus
comment by gjm · 2017-02-23T03:02:55.336Z · LW(p) · GW(p)

the definition of 'Nazi' seems to have expanded to anyone who believes in any form of border control.

For what it's worth, I haven't seen the word used that way. But -- the standard disclaimer -- my left-leaning Facebook friends are not your left-leaning Facebook friends, unless there's some purely coincidental overlap, and yours may be more Nazi-accusation-happy than mine.

Either they are correct, in which case that is obviously a cause for concern, or they are wrong and a lot of very smart people [...] are utterly mindkilled.

Or both, of course :-). More seriously, I think your observations are adequately explained by the hypothesis that (1) Trump and his administration are much more unusual than Bush and his administration, (2) they are in fact distinctly more likely than Bush was (though still not very likely) to do serious damage to the US's democratic institutions, and (3) a lot of very smart people are somewhat mindkilled. I think #1 is obviously true, #2 is probably true, and #3 would be entirely unsurprising (much less surprising than all those people being utterly mindkilled).

Incidentally, I do remember some not-otherwise-obviously-crazy people speculating that Bush would simply refuse to leave office after 8 years and that somehow the Republican-controlled Congress would help make it so. So end-of-democracy hysteria isn't so very new.

Replies from: bogus
comment by bogus · 2017-02-23T03:15:21.500Z · LW(p) · GW(p)

speculating that Bush would simply refuse to leave office after 8 years

You mean like FDR actually did? Except that he wasn't a Republican.

Replies from: gjm
comment by gjm · 2017-02-23T11:56:50.752Z · LW(p) · GW(p)

Er, FDR was elected for a third term before there were term limits for US presidents. The (stupid and not widespread) speculation was that GWB would cling to power by some means less legitimate than that.

comment by metatroll · 2017-02-23T02:06:11.887Z · LW(p) · GW(p)

Putin's a mindkiller.

comment by bogus · 2017-02-23T03:06:35.457Z · LW(p) · GW(p)

I remember people criticising Bush, but now there are very intelligent people, even the head of CFAR, saying that Trump could be the end of democracy.

Trump is the end of democracy-as-we-know-it, and both sides of the political spectrum agree that this is the case, albeit for very different reasons. But the United States were never founded as a democracy in the first place; they're supposed to be a federated republic, with plenty of checks-and-balances as an integral part of the overall arrangement. If our Constitution is worth more than the paper it's printed on, we'll find ourselves right back in what used to be the status quo.

comment by Lumifer · 2017-02-23T01:20:33.117Z · LW(p) · GW(p)

are mostly positions expressed by friends

You should find better friends.

but I can understand that they might be terrified and lashing out while they still have the ability to.

Terrified. What exactly are they terrified of? That their favourite political positions are not going to be held by people in power? In the West that's hardly grounds for terror.

comment by satt · 2017-02-22T00:14:44.065Z · LW(p) · GW(p)

It makes me paranoid and alienated if people I know join facebook groups that advocate political violence/murder/killing all the kulaks, although to be fair it's possible that those people have only read one or two posts and missed the violent ones.

Does it help to disaggregate "political violence", political "murder", and "killing all the kulaks"? I'm happy with some instances of political violence, and even some political murders are defensible. The assassination of Jonas Savimbi pretty much ended Angola's 26-year civil war, for example. To quote Madeleine Albright: worth it.

If the people you know are thumbs-upping literally "kill all the kulaks" (and maybe they are! I'm sure I've seen that kind of stuff in YouTube comments and Stalinist tweets, so it is out there), I can understand your reaction. But if people are merely affirming that some political violence is worthy of support...well, I'd have to say that I agree!

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-22T13:00:28.247Z · LW(p) · GW(p)

Well, in one comment a friend was advocating violence against perhaps the most right wing 10-15% of the population.

Replies from: satt
comment by satt · 2017-02-22T19:29:03.690Z · LW(p) · GW(p)

!

That clarifies things somewhat.

Replies from: skeptical_lurker
comment by skeptical_lurker · 2017-02-22T20:53:48.600Z · LW(p) · GW(p)

You see, it's one thing to advocate violence against a literal Neo-Nazi, but advocating violence against anyone who advocates reducing immigration, well, that shows a lot more liberal tribe loyalty. So much holier than thou.

Additionally, this comment was made IRL, possibly within earshot of a person they were advocating violence against.

comment by Gram_Stone · 2017-02-21T22:57:57.520Z · LW(p) · GW(p)

Why do you mourn when you can contemplate politics no more? What makes you think about it so much in the first place? That just seems like something you wouldn't want to ignore.

comment by skeptical_lurker · 2017-02-22T12:58:33.524Z · LW(p) · GW(p)

No, but I'm not under the illusion that I can currently make any significant contribution to changing politics - it's certainly not my area of comparative advantage, but I could at least leave the country if things did start to get that bad. There would be fairly obvious warning signs that would not require a close watch on current events.

comment by gjm · 2017-02-24T15:56:40.742Z · LW(p) · GW(p)

Were there ever really

"experts" who were supposed to screen newcomers for those who wouldn't follow the rules of civilized life

?

why that particular criterion?

I don't see a particular criterion in Viliam's comment; I see a couple of examples of things that we might want not to tolerate.

aren't you throwing out another western value, free speech, with it?

Doesn't look like it to me. Viliam says: if you talk positively about X and Y, you will be called evil. That's not at all the same as saying you're forbidden to talk positively about X and Y. The bargain of free speech has always been this: you're allowed to say "yay for Hitler" or "yay for Stalin" or whatever, and everyone else who hears this is allowed to say "you're an idiot or an asshole".

Replies from: Viliam
comment by Viliam · 2017-02-24T16:34:01.008Z · LW(p) · GW(p)

Yes, there should be a gap between "what is legal" and "what is socially approved", and promoting uncivilized ways of life should be in that gap. Enough free speech to allow it, enough common sense to disapprove of it socially.

Some people will always enjoy walking exactly on the line; making the legal line the same as the socially approved line makes things worse. If they are separated, then if someone walks exactly on the legal line, flip a coin, and either put them in jail or not, but no one is going to complain about the jail if that happens to be the outcome. And if someone walks exactly on the decency line, flip a coin, and either stop inviting them for dinner or don't, but either way the law is not involved. It's just that when the two lines happen to be the same, you have to flip two coins at the same time, and sometimes put someone in jail for behavior that is perceived as okay.

Replies from: Lumifer
comment by Lumifer · 2017-02-24T16:59:37.459Z · LW(p) · GW(p)

Yes, there should be a gap between "what is legal" and "what is socially approved", and promoting uncivilized ways of life should be in that gap.

The problem is that "socially approved" is a function of the society and there are a lot of those. "Socially approved" in Black Rock City means something very different from "socially approved" in Salt Lake City.

Replies from: Viliam
comment by Viliam · 2017-02-27T11:58:14.894Z · LW(p) · GW(p)

I guess the mass media used to synchronize the society at least approximately (what behavior is portrayed as "socially approved" in the soap operas), but that mechanism may be dead these days.

Replies from: Lumifer
comment by Lumifer · 2017-02-27T15:35:33.638Z · LW(p) · GW(p)

Interesting. This implies that there is/was a historically short period of time when mass media was able to sync up most everyone. Before that time societies were stratified (e.g. by classes, see feudalism, each with quite different "socially approved" standards) and after that time societies are re-fragmenting into small pieces/bubbles.

Replies from: Viliam
comment by Viliam · 2017-02-28T15:52:42.796Z · LW(p) · GW(p)

The ultimate Schelling point of human culture -- Hollywood.

Replies from: Lumifer
comment by Lumifer · 2017-02-28T19:55:27.861Z · LW(p) · GW(p)

A temporary Schelling point.

The ultimates, being biologically hardwired, tend to stay the same: food, warmth, sex, company.

comment by [deleted] · 2017-02-22T19:49:51.391Z · LW(p) · GW(p)

Imagine that a completely trustworthy person who knows all your beliefs has acquired information that will "radically alter your worldview." No further details of the information are given. How much would you pay for it? [pollid:1198]

Replies from: Dagon, drethelin, entirelyuseless
comment by Dagon · 2017-02-22T23:41:11.251Z · LW(p) · GW(p)

Unpack "trustworthy" - does this mean the person isn't going to tell falsehoods, but may not actually understand how truth works? Or is this more like Omega - has special access to data?

Replies from: None, ChristianKl
comment by [deleted] · 2017-02-23T12:04:55.575Z · LW(p) · GW(p)

The person doesn't tell lies and you trust his/her intelligence and access to information.

Replies from: Dagon
comment by Dagon · 2017-02-23T14:36:05.824Z · LW(p) · GW(p)

But otherwise, the person has non-exceptional access to and discernment of truth? So it's likely that anything truly unusual he believes is wrong. I don't think Bayes will let me update all that far from "whatever he says is filtered through an engine not optimized for truth." Anything that he thinks will "radically alter my worldview" is likely an illusion or something I already have some evidence for.

This changes in cases where I think the person DOES have better-than-average access to truth.

Also, the fact that he's offering to sell me information that will change my worldview very much works against my likelihood to believe what he says.

Replies from: None
comment by [deleted] · 2017-02-23T16:36:57.590Z · LW(p) · GW(p)

You are fighting the hypothetical.

A person has true information that will "radically alter your worldview". Assume you believe him/her. How much would you pay for the information?

Replies from: Luke_A_Somers, Dagon
comment by Luke_A_Somers · 2017-02-27T20:15:13.317Z · LW(p) · GW(p)

Seems more like trying to clarify the hypothetical. There's a genuine dependency here.

comment by Dagon · 2017-02-24T03:15:01.967Z · LW(p) · GW(p)

You are fighting the hypothetical.

Yeah, I tend to do that. However, this is the first that you've asserted that it's true information, which is an important clarification. I'm willing to pay a significant amount for true information that will let me make a large update (which is how I interpret "radically alter worldview").

comment by ChristianKl · 2017-02-23T07:48:14.808Z · LW(p) · GW(p)

I read it as a person who generally has a good track record and who has built a reputation for being right when making these kinds of claims.

Maybe someone who already has done this intervention a few times and who uses the principles of http://lesswrong.com/r/discussion/lw/oe0/predictionbased_medicine_pbm/ and can tell you that with 90% credence you will afterwards say that he radically changed your mind.

comment by drethelin · 2017-02-24T21:25:44.140Z · LW(p) · GW(p)

If all the parts of this hold true, then the person knows me well enough to know how important it would be to me and to the world to change my worldview. If they're not already telling me without payment, I can conclude that it wouldn't have much practical impact and would be something like "The Earth is a Simulation but we don't know anything about how it works beyond physics or who made it, but the proof is convincing." Given that, I would probably pay a small amount out of curiosity but not more.

Replies from: ChristianKl
comment by ChristianKl · 2017-02-27T09:00:26.277Z · LW(p) · GW(p)

Sharing the information might have a cost for the other person that leads to it not being shared without payment.

There's also the element that you take information a lot more seriously when you paid money for it.

comment by entirelyuseless · 2017-02-24T05:37:12.008Z · LW(p) · GW(p)

If I know that what they are saying is true, I will already radically alter my worldview, by dividing up my probability estimate among the alternate possibilities that I think are most likely to be true.

comment by gjm · 2017-02-23T11:59:35.121Z · LW(p) · GW(p)

Perhaps we're at cross purposes. I didn't call it "hypothetical violence" because I think no one on the left has ever been violent for political reasons, any more than I talked about a "hypothetical person" because I don't think persons are real or talked about "hypothetical leftist content" because I don't think there's any left-wing stuff on Facebook. I called it "hypothetical violence" because this is a purely hypothetical scenario and therefore questions about what will or would happen don't have definite answers. ("Hypothetical" does not mean "nothing matching this description has ever happened".)

comment by ChristianKl · 2017-02-23T07:50:55.002Z · LW(p) · GW(p)

Not necessarily. Sunnis might believe that siding with other Sunnis is a good idea because they expect to get treated better than Shia. Signaling tribal loyalty might be more central than anything substantive about Islam.

comment by ChristianKl · 2017-02-22T23:11:03.054Z · LW(p) · GW(p)

To be fair to the people arguing against this, I suspect they're using a somewhat non-standard definition of "motivated by".

You can tell a story about how the old generals of the Iraqi army were out of work and wanted to regain political power and used the banner of Islam as a tool. Not because they are honest believers but because it was the best move to gain political power.

That story uses the standard definition of "motivated by" but I don't think it's a full representation of what happened.

comment by gjm · 2017-02-22T15:46:58.775Z · LW(p) · GW(p)

This is a useful general prescription against irrationality: if a belief is supported by reason and evidence then you should be able to say what evidence would make you revise it. But it's worth noting that sometimes a belief may be reasonable even though it's really hard to imagine remotely plausible evidence that would change your mind about it. What would Donald Trump have to do that would make you think he's a progressive internationalist who favours open borders and free trade? What would ISIS have to do to convince you that they are primarily an organization dedicated to fostering peace and cooperation among people of different religions?

Clearly it's not actually unreasonable to think that Donald Trump isn't keen on open borders and free trade, or that ISIS aren't particularly into peace and cooperation. But the question you should be able to answer to justify a claim that you believe those things rationally is, I suggest, not so much "what evidence would change your mind?" but "what different evidence would have led to a different conclusion?". If Donald Trump had campaigned on promises to lower tariffs and offer amnesties to illegal immigrants, or if ISIS gave out pamphlets about peace and love and charity instead of blowing things up, I'd have different opinions about them. (Though I'd probably still mistrust both.)

So if someone can't tell you what Donald Trump could do to convince them he's a pathological liar, rather than writing them off you might instead ask them "well, then what could he have done differently that would have led you to think of him that way?".

(Where you think they're not only wrong but obviously wrong, you might reasonably take the view that anyone who thinks the evidence is so one-sided that it would take an impossible amount of future evidence to change their mind is ipso facto probably nuts. So you might write them off after all, if you think no remotely reasonable person could think the evidence overwhelmingly favours Trump not being a pathological liar.)

Replies from: chaosmage, ChristianKl
comment by chaosmage · 2017-02-23T10:36:31.556Z · LW(p) · GW(p)

you might reasonably take the view that anyone who thinks the evidence is so one-sided that it would take an impossible amount of future evidence to change their mind is ipso facto probably nuts. So you might write them off after all

I agree that would be the sensible response, but I'm curious for ways to engage with people who see the world radically differently.

An ability to build particularly long bridges of consensus across particularly wide chasms of preconceptions could do the world a lot of good, if it is a learnable and teachable skill.

comment by ChristianKl · 2017-02-22T16:27:56.573Z · LW(p) · GW(p)

Donald Trump could increase the number of green cards that the US hands out to skilled workers. Nixon went to China, and if Trump acts in a way that actually furthers immigration and reduces the total tariff burden, I'm open to accepting him as a progressive internationalist.

Replies from: gjm
comment by gjm · 2017-02-22T22:37:35.702Z · LW(p) · GW(p)

Personally I'd want more evidence than that. I think Trump-as-progressive-internationalist is more or less on the borderline of things that it's not crazy to imagine finding sufficient evidence to believe. (ISIS-as-peaceful-organization is well beyond it -- I guess the most plausible way to get such evidence would be for it to turn out that we in the west have been systematically deceived about ISIS to such an extent that our ideas about it are almost entirely wrong, but I think I'd describe that situation as "turns out ISIS, as we believed in it, was a made-up organization, and there happens to be another entirely different one that shares its name" rather than "ISIS turns out to be peaceful".)

comment by gjm · 2017-02-22T11:10:14.414Z · LW(p) · GW(p)

will he be willing to criticize that stuff when it actually gets violent? Remember, it can be dangerous [...]

Dunno. The obvious guess would be "not willing to do it in public with the violent people watching, willing to do it when safe from reprisals", which coincidentally is more or less exactly what I would guess if he weren't a member of our hypothetical unkind-turning-violent Facebook group.

(Given that we're talking about a hypothetical person joining a hypothetical Facebook with hypothetical leftist content, in the event that its hypothetical extreme language hypothetically gives way to hypothetical violence, I don't think we can do better than guesses. There isn't enough specificity for more.)

Replies from: Lumifer
comment by Lumifer · 2017-02-23T01:31:22.646Z · LW(p) · GW(p)

“The only thing necessary for the triumph of evil is for good men to do nothing.”

Replies from: gjm
comment by gjm · 2017-02-23T03:07:10.175Z · LW(p) · GW(p)

Yeah, that's not actually true. It also requires evil men (or women, but for whatever reason it usually seems to be mostly men) to do something, and personally I am more inclined to blame them for it.

Anyway, I'm not sure what your point is. That our hypothetical reasonable leftie who puts up with extremist talk and maybe, later, violence from not-so-reasonable lefties hasn't acted optimally for the general good? Sure, I agree. What of it?

Replies from: Lumifer, Viliam
comment by Lumifer · 2017-02-23T16:50:54.973Z · LW(p) · GW(p)

We are not talking about blame. We are talking about cause-effect relationships.

My comment was more a quip and less a point, but I'm curious about LW's thoughts about the degree to which passivity absolves you from responsibility :-/

Replies from: gjm, gjm
comment by gjm · 2017-02-23T19:44:23.629Z · LW(p) · GW(p)

curious about LW's thoughts about the degree to which passivity absolves you from responsibility

I don't think passivity as such has much to do with it. I think responsibility is diluted when the thing you did-or-didn't-do had its effect only by many thousands of other people likewise doing-or-not-doing it. If A threatens to assassinate the President and B1...B1000 all fail to report him to the FBI despite seeing the threat, and then he does it -- well, then B1 through B1000 all bear some responsibility, but I suggest at most about 0.1% as much as A does. I'm inclined to think rather less than 0.1% as much.

(But of course how much responsibility should be assigned to any given person for any given thing is a complicated question in complicated cases, and surely there's no One True Right Answer.)

Replies from: Lumifer
comment by Lumifer · 2017-02-24T16:26:34.713Z · LW(p) · GW(p)

I'm not thinking of things like reporting a possible assassin to the FBI, I'm thinking of things like living in a country with a, let's say, morally reprehensible leadership. Say, Stalin's Russia or Hitler's Germany. And you're a regular person, you just go to your job every day, you don't shoot anyone or personally interrogate enemies of the state. Of course, you do go to the party meetings, but then everyone does. To what degree are you complicit in the doings of your state?

I do not imply that there is One True Right Answer.

comment by gjm · 2017-02-23T19:39:59.387Z · LW(p) · GW(p)

My comment was more a quip and less a point

Sirrah, you astonish me.

Replies from: Lumifer
comment by Lumifer · 2017-02-24T16:27:03.825Z · LW(p) · GW(p)

Sometimes I astonish myself.

comment by Viliam · 2017-02-23T09:36:28.477Z · LW(p) · GW(p)

Technically, a certain fraction of the population is born as psychopaths. Sure, we can blame them, but we shouldn't act surprised by their existence. In some sense it is probably good to think about them similarly to how we think about natural disasters (if natural disasters were endowed with human-level intelligence, e.g. a lightning bolt would first explore the environment, and then hit exactly the least protected place to cause maximal damage).

Apologies for being off-topic, but it seems to me that this is a frequently underestimated thing.

comment by MrMind · 2017-02-21T08:26:07.100Z · LW(p) · GW(p)

It's very funny that I got spam from this site soliciting me to invest money in a church, and the prerequisite is "you must have the fear of God". Please ban the user kings11me.

Replies from: Elo
comment by Elo · 2017-02-23T06:32:01.394Z · LW(p) · GW(p)

Already done, thanks to those who reported it.

comment by ChristianKl · 2017-02-23T07:52:56.815Z · LW(p) · GW(p)

That's simple, engage in pathological lying.

How do you distinguish lying that's pathological from lying that isn't?

comment by Thomas · 2017-02-20T08:46:00.190Z · LW(p) · GW(p)

A math problem

https://protokol2020.wordpress.com/2017/02/20/landaus-problem/

This one is a real one, but somewhat transformed and potentially solvable.

comment by Good_Burning_Plastic · 2017-02-28T09:26:03.345Z · LW(p) · GW(p)

I assume you mean differences between masses, not squared masses,

No I don't

as a little dimensional analysis would suggest.

It's the difference between squared masses divided by the energy.

comment by Good_Burning_Plastic · 2017-02-28T03:00:52.225Z · LW(p) · GW(p)

Think about it this way, take a theory where the neutrino's mass is ε for arbitrary small ε and take the limit as ε approaches 0.

Then all other things being equal the length the neutrino needs to travel in order to oscillate to a different flavor approaches infinity.

(More accurately, oscillation lengths are inversely proportional to the differences between squared masses of neutrino mass eigenstates. So you can't set a lower bound to the mass of the lightest eigenstate, but you can set a lower bound to the masses of the two other eigenstates. Each of the three neutrino flavors is a different superposition of the three neutrino mass eigenstates.)
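
(For reference -- this is the standard two-flavour vacuum oscillation formula, not something stated earlier in the thread: with mixing angle θ, baseline L and neutrino energy E, the transition probability in the usual units is)

```latex
P(\nu_\alpha \to \nu_\beta) = \sin^2(2\theta)\,
  \sin^2\!\left(\frac{1.27\,\Delta m^2[\mathrm{eV}^2]\; L[\mathrm{km}]}{E[\mathrm{GeV}]}\right)
```

so the oscillation length scales as E/Δm², and as Δm² → 0 the argument of the second sine vanishes and the oscillation length grows without bound -- which is exactly the limit described above.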

comment by Luke_A_Somers · 2017-02-27T20:12:04.067Z · LW(p) · GW(p)

Better answer: they would need to demonstrate experiencing subjective time, such as by flavor-oscillating.

Which they do.

Which is why we think they have mass.

comment by ingive · 2017-02-24T04:01:33.163Z · LW(p) · GW(p)

There is a lot of wishing in what I wish for the world, so then I understand that the best option for me is to figure out the best course of action over my lifetime, as that's what I have access to (indirectly via bandwidth to a keyboard-computer-internet-etc-you), but at the same time disconnecting from this belief. Because wishing isn't the best option, and neither is the best course of action. Realizing that it's useful practically sometimes to attach to thinking, but not for the majority of the time. (P.S. I made an excuse for my attachment to my thinking lol)

The best course of action is being in a state of flow constantly, which means inhibiting subvocalization and past-future thinking. Because the left hemisphere gets in the way of the action I am taking. For example, if I do a round of dual n-back, if I start actively thinking, my score drops. However, if I enter into a state of flow or focusing and thus inhibit the default mode network, it seems much better.

Now, I think that rationality suits as the best action, while at the same time not being attached to rationality and embrace the right hemisphere. Of course, from inside-out the brain knowing of right hemispheres will teach you nothing. But it acts as a guide to get you to spend time being in a state of flow to get 'nowhere' as naturally, you will become better at it.

What do I wish for rationalists and myself? I wish that we naturally tip the balance to the flow state but decide our actions with rationality. How this is expected to work is as follows: every human being on this planet should go towards persistent non-symbolic experiences, while at the same time we build applications, for example on everyone's phone, which will act as a reminder and memory tool. The connection between the phone and the brain can simply be a wireless earpiece worn at all times. Maybe with some weak-AI system as well. Doesn't have to be an AGI. So we already use and have this technology.

I'm infinitely certain that this is our purpose on planet earth.

Replies from: gjm
comment by gjm · 2017-02-24T15:52:00.782Z · LW(p) · GW(p)

In case anyone's wondering what we lost by turning off downvotes: We lost the ability to downvote this sort of stuff into oblivion.

(Still a net-positive tradeoff, I think, but certainly far from cost-free.)

Replies from: ingive
comment by ingive · 2017-02-25T08:07:52.942Z · LW(p) · GW(p)

You seem to imply that my comment is a cost but not to what extent. I acknowledge that I am not a writer who is able to convey this to you in the LW lingo and better English. But it also matters: a cost to what, and a benefit to whom? I'm not writing with my brain wired from the perspective of the community of lesswrong. But, frankly, I have seen it very strongly in its users like you. It might seem like I am confronting you, but then I offer you the opportunity to see it in another way.

The way which is bigger than all of us and epistemic/instrumental rationality combined.

I'm not sure what the problem is anyway; say what it is if you can. I wish you would argue against me so I can better explain my point. :)

Peace and stillness my man. I appreciate y'all.

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2017-02-27T20:44:01.498Z · LW(p) · GW(p)

Your first sentence, for example, has a lot of parts, and uses terms in unusual ways, and there are multiple possible interpretations of several parts. The end effect is that we don't know what you're saying.

I suspect that what you're saying could make sense if presented more clearly, and it would not seem deep or mysterious. This would be a feature, not a bug.

Replies from: ingive
comment by ingive · 2017-02-28T00:51:44.378Z · LW(p) · GW(p)

My wishing for the world is intellectual masturbation, so my practical actions in this consensus reality matter the most (instrumental rationality). But if thinking stops (epistemic rationality by persistent non-symbolic experiences) I do not care in a sense, I go insane in relation to the consensus reality but sane to the non-symbolic way of being.

So the way to solve this is to have a good system to remind me of my chores, goals, and choices, which we would call rationality in the consensus reality. Otherwise, I might simply no longer be efficient from what I learn of the consensus reality. My memory might even be impaired.

Some think that the way for us to return to these states is by AGI and simply overcoming the limits of the human brain, but humans have done it for thousands of years, possibly with more ease.

See this article, Ben Goertzel is doing the interview: http://hplusmagazine.com/2012/08/08/engineering-enlightenment-part-one/

So what I think that I want is a persistent non-symbolic state, symbols make no sense, it's a bit Orwellian. But empirical feeling, indiscriminate love and so on makes a lot of sense. Of course, everything will function as it used to be ('I'-thought have never existed in the first place), but it will still be different. But from the place I am, I need (and I think humanity) need some system in which the computer keeps a track of what my goals and so on were before the persistent non-symbolic state.

This beautifully falls into a nice merging with machines, I think, let that which is unconscious, and always will be (machines), be our thinking, for we are non-symbolic I think. :)

Replies from: Strangeattractor
comment by Strangeattractor · 2017-02-28T08:47:56.587Z · LW(p) · GW(p)

You say "intellectual masturbation" like it's a bad thing. :)

comment by Elo · 2017-02-23T19:45:56.152Z · LW(p) · GW(p)

I think this is a failure of the question. We should be asking for concrete evidence of the event. For example, if we smash neutrinos into our sensors, they register as having mass by interacting with other mass-holding particles.

comment by Lumifer · 2017-02-23T16:59:21.404Z · LW(p) · GW(p)

Up LW's alley: An API that makes it easier to host better conversations plus the HN discussion.

Replies from: gjm
comment by gjm · 2017-02-23T19:47:04.264Z · LW(p) · GW(p)

Until the evidence is stronger, I might suggest "that allegedly makes it easier" or "that hopefully makes it easier" or something of the kind.

Replies from: Lumifer
comment by Lumifer · 2017-02-24T16:22:57.386Z · LW(p) · GW(p)

I just reposted the HN title, it's not my conclusion. I think HN discovered it's mostly a profanity filter, anyway :-/

comment by HungryHippo · 2017-02-23T14:04:15.290Z · LW(p) · GW(p)

When I listened to his AMA, I noticed this line as well. It's a really clever "tool for thinking" that deserves to be noticed.

There's an interview with Dawkins somewhere where he mentions an anecdote about Wittgenstein. Wittgenstein is supposed to have said "Why did people ever believe that the sun revolves around the earth?", and his interlocutor supposedly answered: "Well, obviously it's because it looks like the sun is revolving around the earth." Then Wittgenstein whips out the counterfactual: "Well, what would it have looked like if it looked like the earth revolves around the sun?".

And the answer is obviously: exactly the same, lol!

Replies from: fubarobfusco
comment by fubarobfusco · 2017-02-23T16:35:58.872Z · LW(p) · GW(p)

So what was the wrong idea "geocentrism" about, then?

Some tribal lore tells us that it had to do with the centrality of humanity in God's plan; or the qualitative difference between earthly and celestial things: the sun, moon, and stars belong to the heavens; the earth is below them; and hell is under the earth.

But maybe it's more to do with a wrong idea of "revolving" instead. The ancients had no concept of freefall. When they imagined an object revolving around another, they may have imagined a sling-stone being swung in a sling. "If the earth were swinging around the sun, surely we would fall off!" The earth has discernible features such as oceans, trees, and people which might "fall off" under motion, but the sun doesn't, being a seemingly featureless body of light: so the evidence of ordinary terrestrial experience favors the stability of the earth and the motion of the sun.

Even after heliocentric cosmology, it took more than a century to come up with the unification of celestial and terrestrial gravity: that the same rules govern the motion of the planets and moons that also govern cannonballs.

comment by MaryCh · 2017-02-23T09:22:35.319Z · LW(p) · GW(p)

According to chronicles1, the Volkhov river in the town of Novgorod sometimes flowed back. It happened in 1063 (5 days), 1415 (not stated for how long, but 'Volkhov and many other rivers' are said to have done that), 1461 (3 days), 1468 ('The whole summer the river Volkhov upwards flowed for four days', not sure how to read this at all), and 1525 (9 days, 'not by wind, neither by storm, but by the order of its creator the God').

1compiled in Е. П. Борисенков, В. М. Пасецкий. Тысячелетняя летопись необычайных явлений природы. 1988. (A thousand-year-long chronicle of astonishing natural phenomena).

comment by username2 · 2017-02-22T10:55:47.107Z · LW(p) · GW(p)

Unfortunately life is too short to criticise everyone who is wrong on the internet.

comment by sone3d · 2017-02-21T18:27:17.504Z · LW(p) · GW(p)

[Excuse me. Not native english speaker]

First of all, Lesswrong is the site where I always put my last hope.

I'm having a lot of trouble trying to find the truth about IQ tests taken by different ethnic groups. It seems there are lots of studies claiming differences in IQ. On the other side there are a lot of people saying the contrary (culturally biased tests, ...).

What do we do as rationalists? I'm really confused. I have read tons of articles from both sides yet nothing is clear to me.

Replies from: Viliam, Viliam, ChristianKl, Elo
comment by Viliam · 2017-02-22T13:37:19.925Z · LW(p) · GW(p)

Speaking for myself, my position is "I don't know".

Ignoring the specific question, there are many situations in my life where (a) I am curious about something, (b) I don't trust the existing research, and (c) it is not high enough priority for me to try doing the research myself. In such case, thinking "I don't know" seems like a reasonable reaction. What else should I think?

In absence of solid research, people often return to armchair reasoning, inventing clever arguments why in absence of evidence we should stick with "default" opinion X, and put the whole burden of proof on people who say Y. Problem is, in the next room, people use similar armchair reasoning to argue that we should stick with the "default" opinion Y, and put the whole burden of proof on people who say X. I could easily provide "a priori" arguments for either position here, which is why I consider neither of them convincing.

Here are your options:

  • decide that you feel better about believing that the different ethnic groups have the same average IQs;
  • decide that you feel better about believing that the different ethnic groups have different average IQs;
  • do the research, which can take you a few years and probably more money than you have; or
  • accept that you do not know the answer, and learn to live with that feeling.

More importantly, the question is what purpose do you actually need the answer for.

Are you trying to find intelligent people? Then try to measure or estimate their IQs as individuals.

Are you deciding which country to live in? There are probably many other factors to consider, so use those.

Are you trying to win an internet debate? Consider better ways to spend your time.

Are you researching a transhumanist technology for creating superhumans? You will have to research the specific alleles and their interactions; knowing about averages of large populations is not going to help much.

EDIT:

I think there may be one situation where knowing the answer actually would be practical: if you want to start a project in a foreign country, and the success of the project depends on the country having enough high-IQ people. For example, you might be a philanthropist billionaire trying to build a university (or, more meaningfully, the whole educational structure, starting perhaps from pre-school education and ending with a university) in the middle of a country that, according to some sources, is full of people who are stupid for genetic reasons (i.e. not just because of bad nutrition, etc.), so your project of achieving university-level education for local people may be doomed to fail.

I guess you'd just have to take the risk, and possibly find the answer to the question as a side effect.

comment by Viliam · 2017-02-23T10:50:58.103Z · LW(p) · GW(p)

One more thing to consider is that IQ is determined partly genetically and partly non-genetically; e.g. diseases or lack of nutrition decrease IQ. So if you examine people from a sick and starving population, of course they are likely to have below-average IQ. But that doesn't say anything about what IQ their descendants will have if the food and health problems get fixed.

Intelligence is a polygenic trait, i.e. a trait influenced by multiple genes. There is an observed regression to the mean: although smart parents are likely to have smart children, and dumb parents are likely to have dumb children, the children of either are usually closer to the average than their parents.

In other words, if you populated an island exclusively with Mensa members, the children born on this island would probably almost all have above-average intelligence, but many of them would not reach the Mensa level. Or the opposite experiment... well, this one was actually done in real life... 40-50 years ago, when communists ruled Cambodia, they killed almost all literate people in the country in an attempt to create an agrarian utopia (spoiler: it didn't work as advertised), but Cambodia didn't literally become a nation of retards.
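To get a feel for the Mensa-island case, here is a minimal toy simulation. It is not a genetics model: the additive trait, the 50% heritability, and the cutoff are made-up illustrative assumptions, chosen only to show how regression to the mean falls out of "phenotype = genes + luck".

```python
import numpy as np

rng = np.random.default_rng(0)

H2 = 0.5           # assumed heritability of the trait (illustrative, not a real estimate)
N = 200_000        # size of the base population
MENSA_CUT = 130    # roughly the top 2%

# Phenotype = additive genetic value + environmental noise, rescaled to IQ units.
g = rng.normal(0.0, np.sqrt(H2), N)
e = rng.normal(0.0, np.sqrt(1 - H2), N)
iq = 100 + 15 * (g + e)

# "Mensa island": keep the genetic values of people above the cutoff, pair them up randomly.
island = g[iq >= MENSA_CUT]
rng.shuffle(island)
half = len(island) // 2
moms, dads = island[:half], island[half:2 * half]

# Children: midparent genetic value + segregation noise + a fresh environmental draw.
g_child = (moms + dads) / 2 + rng.normal(0.0, np.sqrt(H2 / 2), half)
iq_child = 100 + 15 * (g_child + rng.normal(0.0, np.sqrt(1 - H2), half))

print(f"parents' mean IQ:          {iq[iq >= MENSA_CUT].mean():.1f}")
print(f"children's mean IQ:        {iq_child.mean():.1f}")
print(f"children below the cutoff: {100 * (iq_child < MENSA_CUT).mean():.0f}%")
```

With these made-up numbers the children come out well above the population average but well below their parents, and most of them miss the Mensa cutoff, which is the regression-to-the-mean pattern described above.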

It is difficult to find exactly which genes contribute to IQ. There are more than 50 suspects, but the experiments suffer from low sample sizes, so many of them are probably false positives.

It seems that first-born children have higher IQ than their siblings. (The official story seems to be that it's because they get more parental attention and resources. To me it seems more likely that children born later simply receive higher mutational load from older parents. But maybe it's both.)

It was suspected that breastfeeding increases IQ. Then it turned out this correlation was caused indirectly by the mother's IQ; i.e. smart mothers are more likely to breastfeed their children, and smart mothers are likely to have smart children, which creates a correlation between smart children and breastfeeding even if breastfeeding has no real effect on IQ. Later, there was an experiment suggesting that children with some genes may benefit from breastfeeding while children with other genes may not... but the experiment is unreliable because of small sample size. (If you start researching this topic seriously, you are going to hear "small sample size" depressingly often.) If true, this wouldn't be completely surprising, because similar effects are known; for example, people with phenylketonuria are more likely to become retarded unless they get a special diet, in which case there is no impact on IQ.
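Here is a minimal sketch of how the confounder alone can produce that correlation. The effect sizes and the "smarter mothers breastfeed more often" rule are invented purely for illustration, and breastfeeding is given zero causal effect by construction:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# By construction, breastfeeding has ZERO causal effect on child IQ in this toy world.
mother_iq = rng.normal(100, 15, N)
# Smarter mothers breastfeed more often (an invented, purely illustrative rule).
p_breastfeed = 1 / (1 + np.exp(-(mother_iq - 100) / 15))
breastfed = rng.random(N) < p_breastfeed
# Child IQ tracks the mother's IQ (genes + shared environment) plus independent noise.
child_iq = 100 + 0.4 * (mother_iq - 100) + rng.normal(0, 13, N)

gap = child_iq[breastfed].mean() - child_iq[~breastfed].mean()
print(f"breastfed minus non-breastfed gap: {gap:.1f} IQ points, with zero causal effect")
```

The breastfed group comes out several IQ points ahead even though breastfeeding does nothing here; that is the kind of spurious gap a naive comparison would report.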

tl;dr -- it's complicated; people who pretend it isn't are either stupid or lying, don't be one of them

comment by ChristianKl · 2017-02-21T19:34:16.606Z · LW(p) · GW(p)

"What do we do" depends largely on the action that you are thinking about. What kind of decision do you want to make that's effected by the knowledge?

Replies from: sone3d
comment by sone3d · 2017-02-21T19:48:21.662Z · LW(p) · GW(p)

Sorry, I didn't express myself correctly. What I'm asking is: "What should I believe?"

Replies from: Dagon, Luke_A_Somers
comment by Dagon · 2017-02-21T22:13:27.021Z · LW(p) · GW(p)

Christian's question is spot on. What he doesn't say is the reason he's asking. What you're describing isn't a belief; it's a somewhat vague cluster of beliefs. Different beliefs in the cluster can have different credence levels, and treating them as a unit means there's no way to say how accurate you are.

Decompose your question into specific falsifiable statements. You should believe whatever lets you most accurately predict the future, conditional on your choices.

So: what choices are you facing where beliefs on this topic pay rent? Or, if you prefer, what predictions are you testing with the belief?

comment by Luke_A_Somers · 2017-02-27T20:46:10.810Z · LW(p) · GW(p)

Why should you believe any specific conclusion on this matter rather than remain in doubt?

comment by Elo · 2017-02-23T07:14:58.052Z · LW(p) · GW(p)

IQ tests measure something that usually clusters with intelligence, but there are many ways for that to go wrong.

comment by [deleted] · 2017-02-21T05:48:50.106Z · LW(p) · GW(p)

Game Theory Question:

So I recently bumped into this paper on a more effective algorithm for winning IPDs (beating out Tit for Tat). I'm not parsing the paper well, though. It appears that, given some constraints on the algorithms playing the game between players X and Y, X can unilaterally determine Y's score?

Apparently having a "theory of mind" somehow increases your ability to "extort" (i.e. unilaterally dictate) opponents?

Um, so I'm not an expert in this field, but I'm wondering if this has any bearing on decision theory? My current understanding is something like "this appears only to be relevant to toy IPD problems, and if two humans are running some sort of TDT/superrational-like algorithm where it's common knowledge that the other player will act the way they would have wanted to precommit to, then the results in the paper don't matter much."

But can someone more knowledgeable chime in?
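In case it helps to check my reading: assuming the paper is the Press-Dyson zero-determinant one (or something similar), here's my attempt at a minimal sketch of the "extortion" idea, using the usual textbook payoffs T=5, R=3, P=1, S=0. The opponent strategies are just illustrative; the point is only that X's memory-one strategy pins down a linear relation between the long-run scores.

```python
import random

# Conventional IPD payoffs (assumed): T=5, R=3, P=1, S=0.
R, S, T, P = 3, 0, 5, 1
# Payoff lookups keyed by (X's move, Y's move).
PAY_X = {('C', 'C'): R, ('C', 'D'): S, ('D', 'C'): T, ('D', 'D'): P}
PAY_Y = {('C', 'C'): R, ('C', 'D'): T, ('D', 'C'): S, ('D', 'D'): P}

def extortionate(chi=3.0, phi=1 / 26):
    """Memory-one 'extortion' strategy in the Press-Dyson style: X's probability of
    cooperating after each outcome (CC, CD, DC, DD) is chosen so that, in the long run,
    (s_X - P) = chi * (s_Y - P) regardless of what Y plays."""
    return {
        ('C', 'C'): 1 - phi * (chi - 1) * (R - P),
        ('C', 'D'): 1 - phi * ((P - S) + chi * (T - P)),
        ('D', 'C'): phi * ((T - P) + chi * (P - S)),
        ('D', 'D'): 0.0,
    }

def play(p_x, strategy_y, rounds=500_000, seed=0):
    """Average per-round payoffs when X plays memory-one strategy p_x against Y."""
    rng = random.Random(seed)
    x, y = 'C', 'C'  # arbitrary (unscored) "previous round"
    total_x = total_y = 0
    for _ in range(rounds):
        nx = 'C' if rng.random() < p_x[(x, y)] else 'D'
        ny = strategy_y(rng, x, y)
        x, y = nx, ny
        total_x += PAY_X[(x, y)]
        total_y += PAY_Y[(x, y)]
    return total_x / rounds, total_y / rounds

always_cooperate = lambda rng, x_last, y_last: 'C'
coin_flipper = lambda rng, x_last, y_last: 'C' if rng.random() < 0.5 else 'D'

for name, strat in [("always cooperate", always_cooperate), ("50/50 random", coin_flipper)]:
    sx, sy = play(extortionate(chi=3.0), strat)
    print(f"vs {name:>16}: X={sx:.2f}  Y={sy:.2f}  (X-P)/(Y-P)={(sx - P) / (sy - P):.2f}")
```

Against both opponents the ratio (X-P)/(Y-P) comes out at the chosen chi=3, i.e. X takes three times Y's surplus over the mutual-defection payoff. Against Tit for Tat the trick collapses: both long-run averages fall to P, which still satisfies the enforced relation but leaves the extortioner with no surplus.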

Replies from: MrMind
comment by MrMind · 2017-02-21T08:47:21.039Z · LW(p) · GW(p)

I recall a paper written by a student of Scott Aaronson about an IPD tournament (mentioned in the article about Eigenmorality). Indeed, the winners were agents that kept a model of the opponent and responded in kind: Tit-for-Tat was far from the optimal algorithm.
On the other hand, IPDs are what you have in a society where different agents are trying to cooperate with and compete against each other for resources. Clearly, super-rational agents (i.e. agents that have access to each other's source code and are reflexively coherent) will act on the same information, so no exploitation is possible, but this is an extreme case, better suited to treating problems of artificial coordination than to describing a real situation.
Indeed, some psychologists (e.g. Haidt) think that language and higher cognition evolved to serve the need for a "theory of mind" (modeling and influencing other agents).

comment by Fluttershy · 2017-02-20T15:15:14.669Z · LW(p) · GW(p)

Second edit: Dagon is very kind and I feel ok; for posterity, my original comment was basically a link to the last paragraph of this comment, which talked about helping depressed EAs as some sort of silly hypothetical cause area.

Edit: since someone wants to emphasize how much they would "enjoy watching [my] evaluation contortions" of EA ideas, I elect to delete what I've written here.

I'm not crying.

Replies from: Dagon, Dagon, None
comment by Dagon · 2017-02-21T05:20:49.137Z · LW(p) · GW(p)

eep! I deeply apologize that my remarks have caused you pain. I am skeptical of EA, and especially the more ... tenuous causal and ethical calculations that are sometimes used to justify non-obvious charities. But I deeply respect and appreciate everyone who is thinking and acting with the intent to make the world better rather than worse, and my disbelief in the granularity of calculation is tiny and unimportant compared to my belief that individuals who want to make a difference can do so.

Also, I cry at the drop of a hat, so if you start I'm definitely joining you out of both shame and sympathy.

Replies from: Fluttershy
comment by Fluttershy · 2017-02-21T06:59:52.307Z · LW(p) · GW(p)

Ok, thank you, this helps a lot and I feel better after reading this, and if I do start crying in a minute it'll be because you're being very nice and not because I'm sad. So, um, thanks. :)

comment by Dagon · 2017-02-20T16:12:51.293Z · LW(p) · GW(p)

I'd enjoy watching the evaluation contortions that an EA would have to go through to decide that their best contribution is to help a specific not-very-effective (due to mental health problems or disability) contributor rather than more direct contributions.

Uncertainty is multiplied, not just added, with each step in a causal chain (five links that are each 90% certain leave you only about 59% confident of the conclusion). If you're trying to do math on consequentialism (let alone utilitarianism, which has further problems with valuation), you're pretty much doomed for anything more complicated than mosquito nets.

Edit - leaving the original for the historical record. OMG, this came out so much meaner than I intended. Honestly, even small improvements in depression across many sufferers seem like they could easily multiply out to huge improvements in human welfare - it's a horrible thing and causes massive amounts of pain. I meant only to question picking individuals based on their EA intentions and helping them specifically, rather than scalable options for all.

Replies from: None
comment by [deleted] · 2017-02-20T19:03:26.396Z · LW(p) · GW(p)

EDIT: Replied to wrong OP.

comment by [deleted] · 2017-02-20T19:04:14.875Z · LW(p) · GW(p)

I'm pretty unsure about the statistics for this. Depression seems to affect about six to ten percent of the population.

So, are there strong arguments that a disproportionately high number of promising EAs have depression / disabilities?

I can steelman a sort of consequentialist argument for redirecting existing efforts to help disabled people towards the most promising, high-value people, but I'm more curious if anyone has info about mental health and the EA community.

Replies from: ChristianKl
comment by ChristianKl · 2017-02-21T13:39:18.159Z · LW(p) · GW(p)

So, are there strong arguments that a disproportionately high number of promising EAs have depression / disabilities?

Even if it's not disproportionately high for EAs, 8% of EAs might be enough. I think it's plausible that a psychologist who specializes in helping EA people does better at helping them than the average psychologist.

If a psychologist already understands worries about AGI destroying humanity, it's easier for the patient to talk to them about it.

Replies from: Dagon, None
comment by Dagon · 2017-02-21T16:44:20.426Z · LW(p) · GW(p)

I'll try to be gentler about my concern, but I really do want to caution against EA interventions that are targeted at EA members. Helping someone is a pure good, but there's both a bias problem and an optics problem with helping people because they're similar to yourself.

(and note: one of the reasons I don't consider myself part of EA is that I prefer to help people close or similar to myself, disproportionately to the net human impact. I'm not saying "don't do that", just "be careful not to claim that EA justifies it").

Replies from: ChristianKl
comment by ChristianKl · 2017-02-22T10:28:43.576Z · LW(p) · GW(p)

When it comes to publicly recommending causes, it's worthwhile to focus on projects with good optics, like the GiveWell-recommended charities. At the same time, it's okay if individual people decide that they believe projects with worse optics are high-impact interventions.

To the extent that there are fuzzies involved in helping fellow EA people, it's worth acknowledging the fact and being conscious that they are part of the reason for your donation, but generating fuzzies isn't in itself a reason against donating.

Replies from: Dagon
comment by Dagon · 2017-02-23T00:08:54.162Z · LW(p) · GW(p)

Thanks, that said it better than I did.

I don't mean to discourage helping friends, family, neighbors, or other groups you're a member of. Or anyone else - all charity is good. I only wanted to point out that EA loses credibility if it suspiciously turns out that the detailed calculations and evaluation of options give clear support to your friends/co-believers.

Replies from: Viliam
comment by Viliam · 2017-02-23T10:56:10.023Z · LW(p) · GW(p)

I guess it needs to be made even more obvious that one can help their friends without having (or pretending to have) an exact calculation proving that this is the optimal thing to do.

comment by [deleted] · 2017-02-21T14:32:22.926Z · LW(p) · GW(p)

Hm, okay, I hadn't thought about it like this.

I agree that this might be a niche role. But I'm still unsure about the demand. There are about 12,000 people in the FB group. If that's conservatively about 10% of all EAs, we're still only looking at about 120,000 people, and then only about 9,600 potential patients, spread across the entire globe.

Then again, I admit I really don't know how demand works for psychology (is ~10,000 potential patients enough?), and those are just ballpark figures.

Replies from: ChristianKl
comment by ChristianKl · 2017-02-21T17:27:25.291Z · LW(p) · GW(p)

A psychologist who does weekly 1-hour sessions with their patients might have 40 patients at one time if they work 40 hours a week and just spend time with patients. I think it's likely that you'd want the person to do more than just 1-on-1 work and also write a few blog posts about what they learn, so 30 patients at a time might be a decent count.

CBT can be done via Skype, so the fact that patients are spread over the globe isn't a problem.

According to the Mayo Clinic, CBT takes an average of 10-20 sessions (http://www.mayoclinic.org/tests-procedures/cognitive-behavioral-therapy/details/what-you-can-expect/rec-20188674). That means you might change patients every 3 months.

That means your therapist might treat 120 people in a year. It would be fine to fund a single therapist for this task as an MVP. If you had a single therapist whom the EA community holds in high regard, I would estimate that the person could find those 120 people to treat.
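To make the arithmetic explicit, here is a back-of-the-envelope sketch. The 30-patient load, ~3-month course length, 10% Facebook-group share, and 8% prevalence are the rough assumptions from this thread, not measured data:

```python
# Back-of-the-envelope capacity vs. demand for one EA-focused therapist.
# All inputs are the rough assumptions from this thread, not measured data.
concurrent_patients = 30          # weekly 1-hour slots, leaving time for writing etc.
weeks_per_course = 13             # ~3 months, matching the 10-20 session CBT estimate

courses_per_year = 52 / weeks_per_course
patients_per_year = concurrent_patients * courses_per_year

estimated_EAs = 12_000 / 0.10             # FB group assumed to be ~10% of the community
potential_patients = estimated_EAs * 0.08  # ~8% depression prevalence

print(f"one therapist treats roughly {patients_per_year:.0f} people per year")
print(f"rough demand pool: {potential_patients:.0f} people")
print(f"years for one therapist to work through the pool: {potential_patients / patients_per_year:.0f}")
```

On these assumptions a single therapist handles about 120 people a year against a pool of roughly 9,600, so one funded therapist would be nowhere near saturating demand.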

Replies from: None
comment by [deleted] · 2017-02-21T17:50:09.472Z · LW(p) · GW(p)

Cool. Thanks for the stats on how psychologists work; all this is new to me. A sort of Schelling-point therapist who's able to help people in the EA community does seem like a force multiplier / helpful thing to have, I guess.

comment by tukabel · 2017-02-20T12:31:51.990Z · LW(p) · GW(p)

So Bill Gates wants to tax robots... well, how about SOFTWARE? It may fit easily into certain definitions of ROBOT. Especially if we realize it is the software that makes the robot (in that line of argumentation) a "job-stealing evil" (a 100% retroactive tax on evil profits from selling software would probably shut Billy's mouth).

Now how about AI? Going to "steal" virtually ALL JOBS... friendly or not.

And let's go one step further: who is the culprit? The devil who had an IDEA!

The one who invented the robot, devised its application in production, wrote the software, designed the neural nets, etc.

So, let's tax ideas and thinking as such... all Orwellian/Huxleyan fantasies fall short of the Brave New Singularity.

Replies from: AspiringRationalist, username2, gjm, ChristianKl
comment by NoSignalNoNoise (AspiringRationalist) · 2017-02-20T22:41:43.529Z · LW(p) · GW(p)

Can we please bring back downvoting?

comment by username2 · 2017-02-20T13:41:30.703Z · LW(p) · GW(p)

I'd say that you are not supposed to tax people, you are supposed to tax flows of money, e.g. income, profit, sales, etc.

comment by gjm · 2017-02-20T17:28:49.053Z · LW(p) · GW(p)

And let's go one step further: who is the culprit? The devil who had an IDEA!

This is the point at which the proposal becomes obviously insane. Not coincidentally, it is also the point at which the proposal stops having anything to do with the thing Bill Gates said he was in favour of. (It is more like saying "we tax income people get from doing their jobs, so we should tax those people's parents for producing a person who did work that yielded taxable income".)

As username2 says, what gets taxed is acquisition of money; when I pay income tax it isn't a tax on me but on my receipt of that income. If anything like a "robot tax" happens, here's the right way to think of it: a company is doing the same work while employing fewer people, so it makes more profit, and it pays tax on that profit so more profit means more tax. We are generally happy[1] taxing corporate profits, and we are generally happy[2] taxing companies when their profitable activities impose nasty externalities on others, and some kinds of "robot tax" could fit happily into that framework.

[1] Perhaps you aren't. But most of us seem to be, since this is a thing that happens all over the world and I haven't seen much objection to it.

[2] This isn't so clear; I've not seen a lot of objection to taxes of this sort, but I also think they aren't used as much as maybe they should be, so maybe they are unpopular.

(For what it's worth, I am not myself in favour of a "robot tax" as such, but if we do find that robots or AI or other technological advances make some kinds of business hugely more profitable then I think it's reasonable for governments to look for ways to direct some of the benefit their way, to be used to help people whose lives become more difficult as machines get good at doing what used to be humans' jobs.)

Replies from: knb, bogus, Dagon
comment by knb · 2017-02-21T06:04:39.793Z · LW(p) · GW(p)

Isn't a VAT already basically a Robot Tax?

Replies from: Viliam, gjm
comment by Viliam · 2017-02-21T11:58:56.875Z · LW(p) · GW(p)

That would explain all those sci-fi robots who only walk around destroying stuff and never build anything. They were programmed with an incentive to keep the VAT low, they took it too literally, and things got out of control.

comment by gjm · 2017-02-21T12:38:00.117Z · LW(p) · GW(p)

Seems less so than a tax on corporate profits is. Am I missing something?

comment by bogus · 2017-02-20T18:27:28.257Z · LW(p) · GW(p)

and we are generally happy[2] taxing companies when their profitable activities impose nasty externalities on others

Maybe true, but the sort of externality that occurs when some jobs are paid less because of robots is a pecuniary externality, not a real externality - so the usual argument for taxing these activities doesn't quite apply. Now, taxation of capital is actually somewhat justified (and robots are capital, obviously), but really only as an indirect taxation of especially valuable skill endowments (such as, hypothetically, the skill of repairing robots, or superintending a robot-reliant business) - and then only at rather mild levels that are already in play with the current income tax. (If income redistribution was not a factor, you'd rather tax consumption, labor income and resource rents + real externalities).

comment by Dagon · 2017-02-20T17:47:43.832Z · LW(p) · GW(p)

Actually, you don't even need to tax corporate profits in this scenario. Just tax when actual people get money - company makes more profit, eventually it needs to distribute that profit to shareholders (dividends) or employees (higher wages for the non-displaced). Tax at that point, not along the way.

Replies from: niceguyanon
comment by niceguyanon · 2017-02-22T21:00:10.291Z · LW(p) · GW(p)

I dunno, it's hard enough trying to determine if and where profit was made in order to tax it. If we taxed only distributions and not profits, there would be no taxes to collect. Companies and individuals would all claim that any profits are being retained for future investment or for hoarding, and not actually distributed to owners. That is why we tax undistributed retained earnings.

comment by ChristianKl · 2017-02-20T13:31:29.946Z · LW(p) · GW(p)

There won't be a blanket tax on all robots, but self-driving cars and trucks can be taxed directly.

Taxing them enough to reduce their usage means lower carbon emissions.

Replies from: username2
comment by username2 · 2017-02-21T01:32:29.918Z · LW(p) · GW(p)

If your goal is to reduce carbon emissions, then tax the gasoline.

Replies from: ChristianKl
comment by ChristianKl · 2017-02-21T09:53:25.004Z · LW(p) · GW(p)

Politically, taxing gasoline is very unpopular, and there's no majority for carbon taxes.

Replies from: fubarobfusco
comment by fubarobfusco · 2017-02-22T16:24:24.494Z · LW(p) · GW(p)

Politically, taxing gasoline is utterly commonplace and accepted. Every developed country except Mexico does it, and every U.S. state.

Replies from: ChristianKl
comment by ChristianKl · 2017-02-22T16:32:37.695Z · LW(p) · GW(p)

In the US it is not high enough to fully pay for the highway infrastructure, because raising it is politically unpopular.