Rationality Activism: Open Thread

post by atucker · 2011-03-04T02:06:56.838Z · LW · GW · Legacy · 18 comments

So as I read around the discussion section, I keep coming across ideas (like this, this, or this) which all seem to be closely related to the same topics:

Other posts talk about similar things.

I have been interested in this for a while now, and have gotten some great feedback.

But now I'm wondering how many groups, other than SIAI, are trying to do this. It seems like it would be silly to have something like this stagnate because of a simple coordination problem.

So if you are, please come forth and comment. If you're interested, do the same. Share what you know, learn from others, and maybe get the ball rolling a bit more.

Note:
I changed the title (originally "Rationality Activism Groups") to reflect the more discussion-oriented nature of this thread.

18 comments

comment by atucker · 2011-03-04T05:39:26.828Z · LW(p) · GW(p)

Here's a quick and dirty bullet list of what I know on this subject:

From comments:

  • People remember and value rationality more when you make it immediately provide useful benefits in their lives

  • People don't like being told what to do, but they do like being helped

From experience:

  • Inferential distance makes a really big difference in speed of comprehension and speed of agreement, but I don't know if it results in more follow-through

  • It's easier to convince people who like you

  • People do not automatically pick up on your thought process as you go through it; being rational often looks like being insightful

Corroborated by both:

  • Being nice and encouraging to people who are helping you makes it more likely for them to help in the future

  • It helps to give people specific advice about what to do/how to apply something, at least at first

  • There's a difference between getting someone to agree with something, and getting them to follow through on that belief

  • People need to want to learn something; being positive and upbeat in your presentation is more effective than being downbeat, scary, or guilt-tripping

comment by whpearson · 2011-03-04T22:24:56.563Z · LW(p) · GW(p)

In terms of spreading rationality, it might be best to take the indirect approach. Consider: the people who invented the computers we use are the reason lots of us know computer programming and have the vocabulary to talk about Bayes' theorem.

So one possibility is to invent very useful technologies that require people to have a rationalist mindset in order to use them.

I'm also of a show-don't-tell mindset: if we become super successful at getting what we want, people will want to read our biographies and emulate us. This approach also keeps us honest; if we aren't more successful or interesting than other people, then we probably don't have much worth talking to them about.

In short I don't like trying to convince people directly.

Replies from: paulfchristiano, atucker
comment by paulfchristiano · 2011-03-05T00:45:24.754Z · LW(p) · GW(p)

if we aren't more successful or interesting than other people, then we probably don't have much worth talking to them about

I would like to convince other people to play the same game as me. Rather than having very smart people thinking about how to solve some arbitrarily selected hard problem, I would rather they take a step back and make a desperate effort on humanity's behalf.

I think this is true regardless of whether I am more successful or interesting than they are.

Replies from: atucker
comment by atucker · 2011-03-05T01:11:59.377Z · LW(p) · GW(p)

if we aren't more successful or interesting than other people, then we probably don't have much worth talking to them about

and

I would like to convince other people to play the same game as me. Rather than having very smart people thinking about how to solve some arbitrarily selected hard problem, I would rather they take a step back and make a desperate effort on humanity's behalf.

I don't think that these two ideas are at all contradictory. Ideally, we'd want people to step back and try to help humanity, but in reality I think it would probably be very helpful to the goal of getting more rationalists (maybe even necessary?) for rationalists to be clearly successful.

comment by atucker · 2011-03-05T01:24:04.370Z · LW(p) · GW(p)

I found these ideas pretty interesting.

In terms of spreading rationality it might be best to take the indirect approach.

I was thinking about that a bit: how increasing something like economic prosperity seems to help a lot of things (education, governmental stability, decreased violent crime, etc.), as well as having other effects like decreasing religiosity. Note: I'm not going to try to cite these claims; it was more the general idea than these specifics. If anyone wants to make a rationality-promoting plan based on this strategy, then more research needs to be done.

Consider: the people who invented the computers we use are the reason lots of us know computer programming and have the vocabulary to talk about Bayes' theorem.

I think that you're right that inventing computers made considerably more people learn computer programming, but I'm not sure how much it's raised the sanity waterline for non-programmers.

So one possibility is to invent very useful technologies that require people to have a rationalist mindset in order to use them.

This was, to me, the most intriguing idea. Do you have any vague ideas of technologies which require rationality to use? It seems pretty easy to get a cargo-cult understanding of something and be able to use it without understanding how it works.

I guess it could also happen with something that only rationalists would want to use, like cryonics or mind uploading. Though, I feel like part of the reason to want to promote rationality is to speed up development of things like that.

comment by atucker · 2011-03-04T02:14:30.171Z · LW(p) · GW(p)

If someone else comments, I'll come back and elaborate on a few of my lessons, and things I think I kind of learned.

Replies from: lukeprog
comment by lukeprog · 2011-03-04T05:05:57.613Z · LW(p) · GW(p)

Yes please.

comment by Vertigo · 2012-10-18T18:20:04.058Z · LW(p) · GW(p)

My position is that we need a plan. A long-term, comprehensive strategy to maximize the utility of our individual efforts toward making the world a more rational place. We need not only to study the best ways of teaching rationality on the level of personal interactions and small classes, but to plot a path from the current state of society to a world in which people are trained from childhood in the methods of rationality. I'm giving a talk on this very topic to my university's secular student alliance. My main message is simply that "we need a plan", but here's the specific proposal I'm going to toss out there for consideration.

  • Use the grassroots movement(s) in secularism, humanism, science advocacy, etc. to prepare the memetic landscape for education renovation.

      • Broaden conceptions of science.

          • Let "science" include the analysis, interpretation, and implementation of information gained through empirical inquiry.

              • The positivists have a narrower conception, but it's not very historically accurate anyway.

              • The broader conception lets us teach parts of rationality in science classes as science, which the one infected by positivism doesn't.

          • Get the paradigmatic "scientist" out of a lab coat and into the Real World, so that everyone's a scientist and the people who design spaceships are simply professionals.

      • Get involved in math education reform. See Hemant's efforts. Mathematical methods are methods of rationality. Math is not graphing parabolas. Math is creative, rigorous problem solving.

      • Learn about and promote cognitive science. Popularize the term "cognitive psychology". I know it kinda hurts to drag down cog sci like that, but the plan is to sneak the study of heuristics and biases into high school psychology classes.

  • Insert rationality into math and science classes. Focus psychology classes toward cognitive science. (It may prove more feasible to add cog sci to the options for science classes alongside chemistry, physics, etc., but I think this will be easier, at least early on.)
  • Slowly increase the number of schools with philosophy classes so that extra-empirical methods get their own spotlight.
  • Get states to adopt curricula that focus on higher-order information.
  • Implement such a curriculum on the national level.

comment by bisserlis · 2011-03-04T07:58:47.921Z · LW(p) · GW(p)

I commit to responding at length to this thread tomorrow when I have more time, but here is a little bit of my background and what I'm into now in case anyone wants to ask specific questions.

I ran my university's skeptic/atheist group for two years and ended up getting involved at the national level with the Center for Inquiry this past summer as an outreach intern at their headquarters. I moved out to the Bay Area specifically to get involved with pro-reason causes.

Here is a group that just started that I'm involved with: Reason for Reason.

Replies from: atucker
comment by atucker · 2011-03-05T14:55:59.612Z · LW(p) · GW(p)

Looking forward to your input.

comment by atucker · 2011-03-05T04:24:49.463Z · LW(p) · GW(p)

Specific prompts or inputs help people give more detailed responses; hitting a target in a possibility space is easier than having to find that target yourself.

comment by curiousepic · 2011-03-04T17:07:49.546Z · LW(p) · GW(p)

Planet Inc. may be worth interacting with; watch the TEDx talk on this page.

comment by nazgulnarsil · 2011-03-04T02:43:24.563Z · LW(p) · GW(p)

Activism should target elites. Time spent on appeals to the grass roots is better spent getting personally rich.

Replies from: atucker
comment by atucker · 2011-03-04T02:47:28.707Z · LW(p) · GW(p)

Do you have any support for/experience supporting this assertion?

Replies from: folkTheory, nazgulnarsil, Dorikka
comment by folkTheory · 2011-03-04T22:51:07.847Z · LW(p) · GW(p)

This is what the Church of Scientology does.

comment by nazgulnarsil · 2011-03-04T02:55:20.143Z · LW(p) · GW(p)

I don't have a specific cite, but the world of activism IME is the epicenter of scope insensitivity.

comment by Dorikka · 2011-03-04T03:49:17.131Z · LW(p) · GW(p)

I'm thinking that creating activist elites would bring in many more resources for further activism than would creating activists out of the less wealthy, and that those usually classified as elites make more significant decisions than non-elites (as a result, creating rationality in elites would have more of an impact than doing so in the same number of non-elites).

Replies from: atucker
comment by atucker · 2011-03-04T05:29:19.544Z · LW(p) · GW(p)

I'd believe that.

Though, I feel like it would be easier to convince non-elites that they need to be more rational.