Survey on X-risk: Feedback needed

post by vallinder · 2011-09-26T22:15:45.241Z · LW · GW · Legacy · 18 comments

 

Together with Jesper_Ostman, I'm currently preparing an mTurk survey on the public perception of extinction threats (aiming for a sample size of about 400). Below is our current draft. Feedback is much appreciated. We are planning to do a few follow-up studies, so in this one we want to keep things simple for the most part. In question 3, we want to compare extinction risk reduction to something that is generally perceived as good but not too closely related to it, but perhaps our current choice isn't the best. In addition to the demographic questions, we will also include a brief 10-question personality inventory. We're looking both for information on the public perception of x-risk and for an indication of which demographic groups and personality types might have the most potential for getting involved.

1. How likely do you think it is that humanity will have gone extinct by the year...

a) 2050

(i)   0-20%
(ii)  20-40%
(iii) 40-60%
(iv)  60-80%
(v)   80-100%

b) 2100

(i)   0-20%
(ii)  20-40%
(iii) 40-60%
(iv)  60-80%
(v)   80-100%

c) 2200

(i)   0-20%
(ii)  20-40%
(iii) 40-60%
(iv)  60-80%
(v)   80-100%

d) 2500

(i)   0-20%
(ii)  20-40%
(iii) 40-60%
(iv)  60-80%
(v)   80-100%

e) 10 000

(i)   0-20%
(ii)  20-40%
(iii) 40-60%
(iv)  60-80%
(v)   80-100%


2. What do you think is the most likely cause (causes) of human extinction?

_______________________________________________________


3. How important do you think reducing the risk of human extinction is, compared to giving foreign aid?

a) Much more important
b) More important
c) Equally important
d) Less important
e) Much less important


4a. Age: __________   b. Gender: __________   c. Nationality: __________


5. What is your current occupation?

_______________________________________________________

6. If you are a student, what subject are you majoring in?


_______________________________________________________


7. What is your level of education?

_______________________________________________________

 

18 comments


comment by Oscar_Cunningham · 2011-09-26T23:37:08.634Z · LW(p) · GW(p)

I expect many people will have never even considered the possibility of human extinction, so maybe question two should be rephrased to allow for a "What, you mean like, extinct? But... It can't..." style response.

Replies from: vallinder
comment by vallinder · 2011-09-27T16:04:19.991Z · LW(p) · GW(p)

Thank you, this is a good suggestion. Do you think "What, if any, do you think is the most likely cause (causes) of human extinction?" is clear enough?

Replies from: Oscar_Cunningham
comment by Oscar_Cunningham · 2011-09-27T17:36:57.850Z · LW(p) · GW(p)

Sounds good.

comment by Scott Alexander (Yvain) · 2011-09-27T09:05:31.407Z · LW(p) · GW(p)

I don't know how important it will be for a naive sample, but if the sample includes transhumanists, you might need to clarify your definitions of "humanity" and "extinct". If we transition to beings of pure energy that bear no more relation to humans than humans do to cockroaches, is that "humanity becomes extinct" or not?

Replies from: vallinder
comment by vallinder · 2011-09-27T15:58:09.490Z · LW(p) · GW(p)

Yes, our thought was that it doesn't matter for a naive sample. But if there is a concise way of clarifying, we'll consider it.

Replies from: billswift
comment by billswift · 2011-09-28T03:48:56.388Z · LW(p) · GW(p)

"Humans or their direct descendants"?

comment by Owen · 2011-09-27T03:49:50.414Z · LW(p) · GW(p)

I feel like question 1 could be tweaked so that it's harder to put in incoherent answers (in this case, probability estimates that aren't weakly increasing over time). Maybe you could ask for the probabilities that humanity will go extinct in certain ranges of time (e.g. "How likely do you think it is that humanity survives to the year 2100 but goes extinct by 2200?"). Or, to sidestep the condition that those probabilities must add up to no more than 100%, you could condition: "Assuming that humanity survives to the year 2100, how likely do you think it is that humanity then goes extinct by 2200?"

I only make these suggestions because I can imagine someone reading the original questions and thinking "Hmm, yes, it seems pretty likely that we annihilate ourselves by 2100: 40-60%" and then putting down 0-20% for part (e) because it's so much harder to think of ways to go extinct that take thousands of years.
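
For what it's worth, here's a minimal sketch (in Python, with made-up numbers) of how answers to the conditional version would combine into the cumulative probabilities that question 1 currently asks for; because each cumulative figure is derived from a shrinking survival probability, the sequence can never decrease:

```python
# Hypothetical answers: P(extinct by 2050), then P(extinct by the next date | survived so far)
p_by_2050 = 0.10
conditional = [0.20, 0.15, 0.25, 0.30]   # for 2100, 2200, 2500, 10 000

survival = 1.0 - p_by_2050               # chance humanity is still around
cumulative = [p_by_2050]                 # cumulative extinction probabilities
for p_next in conditional:
    survival *= 1.0 - p_next             # survive this interval as well
    cumulative.append(1.0 - survival)    # extinct by the end of this interval

print([round(p, 3) for p in cumulative])
# -> [0.1, 0.28, 0.388, 0.541, 0.679]  (non-decreasing by construction)
```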

And I would reverse the order of 6 and 7: "What is your level of education? If you are a college student, what is your area of study?" And if you want people's past experience to count too, you could ask instead "If you have or are earning a college degree, what is/was your area of study?"

Replies from: vallinder
comment by vallinder · 2011-09-27T16:06:03.743Z · LW(p) · GW(p)

I like your suggestion with conditional probabilities. Reversing 6 and 7 is also good.

comment by lessdazed · 2011-09-27T02:02:50.866Z · LW(p) · GW(p)

How important do you think reducing the risk of human extinction is, compared to giving foreign aid?

These are not parallel things, because in one choice, an effect is achieved and in the other, money is put to achieving an effect. The following are parallel:

How important do you think reducing the risk of human extinction is, compared to influencing foreign countries to treat their people well and not wage war?

How important do you think spending money to try and reduce the risk of human extinction is, compared to giving foreign aid?

A better question would be to ask what percentage of the Federal budget should be spent on it, or how much money should be borrowed per year to spend on it. The foreign aid comparison is too ideological, and the answer will depend on an extra variable: what people believe is currently spent on foreign aid.

Replies from: vallinder
comment by vallinder · 2011-09-27T16:02:26.108Z · LW(p) · GW(p)

Thanks for your suggestions. I agree that foreign aid is too political, but it may well be that some people consider waging war an extinction risk, which is why I don't think it's a good alternative for comparison.

Replies from: lessdazed
comment by lessdazed · 2011-09-27T16:12:47.314Z · LW(p) · GW(p)

It isn't meant to be an alternative; it's a rephrasing of how I think of foreign aid's purpose, which reinforces why it's a bad idea to introduce something political that is thought of differently by each of your subjects.

When I think of foreign aid, I think of giving Egypt and Jordan vouchers for American weapons as a bribe not to attack Israel, giving the same to Israel so they don't fall behind, and giving the same to Pakistan so they don't attack India or Americans in Afghanistan.

Replies from: vallinder
comment by vallinder · 2011-09-27T16:17:17.603Z · LW(p) · GW(p)

Oops, then I misunderstood you.

I wonder if it's possible to find something that is both seen as good by almost everyone and specific enough.

Replies from: lessdazed
comment by lessdazed · 2011-09-27T16:20:20.233Z · LW(p) · GW(p)

What's wrong with asking how much money should be borrowed to pay for it, or how much all programs should be cut, percentage-wise, to pay for it? You won't find an apolitical political program to compare X-risk spending to.

comment by [deleted] · 2011-09-27T01:35:15.887Z · LW(p) · GW(p)

Support for foreign aid is a politically polarized issue. Unfortunately, I can't think of a great replacement that isn't too general ("charity" for instance). As an alternative, you could think of two things which don't follow the same political fault lines. Question 3 could ask for a comparison to foreign aid, and Question 4 could ask for a comparison to something else.

Otherwise it looks fine!

comment by Tripitaka · 2011-09-26T22:34:32.967Z · LW(p) · GW(p)

Which distribution of answers to 2) and 3) do you expect here on LessWrong? If your goal is to survey the opinion of average people, you will fail that goal the moment you take the answers here into account.

Replies from: MichaelHoward
comment by MichaelHoward · 2011-09-27T01:20:35.192Z · LW(p) · GW(p)

I think he wants feedback about the test from us, not for us to be part of the test sample.

comment by rwallace · 2011-09-27T20:28:47.081Z · LW(p) · GW(p)

The problem with question 1 is that it makes the implicit assumption that either we have become extinct or things are more or less okay. I'm confident there will be still people around in, say, a thousand years. But it is the way of extinction that the outcome is often decided long before, and by factors having nothing to do with the cause of, the death of the last individual. If we fail to take this shot at escaping our boundaries, I'm not at all confident we'll get another chance. We could end up in a scenario where our species still exists but extinction by ordinary geological processes has become inevitable.

The problem with question 3 is that it makes the implicit assumption that spending money with the stated aim of reducing the risk of extinction will have the effect of reducing that risk rather than increasing it. Both the theory of how human psychology works in far mode and experience with trying to spend money on politically charged good causes suggest the opposite.

You aren't obliged to agree with me on these points, but as they are the primary issues, I suggest that a document claiming to be a questionnaire should spell them out as explicit questions rather than making them implicit assumptions.

comment by DuncanS · 2011-09-26T22:32:40.191Z · LW(p) · GW(p)

1. For all timeframes, 0-20%.
2. None are likely, but some kind of bioweapon seems most probable? An AI problem is also possible, but not highly likely.
3. e) Much less important
4. Age 44, Gender male, Nationality British
5. Financial modeller
7. Degree