Red Teaming Climate Change Research - Should someone be red-teaming Rationality/EA too?

post by casebash · 2017-07-07T02:16:29.949Z · LW · GW · Legacy · 6 comments


6 comments

Comments sorted by top scores.

comment by casebash · 2017-07-07T02:17:19.246Z · LW(p) · GW(p)

Link doesn't seem to be working: http://reason.com/blog/2017/07/06/red-teaming-climate-chang1

Replies from: turchin
comment by turchin · 2017-07-07T10:13:40.453Z · LW(p) · GW(p)

I have to add that there is (informally) an even smaller purple team, which thinks that climate change could happen sooner and in a more violent form, like runaway global warming. The idea has similarities with the idea of self-improving AI: in both cases, an unstoppable process with positive feedback would result in human extinction in the 21st century.

comment by scarcegreengrass · 2017-07-07T14:25:00.349Z · LW(p) · GW(p)

Personally, I think people often do write rigorous criticisms of various points of rationality and EA consensus. It's not an under-debated topic. Maybe some of the very deep assumptions are less debated, e.g. some of the basic assumptions of humanism. But I think that's just because no one finds them faulty.

comment by username2 · 2017-07-07T04:59:41.091Z · LW(p) · GW(p)

I thought that's what Lumifer is doing ;)

Replies from: Lumifer
comment by Lumifer · 2017-07-07T14:32:49.738Z · LW(p) · GW(p)

Sigh. It used to be called science. Just science. But in our enlightened age, in order to do plain-vanilla science, you need to reframe it as a war between the Blues and the Reds?

Replies from: username2
comment by username2 · 2017-07-07T14:40:31.996Z · LW(p) · GW(p)

Point taken, and I agree.

Edit: generalizing, I think it should be said that rather than needing a big red team, there is really no room for a blue team. Everyone should be red-teaming their own beliefs and generally accepted "truths." That is part and parcel of what it means to be a rationalist. To practice rationality is to approach everything with a "red team" mindset.