Rationality and Climate Change
post by Emiya (andrea-mulazzani) · 2020-10-05T10:21:09.512Z · LW · GW · 26 comments
This is a question post.
Contents
Answers: jimrandomh (50), moridinamael (30), lsusr (19), Cato the Eider (15), ChristianKl (11), Liron (11), kithpendragon (9), shminux (8), Rafael Harth (8), Richard Horvath (7), D0TheMath (6), Axiomata (4), Charlie Steiner (2), Maria Y (1), Jens Nordmark (1)
26 comments
My master's thesis was on climate change and mass-media attitudes toward it. While working on it, I read over 300 newspaper articles on the subject and studied a large number of papers on its consequences, its solutions, and the attempts to frame the debate over them, so I consider myself well informed on it.
In the months I've been on this site, though, I've been greatly surprised by the apparently unconcerned attitude many users have toward this topic.
Given that
- to the best of my knowledge, we should instead be very concerned
- the subject is certainly of great relevance, since a lot of investment and many global consequences in the coming years will revolve around it
- we can't just agree to disagree
I plan to write a post that would cover this subject and hopefully resolve the disagreement.
Before doing that, though, I'd find it extremely useful if people would write to me what they believe, feel, and anticipate about climate change, its future consequences, and the processes that would be required to stop it, and why they think they believe, feel, and anticipate that way.
I'm greatly interested in receiving replies both from people feeling alarmed and from people not feeling alarmed.
If, while reading this question or writing your reply, you realise you'd like to read up more on the subject and update your beliefs, that's perfectly fine, but I'd ask that you write me your thoughts before doing so. If, instead, you had already read up on the subject before reading this question, that's perfectly fine too, and I'd like to read your thoughts anyway.
What I'm trying to understand are the beliefs of the users of this site at the current moment.
You can be as brief or as detailed as you'd like, and you can write either a reply under this post or a private message.
I'd like to thank in advance everyone who sends me their thoughts and dedicates a bit of their time to this question.
Answers
For most people, climate change is pretty much the only world-scale issue they've heard of. That makes it very important (in relative terms): climate change has a world-scale impact, and no other issue they're familiar with does.
LessWrong has a history of dealing with other world-scale issues, and EA (an overlapping neighboring community) likes to make a habit of pointing out all the cause areas and weighing them against each other. When climate change is weighed against AI risk, animal welfare, biosafety, developing-world poverty, and various meta-level options... well, AGW didn't get any less important in absolute terms, but you can see why people's enthusiasm and concern might lie elsewhere.
As a secondary issue, this is a community that prides itself on having high epistemic standards, so when the advocates of a cause area have conspicuously low epistemic standards, it winds up being a significant turn-off. When you have a skeptical eye, you start to automatically notice when people make overblown claims, and recommend interventions that obviously won't help or will do more harm than good. Most of what I see about AGW on social media and on newspaper front pages falls into these categories, and while this fact isn't going to show up on any cause-prioritization spreadsheets, on a gut level it's a major turnoff.
For an example of what I'm talking about, look into the publicity surrounding hydrogen cars. They're not a viable technology, and this is obvious to sufficiently smart people, but because they claim to be relevant to AGW, they get a lot of press anyway. The result is a con-artist magnet and a strong ick-feeling which radiates one conceptual level out to AGW-interventions in general.
↑ comment by ChristianKl · 2020-10-07T15:53:02.616Z · LW(p) · GW(p)
For an example of what I'm talking about, look into the publicity surrounding hydrogen cars. They're not a viable technology, and this is obvious to sufficiently smart people, but because they claim to be relevant to AGW
Elon made his bet on battery-driven cars. It's not clear to me that it's the right call. Hydrogen can be stored for longer timeframes, which means that in a world where most of our energy comes from solar cells, you can create it from surplus energy in the summer and use it up in the winter, while batteries can only be charged with energy that's available at the particular time you want to charge your car.
Replies from: AnthonyC
↑ comment by AnthonyC · 2020-10-23T12:41:44.639Z · LW(p) · GW(p)
Whether or not hydrogen-driven cars become viable, there are strong arguments (Nikola Motors aside) that hydrogen trucks are likely to become viable, more so than battery trucks, because of the energy demands and long distances driven. Autonomous vehicles could change the situation again, in either direction, depending on recharge vs refuel time and electricity vs hydrogen costs, and the price of that time and fuel relative to the value of vehicle uptime, especially for shared and fleet vehicles.
↑ comment by steven0461 · 2020-10-07T17:16:06.497Z · LW(p) · GW(p)
For most people, climate change is pretty much the only world-scale issue they've heard of. That makes it very important (in relative terms)
Suppose climate change were like air pollution: greenhouse gas emissions in New York made it hotter in New York but not in Shanghai, and greenhouse gas emissions in Shanghai made it hotter in Shanghai but not in New York. I don't see how that would make it less important.
Replies from: AnthonyC
↑ comment by AnthonyC · 2020-10-23T12:52:45.781Z · LW(p) · GW(p)
Local control seems very relevant to me, given our lack of powerful global governing institutions and other coordination mechanisms. In the localized-climate-change world, Shanghai may decide they're willing to tolerate more climate change than NY is, in which case NY can just pay more to prevent it and then reap the benefits themselves.
Our world doesn't have that option, which has been one of the biggest stumbling blocks to real at-scale action on climate change for as long as I've been alive. Instead we have to deal with all the political history of every current and past-but-remembered political conflict anyone can possibly dig up to negotiate concessions from one another.
↑ comment by lsusr · 2020-10-06T04:06:03.253Z · LW(p) · GW(p)
Furthermore, if you seek to contribute to a global cause via technical means then it often makes sense to specialize. If you know you can have a greater marginal impact in biosafety than AGW then you should allocate (almost) all of your altruistic attention to biosafety and (almost) none of it to AGW.
Epistemic status: You asked, so I'm answering, though I'm open to having my mind changed on several details if my assumptions turn out to be wrong. I probably wouldn't have written something like this without prompting. If it's relevant, I'm the author of at least one paper commissioned by the EPA on climate-related concerns.
I don't like the branding of "Fighting Climate Change" and would like to see less of it. The actual goal is providing energy to sustain the survival and flourishing of 7.8+ billion people, fueling a technologically advanced global civilization, while simultaneously reducing the negative externalities of energy generation. In other words, we're faced with a multi-dimensional optimization problem, while the rhetoric of "Fighting Climate Change" almost universally only addresses the last dimension, reducing externalities. Currently 80% of worldwide energy comes from fossil fuels and only 5% comes from renewables. So, simplistically, renewables need to generate 16x as much energy as they do right now. This number is "not so bad" if you assume that technology will continue to develop, putting renewables on an exponential curve, and "pretty bad" if you assume that renewables continue to be implemented at about the current rate.
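A back-of-envelope sketch of where that multiplier comes from, using the rough shares quoted above (illustrative figures, not precise statistics):

```python
# Reproducing the "16x" figure from the shares quoted in this answer.
# Both shares are approximate assumptions, not authoritative data.
import math

fossil_share = 0.80      # fraction of world energy currently from fossil fuels
renewable_share = 0.05   # fraction currently from renewables

# To replace today's fossil output, renewables would need to scale by roughly:
scale_factor = fossil_share / renewable_share
print(scale_factor)  # 16.0

# If renewable output keeps growing exponentially, that is only a handful of doublings:
print(math.log2(scale_factor))  # 4.0
```

Whether four doublings is "not so bad" or "pretty bad" then depends entirely on how long each doubling takes, which is the crux of the exponential-versus-current-rate assumption above.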
And we need more energy generating capacity than we have now. A lot more. Current energy generation capacity only really provides a high standard of living for a small percentage of the world population. Everybody wants to lift Africa out of poverty, but nobody seems interested in asking how many new power plants that will require. These power plants will be built with whatever technology is cheapest. We cannot dictate policy in power plant construction in the developing world; all we can do is try to make sure that better technologies exist when those plants are built.
I have seen no realistic policy proposal that meaningfully addresses climate change through austerity (voluntary reduced consumption) or increased energy usage efficiency. These sorts of things can help on the margins, but any actual solution will involve technology development. Direct carbon capture is also a possible target for technological breakthrough.
↑ comment by ChristianKl · 2020-10-07T16:11:40.908Z · LW(p) · GW(p)
So, simplistically, renewables need to generate 16x as much energy as they do right now.
You can drown in a river that's on average 1cm deep. The problem is a lot harder than simply producing 16x as much energy with renewables.
Replies from: Jay
↑ comment by Jay · 2020-10-14T23:17:54.587Z · LW(p) · GW(p)
One of the major problems with solar is that it's diffuse. The second law of thermodynamics means that diffusing energy is very easy and concentrating it is effortful. When you take a bunch of photocells that are producing milliamps of current at about a volt (i.e. milliwatts of power), the process required to combine their output into a usable voltage and current is rather inefficient. I don't have any recent data for how inefficient; does anyone?
Fossil energy is concentrated from the start. Nuclear energy isn't; turning dilute ores into concentrated fuels takes a good deal of processing (but it still works more scalably than solar).
↑ comment by Jay · 2020-10-07T02:04:29.687Z · LW(p) · GW(p)
We have been working on technological fixes for over 50 years, and we don't have anything that could realistically address the problem to show for it.* We should at least consider the possibility that a technological fix will not be available. **
Humans are often wrong-genre savvy. Most people in the rationalist community seem to think we're in a Star Trek prequel, but we may actually be in a big budget reboot of Decline and Fall of the Roman Empire. For what it's worth, the guy who's playing Caligula is a great performer. Huge talent. The biggest talent ever.
*Someone will inevitably say that we are just about to have a solar revolution. Some of us heard Jimmy Carter say that, and the promise to payoff ratio is getting a bit on the implausible side.
**I once had a job interview that went like this:
Interviewer: After coal is burned, we're looking for a way to turn the carbon dioxide back into coal. Can you do that?
Me (hesitantly): Yes, but it would consume energy.
Interviewer: Energy is available.
Me: I mean, it would consume so much energy that you'd be better off never having burned the coal. That's not really something you can engineer around; that's basic thermodynamics.
A brief pretense of completing the interview was made by both parties.
Replies from: moridinamael, ChristianKl, ChristianKl
↑ comment by moridinamael · 2020-10-07T14:43:42.809Z · LW(p) · GW(p)
Not sure that I disagree with you at all on any specific point.
It's just that "Considering the possibility that a technological fix will not be available" actually looks like staring down the barrel of a civilizational gun. There is no clever policy solution that dodges the bullet.
If you impose a large carbon tax, or other effective global policy of austerity that reduces fossil fuel use without replacing that energy somehow, you're just making the whole world poor, as our electricity, food, transportation and medical bills go up above even their currently barely affordable levels, and the growth of the developing world is completely halted, and probably reversed. If your reason for imposing a carbon tax is not "to incentivize tech development" but instead "punish people for using energy", then people will revolt. There were riots in France because of a relatively modest gasoline tax. An actual across-the-board policy implementation of "austerity" in some form would either be repealed quickly, would lead to civilizational collapse and mass death, or both. If you impose a small carbon tax (or some other token gesture at austerity and conservation) it will simply not be adequate to address the issue. It will at best impose a very slight damping on the growth function. This is what I mean when I say there is no practical policy proposal that addresses the problem. It is technology, or death. If you know of a plan that persuasively and quantitatively argues otherwise, I'd love to see it.
Replies from: Jay, greylag
↑ comment by Jay · 2020-10-07T22:31:45.418Z · LW(p) · GW(p)
I agree that it's technology or death. I'm just not seeing the necessary technology, or any realistic hope of inventing it. Which is why the comparison I used was the fall of the Roman Empire, which took Western Europe about a thousand years to fully recover from.
You might respond that I should go into renewable energy research to try to solve the problem. I did, for four years. I'm out of ideas.
↑ comment by greylag · 2020-10-07T16:49:37.034Z · LW(p) · GW(p)
If you impose a large carbon tax, or other effective global policy of austerity that reduces fossil fuel use without replacing that energy somehow, you're just making the whole world poor
For the case that our civilisation’s energy efficiency is substantially below optimal, see [Factor 4](https://sustainabilitydictionary.com/2006/02/17/factor-4/) (Lovins & Lovins, 1988)
↑ comment by ChristianKl · 2020-10-07T16:09:35.422Z · LW(p) · GW(p)
We have been working on technological fixes for over 50 years,
No, we did our best to stop that technology development with a lot of regulation. As a result, we stopped working on more efficient nuclear plants, which likely would have solved our problem if we had gone for them.
Replies from: Jay
↑ comment by ChristianKl · 2020-10-07T16:07:17.101Z · LW(p) · GW(p)
Me: I mean, it would consume so much energy that you'd be better off never having burned the coal. That's not really something you can engineer around; that's basic thermodynamics.
That seems like a major misunderstanding of our problem. We basically have the technology to create enough energy via solar cells. Our problem is that we don't have enough energy at the times when the sun doesn't shine. Our problem is energy storage, and if you could efficiently turn your energy into a form that can be stored for longer timeframes, everything would work out.
I believe climate change will have a significant (>90%) net-negative (>90%) impact on future human welfare in the foreseeable future. I believe (>90%) that climate change is a large-scale policy and technology problem, i.e. individual self-regulation has more to do with fuzzies than utilons. I have triaged climate change as a less-than-optimal target of my limited resources. Climate change therefore deserves no further attention from me.
To put it bluntly, I believe
- Strong AGI is potentially imminent.
- If strong AGI is imminent then it outweighs all other large-scale altruistic concerns.
- I personally have a non-negligible probability of impacting the trajectory of AGI.
AGI therefore overwhelms all my other actionable large-scale altruistic concerns. In particular, climate change is a relatively minor (>75%) threat I am unlikely to significantly influence directly via intentional action (>95%). Furthermore, climate change is likely (>75%) to have a relatively minor (affecting <10% of my overall material standard of living) negative personal impact on me. Though climate change is important to human welfare, I ought not to be "concerned" about it at all.
As for mass media's representation of climate change, I think it's crap—just like all other propaganda. This is by design [LW · GW].
I'll bite --- as a "not feeling alarmed" sort of person. First, though, I'll clarify that I'm reading "climate change" as shorthand for "climate change that is net-negative for human welfare" (herein CCNNHW), since obviously the climate is in a state of constant change.
Confidence levels expressed as rough probabilities:
0.70 : we are observing CCNNHW
0.80 : current human behavior increases probability of CCNNHW
0.10 : future magnitude of CCNNHW will be massive
0.98 : future human behavior will change, given CCNNHW
0.90 : some current and proposed mitigations are themselves NNHW
0.60 : some proposed mitigations have negative effects rivaling that of CC
0.50 : it's possible to design a net-positive mitigation [1]
0.10 : it's possible to implement a net-positive mitigation [2]
Taken together, I assign higher risk to collective, politically directed efforts to mitigate CC than to CC itself.
---
[1] non-linear feedback effects depress this value
[2] political processes depress this value
Human activities contribute to climate change.
Changing to renewable energy is both very expensive, as we don't have a good way to store energy, and a source of systemic risk, because sun and wind are unreliable in many geographies; and especially as more of our infrastructure depends on electricity instead of oil, an electricity outage of one or two weeks produces bigger problems.
There were studies that modeled hydropower plants as being able to store a lot of energy and release it when needed, but that's not how hydropower plants work. If a hydropower plant releases much more energy than average over a shorter timeframe, it floods the regions further down the river.
The electricity system requires that the amount of energy that gets pulled from the system is equal to the electricity that's put into the system. If that equality breaks down, the system breaks down. We don't have good mechanisms to reduce power consumption, so if not enough energy gets produced we usually have to create full power outages in a region.
There's not enough political will to switch our energy system to either renewable energy or nuclear plants in a timeframe that's enough to prevent undesirable climate change on its own. Given that a single actor can implement geoengineering, geoengineering will start being used sometime between 2030 and 2060 to reduce climate impact.
Nitrogen and phosphorus pollution should likely get more attention than it's currently getting, as those cycles are not working well.
Coal kills many people through direct air pollution. In cities, non-electric cars emit both air pollution and noise pollution. That means it's desirable to switch to electric cars and fewer coal plants quite apart from climate change concerns.
There's a good chance that the great stagnation is partly caused by the stagnation in energy prices, which had been falling year by year before the great stagnation. This means it's very valuable for future technological growth to have cheap energy.
AI safety, biosafety, and global peace all seem like more important cause areas, as they have more associated risk than climate change.
I agree with the other answers that say climate change is a big deal and risky and worth a lot of resources and attention, but it’s already getting a lot of resources and attention, and it’s pretty low as an existential threat.
Also, my impression is that there are important facts about how climate change works that are almost never mentioned. For example, this claim that there are diminishing greenhouse effects to CO2: https://wattsupwiththat.com/2014/08/10/the-diminishing-influence-of-increasing-carbon-dioxide-on-temperature/
Also, I think most of the activism I see around climate change is dumb and counterproductive and moralizing, e.g. encouraging personal lifestyle sacrifices.
↑ comment by Thomas Kwa (thomas-kwa) · 2020-10-06T01:22:15.888Z · LW(p) · GW(p)
That link is from a climate change denier, so it is probably taken grossly out of context or something.
Some actual facts I think most people don't know: Sea level rise is caused by melting glaciers + thermal expansion, not melting sea ice (because physics). Warming oceans might cause a decrease in tropical cyclone frequency and increase in intensity (page 163 of this IPCC report).
Replies from: steven0461
↑ comment by steven0461 · 2020-10-06T02:04:10.587Z · LW(p) · GW(p)
The link says a lot of things, but the basic claim that greenhouse forcing is logarithmic as a function of concentration is as far as I know completely uncontroversial.
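For reference, a minimal sketch of why the forcing is logarithmic, using the widely cited simplified fit from Myhre et al. (1998); the 278 ppm pre-industrial baseline is an assumption chosen for illustration:

```python
import math

# Simplified fit for CO2 radiative forcing (Myhre et al. 1998):
# delta_F ≈ 5.35 * ln(C / C0) W/m^2, i.e. logarithmic in concentration.
def co2_forcing(c_ppm, c0_ppm=278.0):
    """Approximate forcing relative to a pre-industrial baseline concentration."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(round(co2_forcing(2 * 278.0), 2))  # ~3.71 W/m^2 for one doubling
print(round(co2_forcing(4 * 278.0), 2))  # ~7.42 W/m^2: each doubling adds the same forcing
```

Each successive doubling of concentration adds the same forcing increment, which is the "diminishing influence" per unit of added CO2 that the linked post refers to.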
Climate change is obviously real and getting worse. We are seeing the early effects already, and they are straining our emergency measures beyond capacity. Immediate and widespread systemic changes are needed to alter course.
I am powerless to effect such changes.
I suspect that climate change is both overhyped and underhyped.
I expect that the current models underestimate the rate of change, and that the Arctic, permafrost, Greenland and eventually Antarctic will melt much sooner than projected, with the corresponding sea level rise. A lot of formerly livable places will stop being so, whether due to temperature extremes or ending up underwater.
That said, even the highest possible global warming will not exceed what happened 50 million years ago. And that time was actually one of the best for the diversity of life on Earth, and it could be again. What we have now is basically frozen leftovers of what once was.
That said, the scale of the warming is unprecedented, and so a lot of wildlife will not be able to adapt, and will go extinct, only for the new varieties of species to take their habitats.
That said, humans will suffer from various calamities and from forced migration north into livable areas. There will be population pressures that will result in the disappearance of the current Arctic states like Russia, Canada and Denmark's Greenland. And this will not happen without a fight, hopefully not a nuclear one, but who knows.
That said, there are plenty of potential technological ways to cool the planet down, and some may end up being implemented, whether unilaterally or consensually. This may happen as a short-term measure until other technologies are used to remove carbon dioxide from the atmosphere.
TL;DR: Climate change is a slow-moving disaster, but not an X-risk.
I am generally concerned, and also think this makes me an outlier. I don't have any specific model of what will happen.
This is a low information belief that could definitely change in the future. However, it doesn't seem important to figure out how dangerous climate change is exactly because doing something about it is definitely not my comparative advantage, and I'm confident that it's less under-prioritized and less important than dangers from AI. It's mostly like, 'well the future of life institute has studied this problem, they don't seem to think we can disregard it as a contributor to existential risk, and they seem like the most reasonable authority to trust here'.
A personal quibble I have is that I've seen people dismiss climate change because they don't think it poses a first-order existential risk. I think this is a confused framing that comes from asking 'is climate change an existential risk?' rather than 'does climate change contribute to existential risk?', which is the correct question because existential risk is a single category [LW · GW]. The answer to the latter question seems to be trivially yes, and the follow-up question is just how much.
↑ comment by habryka (habryka4) · 2020-10-06T19:35:42.458Z · LW(p) · GW(p)
It's mostly like, 'well the future of life institute has studied this problem, they don't seem to think we can disregard it as a contributor to existential risk, and they seem like the most reasonable authority to trust here'.
Woah, yeah, just let it be known that I don't think you should trust FLI with this kind of stuff. They seem to pretty transparently have messed up prioritization in this way a few times, trying to be more appealing to a broader audience, by emphasizing hypotheses that seem intuitively compelling but not actually very likely to be true, with the explicit aim of broadening their reach, as far as I can tell.
Of course, you are free to make your own judgement, but since I think there is a good chance others look at FLI and might think that I (and others) endorse their judgement here since they are kind of affiliated with the community, I want to make it clear that I very concretely don't endorse their judgement on topics like this.
↑ comment by Ben Pace (Benito) · 2020-10-06T07:10:19.674Z · LW(p) · GW(p)
FWIW I don't think the FLI is that reasonable an authority here, I'm not sure why you'd defer to them.
They do a good job coordinating lots of things to happen, but I think their public statements on AI, nukes, climate change, etc, are often pretty confused or are wrong. For example, their focus on lethal autonomous weapons seems confused about the problem we have with AI, focusing on the direct destructive capabilities of AI instead of the alignment problem where we don't understand what decisions they're even making and so cannot in-principle align their intent with our own.
I'm not sure I follow your point about "is" versus "contributes to". I don't think I agree that it doesn't matter whether a particular entity is itself capable of ending civilization. Nanotech, AI, synthetic biology, each have the ability to be powerful enough to end civilization before breakfast. Climate change seems like a major catastrophe but not on the same level, and so while it's still relevant to model over multiple decades, it's not primary in the way the others are.
Replies from: sil-ver
↑ comment by Rafael Harth (sil-ver) · 2020-10-06T08:57:54.850Z · LW(p) · GW(p)
I'm not sure I follow your point about "is" versus "contributes to". I don't think I agree that it doesn't matter whether a particular entity is itself capable of ending civilization. Nanotech, AI, synthetic biology, each have the ability to be powerful enough to end civilization before breakfast. Climate change seems like a major catastrophe but not on the same level, and so while it's still relevant to model over multiple decades, it's not primary in the way the others are.
Suppose it is, in fact, the case that climate change contributes 10% to existential risk. (Defined by: if we performed a surgery on the world state right now that found a perfect solution to c/c, existential risk would go down by that much.) Suppose further that only one percentage point of that goes into scenarios where snowball effects lead to Earth becoming so hot that society grinds to a halt, and nine percentage points into scenarios where international tensions lead to an all-out nuclear war and subsequent winter that ends all of humanity. Would you then treat "x-risk by climate change" as 1% or 10%? My point is that it should clearly be 10%, and this answer falls out of the framing I suggest. (Whereas the 'x-risk by or from climate change' phrasing makes it kind of unclear.)
FWIW I don't think the FLI is that reasonable an authority here, I'm not sure why you'd defer to them.
The 'FLI is a reasonable authority' belief is itself fairly low information (low enough to be moved by your comment).
Replies from: Benito, gjm
↑ comment by Ben Pace (Benito) · 2020-10-06T17:51:57.027Z · LW(p) · GW(p)
Would you then treat "x-risk by climate change" as 1% or 10%? My point is that it should clearly be 10%
Thanks! Your point is well taken, I'm generally pro being specific and clear in the way that you are being.
However, I have a clever counterargument, which I will now use for devastating effect!
(...not really, but I just realized this is very much like a conversation I had earlier this week.)
I was having a conversation with Critch and Habryka at a LessWrong event, where Critch said he felt people were using the term 'aligned' in a very all-or-nothing way, rather than discussing its subtleties. Critch made the following analogy (I'm recounting as best I can, forgive me if I have misremembered):
- Bob sees a single sketchy looking trial in his country's legal system and say that the justice system is unjust, and should be overthrown.
- Alice replies saying that justice is a fairly subtle category with lots of edge cases and things can be more or less just, and wants Bob to acknowledge all the ways that the justice system is and isn't just rather than using a flat term.
Critch was saying people are being like Bob with the term 'aligned'.
Habryka replied with a different analogy:
- Bob lives in a failed state surrounded by many sham trials and people going to prison for bad reasons. He says that the justice system is unjust, and should be overthrown.
- Alice replies <the same>.
As I see it, in the former conversation Alice is clarifying, and in the latter I feel like Alice is obfuscating.
I think often when I use the term 'x-risk' I feel more like Bob in the second situation, where most people didn't really have it on their radar that this could finish civilization, rather than just be another terrible situation we have to deal with, filled with unnecessary suffering and death. Of the two analogies I feel like we're closer to the second situation, where I'm Bob and you're Alice.
Returning from the analogy, the point is that there are some things that are really x-risks and directly cause human extinction, and there are other things like bad governance structures or cancel culture or climate change that are pretty bad indeed and generally make society much worse off, but are not in the category of extinction risk, and I think it's confusing the category to obfuscate which ones are members of the category and which aren't. In most conversations, I'm trying to use 'x-risk' to distinguish between things that are really bad, and things that have this ability to cause extinction, where previously no distinction was being made.
Replies from: sil-ver
↑ comment by Rafael Harth (sil-ver) · 2020-10-06T19:25:30.306Z · LW(p) · GW(p)
The analogy makes sense to me, and I can see both how being Bob on alignment (and many other areas) is a failure mode, and how being Alice in some cases is a failure mode.
But I don't think it applies to what I said.
there are some things that are really x-risks and directly cause human extinction, and there are other things like bad governance structures or cancel culture or climate change that are pretty bad indeed and generally make society much worse off
I agree, but I was postulating that climate change increases literal extinction (where everyone dies) by 10%.
The difference between x-risk and catastrophic risk (what I think you're talking about) is not the same as the difference between first-order and nth-order existential risk. As far as I'm concerned, the former is very large because of future generations, but the latter is zero. I don't care at all whether climate change kills everyone directly or via nuclear war, as long as everyone is dead.
Or was your point just that the two could be conflated?
↑ comment by gjm · 2020-10-06T18:46:37.458Z · LW(p) · GW(p)
It's not clear to me that in the scenario you describe 10% is a better figure to use than 1%.
Presumably the main reason for estimating such figures is to decide (individually or collectively) what to do.
- If 10% of current existential risk is because of the possibility that our greenhouse-gas emissions turn the earth into Venus (and if 10% of current existential risk is a large amount), then the things we might consider doing as a result include campaigning for regulations or incentives that reduce greenhouse gas emissions, searching for technologies that do useful things with less greenhouse gas emission than the ones we currently have, investigating ways of getting those gases out of the atmosphere once they're in, and so forth.
- If 10% of current existential risk is because of the possibility that we have a massive nuclear war and the resulting firestorms fill the atmosphere with particulates that lower temperatures and the remaining humans freeze to death, then the things we might consider doing as a result include campaigning for nuclear disarmament or rearmament, (whichever we think will do more to reduce the likelihood of large-scale nuclear war), finding ways to reduce international tensions generally, researching weapons that are even more directly destructive and have fewer side effects, investigating ways of getting particulates out of the atmosphere after a nuclear war, and so forth.
The actions in these two cases have very little overlap. The first set are mostly concerned with changing how we affect the climate. The second set are mostly concerned with changing whether and how we get into massive wars.
For what actual purpose is there any meaning to adding the two figures together? It seems to me if we're asking "what existential risk arises from climate change?" we are interested in the first type of risk, and the second type wants combining with other kinds of existential risk arising from nuclear war (people killed by the actual nuclear explosions, EMP screwing up electronics we need to keep our civilization going, fallout making things super-dangerous even after the bombs have stopped going off, etc.).
I'm not certain that this analysis is right, but at the very least it seems plausible enough to me that I don't see how it can be that "clearly" we want to use the 10% figure rather than the 1% figure in your scenario.
Replies from: sil-ver
↑ comment by Rafael Harth (sil-ver) · 2020-10-06T19:33:56.832Z · LW(p) · GW(p)
If 10% of current existential risk is because of the possibility that we have a massive nuclear war and the resulting firestorms fill the atmosphere with particulates that lower temperatures and the remaining humans freeze to death, then the things we might consider doing as a result include campaigning for nuclear disarmament or rearmament, (whichever we think will do more to reduce the likelihood of large-scale nuclear war), finding ways to reduce international tensions generally, researching weapons that are even more directly destructive and have fewer side effects, investigating ways of getting particulates out of the atmosphere after a nuclear war, and so forth.
In the hypothetical, 9% was the contribution of climate change to nuclear winter, not the total probability of nuclear winter. The total probability for nuclear winter could be 25%.
In that case, if we 'solved' climate change, the probability for nuclear winter would decrease from 25% to 16% (and the probability for first-order extinction from climate change would decrease from 1% to 0%). The total decrease in existential risk would be 10%.
I will grant you that it's not irrelevant where the first-order effect comes from -- if we somehow solved nuclear war entirely, this would make it much less urgent to solve climate change, since now the possible gain is only 1% and not 10%. But it still seems obvious to me that the number you care about when discussing climate change is 10% because as long as we don't magically solve nuclear war, that's the total increase to the one event we care about (i.e., the single category of existential risk).
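To make the accounting in this hypothetical explicit, here is a minimal sketch; every number is the assumed one from the example above, not an estimate:

```python
# Bookkeeping for the hypothetical numbers in this thread (all assumed for illustration).
p_nuclear_winter = 0.25      # total probability of extinction via nuclear winter
cc_share_of_nw = 0.09        # portion attributable to climate-change-driven instability
p_cc_direct = 0.01           # first-order extinction risk from climate change itself

# If climate change were surgically "solved":
p_nuclear_winter_after = p_nuclear_winter - cc_share_of_nw   # 0.25 -> 0.16
total_xrisk_reduction = p_cc_direct + cc_share_of_nw         # 0.01 + 0.09 = 0.10

# If nuclear war were instead solved entirely, the remaining gain from solving
# climate change would shrink to the first-order 0.01.
```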
Replies from: gjm
↑ comment by gjm · 2020-10-07T01:32:43.717Z · LW(p) · GW(p)
Ah, OK, I didn't read carefully enough: you specified that somehow "solving" climate change would reduce Pr(extinction due to nuclear winter) by 9%. I agree that in that case you're right. But now that I understand better what scenario you're proposing it seems like a really weird scenario to propose, because I can't imagine what sort of real-world "solution" to climate change would have that property. Maybe the discovery of some sort of weather magic that enables us to adjust weather and climate arbitrarily would do it, but the actual things we might do that would help with climate change are all more specific and limited than that, and e.g. scarcely anything that reduces danger from global warming would help much with nuclear winter.
So I'm not sure how this (to my mind super-improbable) hypothetical scenario, where work on climate change would somehow address nuclear winter along with global warming, tells us anything about the actual world we live in, where surely that wouldn't be the case.
Am I still missing something important?
Replies from: sil-ver
↑ comment by Rafael Harth (sil-ver) · 2020-10-07T10:12:43.967Z · LW(p) · GW(p)
But now that I understand better what scenario you're proposing it seems like a really weird scenario to propose, because I can't imagine what sort of real-world "solution" to climate change would have that property. Maybe the discovery of some sort of weather magic that enables us to adjust weather and climate arbitrarily would do it
I think the story of how mitigating climate change reduces risk of first-order effects from nuclear war is not that it helps survive nuclear winter, but that climate change leads to things like refugee crises, which in turn lead to worse international relations and higher chance of nuclear weapons being used, and hence mitigating c/c leads to lower chances of nuclear winter occurring.
The 1%/9% numbers were meant to illustrate the principle and not to be realistic, but if you told me something like, there's a 0.5% contribution to x-risk from c/c via first-order effects, and there's a total of 5% contribution to x-risk from c/c via increased risk from AI, bio-terrorism, and nuclear winter (all of which plausibly suffer from political instabilities), that doesn't sound obviously unreasonable to me.
The concrete claims I'm defending are that
- insofar as they exist, nth-order contributions to x-risk matter roughly as much as first-order contributions; and
- it's not obvious that they don't exist or are not negligible.
I think those are all you need to see that the single-category framing is the correct one.
Replies from: gjm
↑ comment by gjm · 2020-10-07T16:01:39.374Z · LW(p) · GW(p)
OK, so it turns out I misunderstood your example in two different ways, making (in addition to the error discussed above) the rookie mistake of assuming that when you gave nuclear war leading to nuclear winter (which surely is a variety of anthropogenic climate change) the latter was the "climate change" you meant. Oh well.
So, I do agree that if climate change contributes to existential risk indirectly in that sort of way (but we're still talking about the same kind of climate change as we might worry about the direct effects of) then yes, that should go in the same accounting bucket as the direct effects. Yay, agreement.
(And I think we also agree that cases where other things such as nuclear war produce other kinds of climate change should not go in the same accounting bucket, even though in some sense they involve climate change.)
Replies from: sil-ver
↑ comment by Rafael Harth (sil-ver) · 2020-10-07T18:21:06.654Z · LW(p) · GW(p)
So, I do agree that if climate change contributes to existential risk indirectly in that sort of way (but we're still talking about the same kind of climate change as we might worry about the direct effects of) then yes, that should go in the same accounting bucket as the direct effects. Yay, agreement.
(And I think we also agree that cases where other things such as nuclear war produce other kinds of climate change should not go in the same accounting bucket, even though in some sense they involve climate change.)
Yes on both.
This conversation is sort of interesting on a meta level. Turns out there were two ways my example was confusing, and neither of them occurred to me when I wrote it. Apologies for that.
I'm not sure if there's a lesson here. Maybe something like 'the difficulty of communicating something isn't strictly tied to how simple the point seems to you' (because this was kind of the issue; I thought what I was saying was simple hence easy to understand hence there was no need to think much about what examples to use). Or maybe just always think for a minimum amount of time since one tends to underestimate the difficulty of conversation in general. In retrospect, it sure seems stupid to use nuclear winter as an example for a second-order effect of climate change, when the fact that winter and climate are connected is totally coincidental.
It's somewhat consoling that we at least managed to resolve one misunderstanding per back-and-forth message pair.
- Climate change is only negative insofar as it causes negative change in human welfare.
- Human welfare in this framework is a function of natural environment (which includes climate) and all improvements added to this environment by human effort (e.g. roads, houses, electricity network etc.), which I will refer to as "capital".
- If the climate of an area changes to one that is better suited for human welfare (e.g. allows better crop yields, lessens the need of energy consumption for heating or cooling etc.), climate change has a positive effect.
- As capital needs to be replaced over time (e.g., infrastructure has to be repaired and maintained, which can be measured as a ratio of cost of maintenance to cost of production; let's call this the "replacement rate"), climate change will have a negative effect on human welfare if the change causes the replacement rate to increase (e.g., if expenses on air conditioning rise faster due to climate change than they drop on insulation from cold).
- For advanced civilizations climate change is inevitable.
- One cannot drain energy from a system without affecting it.
Hence, the higher we are on the Kardashev scale, the larger the impact of our energy consumption on our environment.
- Even if we change to other energy sources, the environment will still be affected. I do not see serious research into this. Even worse, it seems most people have the illusion that other energy sources might have zero effect on the environment. Large dams already show otherwise for hydroelectric, but it is not so clear for other sources.
As a thought experiment: imagine we are living in a world where the ratio of fossil fuel and wind energy usage is exactly the opposite. As CO2 emissions would be 1-2% of our world's, we would not be able to detect a negative effect from them on the climate. To me it is plausible that the same is the case with other sources of energy.
- We are bad at figuring out what the climate will do in the future and how consumption affects it.
- I am not very familiar with contemporary publications on this, but I am quite sceptical about our ability to make accurate predictions, especially as it is the local climate that mostly affects human welfare; global average temperature is a very weak proxy for this.
- In the case of human consumption, the whole supply chain across the product lifecycle must be mapped if we desire an accurate top-down solution. I do not see this in the proposed solutions. I have the impression they only deal with CO2 production during operation, ignoring production and decommissioning and all other negative externalities.
- The climate change issue is a discussion of an externality problem with weak understanding of causal effects and very large number of participants.
- There is a classic economics example:
Two firms are located on a river. The upstream firm pollutes the river, reducing output for the downstream firm.
- To modify this for climate change: replace the river with an ocean, increase the number of actors to 8 billion, allow them to form non-fixed sets (e.g., companies, towns, families), and have them all affect each other in a very small way which, summed up, changes the pollutedness in a specific direction but still increases production on some beaches of the ocean, though we do not know to what extent exactly (and here we haven't even elaborated on different jurisdictions).
As per the above, it is a difficult question. However, even if we found a good solution, the issue has become so politicised that carrying out any plan without massive disruption by interest groups is impossible.
When I began writing this, I thought very little good could be done by working on climate change, given how popular the topic is. But as I wrote and thought about the issue, I realized that you have a point, and that working on effective solutions to the problem has a high chance of being effective, if not particularly suited to me. I would enjoy seeing more in-depth analyses which do actual research and attach numbers to the vague feelings of importance I express here.
Using EA's usual decision matrix of Scale, Solvability, and Neglectedness:
Neglectedness, at first glance, seems very low. For the past 20 years there's been a huge media campaign to get people to "solve" climate change, and everyone's aware of it. However, very little effort is expended working & advocating for effective solutions to the problem (i.e., helping developing countries prepare), and much of the effort seems to be going to low-Solvability & low-Scale tasks such as attempting to prevent carbon emissions. Thus, despite near-constant media attention, it seems likely that effective solutions are actually very Neglected.
Scale seems pretty large. Those hit hardest will be the people with the least ability to invest in mitigation technologies, and most reliance on nature. Aka: developing countries. Thus lifting developing countries out of their poverty will be much harder in the near-term future. Notably, this poses little risk to the long-term flourishing of the human race, whereas other global catastrophic risks such as dangerous AI, nuclear war, biological war, etc. seem to have both a higher Scale, and higher Neglectedness.
Solvability seems like it'd range from insurmountably low to medium-high, depending on what you choose to focus on. Many of the problems that affect more affluent nations seem like they'd be solved through mitigating technologies, and not through reversing climate change's effects. Things like dams and levees are technologies we already have, and things that the Dutch (note: I looked that up, so I could provide a source, but I knew it was a thing already from an Environmental Science course I took during high school) already use to keep their cities above sea-level. I would bet there are other, similarly low-hanging technologies which would vastly lower the effects of climate change on developing countries. These developing countries would likely develop and implement these technologies once effects from climate change are seen, regardless of what they believe the cause of such climate change is.
Increases in resources here though, seem like they'd have little impact on the outcome for these developing countries. Since there is a large incentive for cities and companies to make and invest in these technologies, they will likely be developed regardless of what interventions are worked on.
By my understanding, even if we stopped all of our carbon output immediately, there'd still be a devastating 2C increase in the average temperature of the earth. And developing countries would be at a great disadvantage developing the infrastructure needed to mitigate its effects, so the Solvability here is incredibly low.
Thus the goal of "fighting" climate change should focus on providing developing countries the infrastructure they need to be prepared. This doesn't seem like particularly interesting work to me, nor particularly suited to my skills when compared to other ways of improving the world. However, I'd need more knowledge about the effects and the current effective interventions to be confident in my conclusions. Currently, counter to what I thought before writing this, the field seems promising.
↑ comment by Steven Byrnes (steve2152) · 2020-10-07T12:40:23.163Z · LW(p) · GW(p)
Developing technologies and best practices for enabling people to quickly adapt farming practices to a different local environment (rainfall, temperature, etc.), including education and outreach and possibly switching crops (or genetically engineering / breeding new varieties) along with associated farming tools and know-how and distribution and storage systems etc., would seem helpful for mitigating the damage of not only climate change but also nuclear winter / volcanic winter. While this seems very hard to do completely, it seems feasible to make progress on the margin. I haven't heard of anyone working on that (except ALLFED arguably) but haven't really looked.
I have noticed that work on adapting to climate change is sometimes regarded as taboo, I guess on the theory that it will undermine people's motivation to reduce carbon emissions. I don't believe that theory, not at all, but I gather that some people do. Admittedly it's hard to be 100% sure.
↑ comment by steven0461 · 2020-10-07T17:29:08.834Z · LW(p) · GW(p)
By my understanding, even if we stopped all of our carbon output immediately, there'd still be a devastating 2C increase in the average temperature of the earth.
I don't think this is true:
According to an analysis featured in the recent IPCC special report on 1.5C, reducing all human emissions of greenhouse gases and aerosols to zero immediately would result in a modest short-term bump in global temperatures of around 0.15C as Earth-cooling aerosols disappear, followed by a decline. Around 20 years after emissions went to zero, global temperatures would fall back down below today’s levels and then cool by around 0.25C by 2100.
I.e., if we're at +1.2C today, the maximum would be +1.35C.
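Restating the arithmetic of that scenario as a minimal sketch (the +1.2C figure for current warming is the assumption in the comment above, and the other numbers come from the quoted analysis):

```python
# Back-of-envelope restatement of the scenario above; all inputs come from the
# comment and the quoted analysis, not from independent data.
warming_today = 1.2     # degrees C above pre-industrial (assumed above)
aerosol_bump = 0.15     # short-term bump as Earth-cooling aerosols disappear

peak_warming = warming_today + aerosol_bump   # ~1.35 C, the stated maximum
# Per the quoted analysis, temperatures would then decline, falling back below
# today's level within ~20 years and cooling by roughly another 0.25 C by 2100.
```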
↑ comment by ChristianKl · 2020-10-08T01:20:43.391Z · LW(p) · GW(p)
This analysis assumes that we won't do geoengineering. If we do geoengineering to keep temperatures from increasing too much over the present point, all the spending on mitigation is wasted.
Replies from: jake-heiser
↑ comment by descent (jake-heiser) · 2020-10-14T07:26:55.178Z · LW(p) · GW(p)
This analysis assumes that we will succeed in geoengineering without further deleterious externalities, which has less than no current basis
Replies from: ChristianKl
↑ comment by ChristianKl · 2020-10-14T13:35:31.460Z · LW(p) · GW(p)
No, spending on adapting a country to be able to handle +2C warming doesn't help you with random deleterious externalities.
Replies from: jake-heiser
↑ comment by descent (jake-heiser) · 2020-10-14T21:19:53.767Z · LW(p) · GW(p)
I agree that responsible policy is preferable to ecosystem stress testing
Replies from: ChristianKl
↑ comment by ChristianKl · 2020-10-14T21:53:04.684Z · LW(p) · GW(p)
It seems like you ignore what the above exchange was about.
I am greatly concerned about the risks associated with climate change and have been for several years now, though earlier in my adult life I didn't know much about it and gave too much credence to skeptics such as Bjorn Lomborg. I anticipate that (barring some kind of singularity that makes a mockery of all prediction) the greatest harms from climate change this century will come from mass displacement and migration ("climate refugees"); indeed, already there are folks talking about leaving California to escape the ever-growing annual fire seasons. The same will happen (is happening) for those along flooding coastlines or in increasingly drought-stricken or fish-depleted regions. Also important to consider are tail risks, the small but non-negligible possibility that actual warming turns out rather higher than the (already bad!) average-case predictions (see Martin Weitzman's work, or David Wallace-Wells's famous NY Mag article "The Uninhabitable Earth").
If the recent hype from MIT about nuclear fusion is for real, maybe we can all breathe a sigh of great relief—it could turn out to be some of the best, and most significant, news of the century. We should have been building out old-fashioned nuclear power for decades now, but we are civilizationally inadequate to this sort of basic collective foresight and action. Other high-value actions include modernizing the electrical grid and increasing by orders of magnitude funding for basic research in clean energy, and of course a hefty carbon tax, for Christ's sake (civilizational inadequacies abound!). Geoengineering should be a last resort, since messing with the world's atmospheric/oceanic systems is what got us into this mess in the first place. They are complex nonlinear systems that we literally rely on being relatively stable for the continued existence of humanity; screwing up geoengineering, like screwing up artificial superintelligence, could be the last mistake our species makes.
Just writing out some current beliefs in stream of consciousness. Percentages in parentheses are confidence levels.
Global warming and climate change are happening, and we're well on track to pass a 2.5 C total rise. The best way to mitigate this is to reduce fossil fuel use approximately yesterday and cut other GHG emissions (85%). The second-most-important thing to do is to adapt to the changes; trying to sequester carbon or turn back the clock by other means is lower priority than that (60%).
The most camera-friendly impact felt by the developed world will be sea level rise, but I think the biggest problem will be drought and shifting climate patterns in the developing world (70%). A lot of people are going to die or be left in precarious situations (gut estimate: 300M displaced over the next 80 years). I think cataclysmic scenarios such as runaway greenhouse effect, releasing atmosphere-changing amounts of methane hydrates, or melting the Greenland ice sheet are relatively unlikely and not the main problem even after weighting them by importance (75%).
I am concerned about climate change, and I believe we need to have serious changes in the way our world works. We need to be more sustainability-minded in terms of economic growth, for example -- infinite growth is not reasonable. And we seriously need to switch to clean energy sources. The Inflation Reduction Act gives me a lot of hope for this. It's ridiculous how subsidized fossil fuels are. We're dumping money into destroying ourselves when we should be investing in reversing the damage. I'm concerned, but I have to be hopeful -- otherwise I'd be constantly depressed. I don't like the word "alarmed" because I am fighting for a better future by donating, voting, protesting, etc., and "alarmed" sounds like panic, which isn't helpful. I try not to think about climate change too often or else I'd feel totally stuck. Instead I generally try to stay up-to-date on how to help environmentalist movements and on the relevant policies and science.
I'm very concerned about climate change having a large negative impact. It seems unlikely to be threatening systemic collapse, but a lot of unnecessary suffering and a slowdown of long-run progress seems likely. Some small risk of extreme scenarios also seems to exist.
My view is that it's not a technological problem but a political one: it would be easy to solve with a global governing body. We have nuclear technology for power, and temporarily reduced availability of vehicle fuel seems to be a small problem that we could easily adapt to. The standard solution of taxing negative externalities should work just as well for this as for other things.
Because of our inept institutions, I believe the only likely solution, apart from accepting a dramatic adaptation with huge biodiversity loss, is that the damages motivate a coalition of major powers to strike a deal and use economic or political leverage to force everyone else into it. Either China and the US change their minds and the EU agrees happily, or India threatens unilateral solar radiation management. Both scenarios seem a couple of decades away.
While the issue is important and interesting, I'm quite pessimistic about getting a timely solution and about the ability of individuals to make a difference. I still believe most people in a hundred years will have a standard of living similar to Western people now, but lots of suffering will come between now and then.
26 comments
Comments sorted by top scores.
comment by lincolnquirk · 2020-10-05T12:13:59.013Z · LW(p) · GW(p)
My position is similar to that of 80,000 Hours: it seems like a super high impact cause, vying for the top with AI risk, pandemic risk, global poverty, and maybe 1 or 2 others. But it is far more widely recognized and worked on than those other causes. Enough so that it doesn't seem like the marginal thing I can do is interesting compared to other problems I could work on.
My models for how to work on it, if I did decide to work on it:
1) Technology - we should have technology that solves the problem if widely enough deployed. I think we are basically there with nuclear and solar PV plus energy storage, so I would probably only spend 10% or so of my time getting up to speed on the technology before focusing on
2) Policy - we need to convince people to deploy the technology. This seems bigger and harder than the technology side, for two reasons: a) society's nuclear blind spot, and b) the short-term interests of oil companies and the like, who are powerful opposition to any policy that would hurt them in the short run regardless of the long-term societal outcome.
I don't have a clear policy agenda, but it seems like some combination of a carbon tax, investment in PV, and nuclear is the right way to go. I currently expect that work on the nuclear blind spot would be the most leveraged thing. The reason we have a blind spot seems to be the work of environmentalists from the 70s; if we could get them to flip, that could propagate through society in a useful way.
Replies from: TheMajor↑ comment by TheMajor · 2020-10-05T14:08:05.152Z · LW(p) · GW(p)
I completely agree, and would like to add that I personally draw a clear line between "the importance of climate change" and "the importance of me working on/worrying about climate change". All the arguments and evidence I've seen so far suggest solutions that are technological, social(/legal), or some combination of both. I have very little influence on any of these, and they are certainly not my comparative advantage.
If OP has a scheme where my time can be leveraged to have a large (or at least probably cost-effective) impact on climate change, then this scheme would instantly be near the top of my priorities. But as it stands my main options are mostly symbolic.
As an aside, and also to engage with lincoln's points, I am highly sceptical of proposed solutions that require overhauls in policy and public attitude. These may or may not be the way forward, but my personal ability to tip the scales on these matters is slim to none. Wishing for societal change to suit any plans is just that: a wish.
Replies from: andrea-mulazzani, lincolnquirk↑ comment by Emiya (andrea-mulazzani) · 2020-10-05T14:53:46.535Z · LW(p) · GW(p)
I'm trying to reply as little as possible to the comments on this post, to avoid influencing the future replies I'll get, but in this case I felt it was better to do so, since this point is likely an important one in determining the interest users will have in this subject, and consequently how many replies I'll get.
I'm aware that it wouldn't be very useful to make a post exclusively aimed at making the users of this site feel more worried about climate change.
What the individual users of this site can do about it, considering the cost-effectiveness of the possible actions, will be treated extensively in the post I'm planning on the subject. I'd rather not try to summarise them here because I couldn't explain them effectively.
If anyone reading this comment shares TheMajor's opinions, please write them anyway.
↑ comment by lincolnquirk · 2020-10-06T00:45:59.150Z · LW(p) · GW(p)
Regarding one's ability to effect social change: it seems like the standard arguments about small-probability, high-impact paths apply. I think a lot of STEM types default to shying away from policy change, not because of comparative advantage (which would often be a good reason) but because of some blind spot in the way technologists talk about how to get things done in society. I think for historical reasons (the way the rationality community has grown) we tend to be biased towards technical solutions and away from policy ones.
Replies from: TheMajor↑ comment by TheMajor · 2020-10-06T08:19:48.086Z · LW(p) · GW(p)
I definitely agree that there is a bias in this community for technological solutions over policy solutions. However, I don't think that this bias is the deciding factor for judging 'trying to induce policy solutions on climate change' to not be cost-effective. You (and others) already said it best: climate change is far more widely recognised than other topics, with a lot of people already contributing. This topic is quite heavily politicized, and it is very difficult to distinguish "I think this policy would, despite the high costs, be a great benefit to humanity as a whole" from "Go go climate change team! This is a serious issue! Look at me being serious!".
Which reminds me: I think the standard counter-argument to applying the "low probability, high impact" argument to political situations applies: how can you be sure that you're backing the right side, or that your call to action won't be met with an equal call to opposite action by your political opponents? I'm not that eager to have an in-depth discussion on this in the comments here (especially since we don't actually have a policy proposal or a method to implement it), but one of the main reasons I am hesitant about policy proposals is the significant chance for large negative externalities, and the strong motivation of the proposers to downplay those.
Emiya said cost-effectiveness will be treated extensively, and I am extremely eager to read the full post. As I said above, if there is a cost-effective way for me to combat climate change this would jump to (near) the top of my priorities instantly.
comment by Vladimir_Nesov · 2020-10-05T18:26:08.981Z · LW(p) · GW(p)
My impression of the consensus is that at the scale of human civilization, climate change is expected to slowly inflict significant (but not global catastrophic) economic and humanitarian damage, sometimes forcing whole regions to change how they do things, and that it's cost-effective to coordinate while it's not entirely too late to reduce this damage by reducing climate change. Many people know this (it's not a neglected cause), so additional personal effort won't meaningfully affect things. There is essentially no direct existential risk. Is this a mistaken impression (about the consensus, not about the issue itself), are there more significant aspects?
So I'm not at all concerned about this, in the sense that being justifiably concerned would mean expecting additional effort or study in this direction to be one of the most valuable things I could be doing.
comment by Dagon · 2020-10-05T16:37:54.793Z · LW(p) · GW(p)
First, upvotes and kudos for asking about current attitudes and opinions before diving into specifics and explanation/exhortation on the topic. This is awesome - well done!
My general opinion is that this topic is too politicized to be a great fit for LessWrong. Objective modeling of climate change and making predictions might be OK, but somewhere before you get to "mass media attitude", you've crossed the line into talking about outgroup errors and things other than truth-seeking in one's own beliefs. Even when focusing on predictions and truth (for which other people's actions is definitely in scope), this is hard-mode discussion, and likely to derail by confusing positive with normative elements of the analyses.
I'd keep it off of LW, or perhaps have a linkpost and see the reaction - maybe I'm wrong.
My personal opinion: climate change (and more directly, conflict caused or exacerbated by it) is the single biggest risk to human-like intelligence flourishing in the galaxy - very likely that it's a large component of the Great Filter. And it's caused by such deep human drives (procreation and scope-insensitive caring about our young in the short-term) that it's probably inevitable - any additional efficiency or sustainability we undertake will get used up by making more people. I'd like to see more focus on how to get truly self-sufficient Mars (and asteroid/moons) colonies of at least 100K people with a clear upward slope, and on how to get at least 0.5B people to survive the collapse of earth, with enough knowledge and resources that the dark age lasts less than 300 years. I don't currently see a path to either, nor to a reduction in human population and resource usage that doesn't include more destruction in war than it's worth.
Replies from: niplav↑ comment by niplav · 2020-10-05T22:05:43.930Z · LW(p) · GW(p)
My personal opinion: climate change (and more directly, conflict caused or exacerbated by it) is the single biggest risk to human-like intelligence flourishing in the galaxy - very likely that it's a large component of the Great Filter.
I don't think the idea of the Great Filter fits very well here. The Great Filter would be something so universal that it eliminates ~100% of all civilizations. Climate change seems conditional on so many factors specific to Earth -- e.g. carbon-based life, greenhouse gas effects, an interdependent civilization -- that it doesn't really work well as a factor that eliminates nearly all civilizations at a specific level of development.
Replies from: Dagon↑ comment by Dagon · 2020-10-05T23:20:37.591Z · LW(p) · GW(p)
My suspicion is that it generalizes well beyond mechanisms of greenhouse gasses or temperature ranges. The path from "able to manipulate a civilization's environment at scale" to "able to modulate use of resources in order not to destroy said civilization", with an added element of "over-optimization for a given environment rendering a nascent civilization extremely vulnerable to changes in their environment" could easily be universal problems.
It's the fragility that worries me most - I believe that if we could remain calm and coordinate the application of mitigations, we could make it through most of the projected changes. But I don't believe that we CAN remain calm - I suspect (and fear) that humans will react violently to any significant future changes, and our civilization will turn out to be much much easier to destroy than to maintain.
Regardless of whether it's universal, that's the x-risk I see to our brand of human-like intelligent experiences. Not climate change directly, but war and destruction about how to slow it down, and over who gets the remaining nice bits as it gets worse.
Replies from: niplav
comment by Jay · 2020-10-05T11:10:01.612Z · LW(p) · GW(p)
My personal views on climate change are extremely heterodox among the rationalist community*, but not uncommon among intellectuals of other stripes:
- Climate change is real, and is exceeding what were previously considered worst case scenarios.
- It was kicked off by fossil fuels, but at this point methane clathrates may be a bigger contributor.
- It is having substantial effects on wildfires, hurricanes, and droughts. It will likely start having substantial impacts on commercial agriculture soon. That will probably eventually kill a lot of people. Like, more than half.
- When the ecological situation deteriorates that badly, peace is implausible.
- There's not much we can do about it. Renewable energy doesn't have the necessary EROEI** to sustain society. Nuclear has more potential, but the political problems are showstoppers and rare earths are, well, rare. About 3/4 of our energy comes from fossil fuels and there's little hope of that changing soon.
Since this thread is a poll, it should go without saying that reasonable people disagree. But I said it anyway.
*The "rationalist community" is centered around Silly Con Valley and tends to be credulous about the potential of technology.
**Energy return on energy invested. A recent solar plant in Spain managed an EROEI of roughly 3. An EROEI of 12 is thought to be sufficient to support a stripped-down, efficiency-oriented, zero-growth version of civilization as we know it. In the 1960s, oil wells with EROEIs of thousands were available; they're all but gone now.
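To make those footnote figures concrete, here is a minimal back-of-the-envelope sketch (my own illustration, assuming the simple definition EROEI = energy delivered / energy invested; the only inputs are the values quoted above). Since the share of gross output that must be reinvested in producing more energy is 1/EROEI, an EROEI of 3 means roughly a third of everything produced goes straight back into the energy sector, while 12 leaves over 90% for the rest of the economy.

```python
# Back-of-the-envelope sketch of what an EROEI figure implies for the share of
# gross energy output that must be reinvested in energy production itself.
# Assumes EROEI = energy delivered / energy invested, so the reinvested share
# is simply 1 / EROEI. Values below are the ones quoted in the footnote above.

def reinvested_share(eroei: float) -> float:
    """Fraction of gross energy output that goes back into producing energy."""
    return 1.0 / eroei

for label, eroei in [
    ("solar plant in Spain (quoted)", 3),
    ("civilisation-sustaining threshold (quoted)", 12),
    ("1960s oil well (quoted order of magnitude)", 1000),
]:
    share = reinvested_share(eroei)
    print(f"{label}: EROEI {eroei:>4} -> {share:.1%} reinvested, "
          f"{1 - share:.1%} left for everything else")
```

Whether 1/EROEI is the right accounting boundary (e.g. where storage, grid losses, and fabrication energy are counted) is essentially the methodological dispute raised in the reply below.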
Replies from: MrMind, greylag↑ comment by MrMind · 2020-10-05T15:47:11.386Z · LW(p) · GW(p)
Well, I share the majority of your points. I think that in 30 years millions of people will try to relocate to more fertile areas, and I think that not even the firing of the clathrate gun will force humans to coordinate globally. Although I am a bit more optimistic about technology, the current status quo is broken beyond repair.
↑ comment by greylag · 2020-10-07T17:01:15.019Z · LW(p) · GW(p)
I'm surprised at these EROEI figures: that solar PV is producing energy at very low levelised cost but utterly pathetic EROEI fails the sniff test. A quick scoot through Wikipedia finds a methodological argument (comments on https://www.sciencedirect.com/science/article/abs/pii/S0360544213000492?via%3Dihub).
Replies from: Jay↑ comment by Jay · 2020-10-07T22:17:27.430Z · LW(p) · GW(p)
Part of it is that high-performance solar cells require single-crystal silicon or gallium arsenide. The purification process for semiconductors is extremely energy intensive. The device fabrication processes are resource and energy intensive as well. But yes, storage is also a huge problem (especially for winter heating, etc.)
comment by steven0461 · 2020-10-06T01:48:49.810Z · LW(p) · GW(p)
I suspect this is one of those cases where the truth is (moderately) outside the Overton window and forcing people to spell this out has the potential to cause great harm to the rationalist community.
Replies from: andrea-mulazzani↑ comment by Emiya (andrea-mulazzani) · 2020-10-06T08:49:42.972Z · LW(p) · GW(p)
Can I ask for more information on why you think this could cause great harm to the community? I'd really rather not cause that, and I'm relatively new here, so I might be wrong in my expectations of how people could react.
Replies from: Vladimir_Nesov, steven0461↑ comment by Vladimir_Nesov · 2020-10-06T13:03:04.152Z · LW(p) · GW(p)
If we add expected technological change to the picture, climate change starts mattering even less. It's plausible that this kind of conclusion, when presented in sufficient detail as the position of the forum, can then be framed as some sort of climate change denialism or an attitude of gross irresponsibility. If the position on this topic is spelled out, it presents an opportunity for an attack that would otherwise be unavailable.
The alternative of only saying more politically correct things is even worse: it encourages motivated reasoning. To some extent my own comment [LW(p) · GW(p)] is already a victim (and perpetrator) of this problem, as the potential for such an attack motivated me to avoid mentioning that climate change matters much less than it otherwise would because of AGI timelines: with sufficient AGI-accelerated technological progress the whole issue becomes completely irrelevant in a reasonable time, unless we get some civilization-shattering catastrophe that delays AGI by more than decades (compared to a no-catastrophe scenario), in which case climate change would also be the least of our problems. So I chose to talk about the consensus and not the phenomenon itself, as AGI timeline considerations are not part of the standard discussion on the topic.
These arguments are not needed for my thesis that people who are not already working on climate change shouldn't be concerned about it. And regulating climate change is still (probably) the cost-effective thing to do, similarly to buying more of the things that are on sale at a grocery store; no reason to abandon that activity. But the above point is a vital component of my understanding of the seriousness of climate change as a civilizational concern, making it another order of magnitude less important than it would be otherwise.
Replies from: andrea-mulazzani↑ comment by Emiya (andrea-mulazzani) · 2020-10-06T15:04:33.334Z · LW(p) · GW(p)
It isn't at all my intention to frame the position of the forum as one of gross irresponsibility, or to use the replies I'll get to present the forum's position as one of climate change denialism (either in the sense that climate change isn't happening, that it won't be harmful, or that it shouldn't be avoided).
I also won't try to censor my post by including only statements that would be uncontroversial in a discussion among laypeople (I don't like to use "politically correct" with that meaning); I believe this is one of the few sites where one can be both polite and accurate in one's statements and also be perceived as such.
If you were instead worried that my question, the replies it got, or my planned future post, could be used by someone to attack the site or its users, I'd like to know more about it.
If it seems like a real risk, I'd take countermeasures, such as avoiding stating what the users' beliefs are in my future post (NOTE: I'm not planning to link any beliefs I talk about to any specific users; my current plan is just to address the common beliefs about the subject and try to provide good information and analysis about them) and preventing people from commenting on it. If what's already been said could itself be a likely source of damage, I could try to find ways to sink or delete this question and the replies I got.
So far the greatest potential risk I see is creating a toxic discussion among the members of this community.
I don't want that to happen, of course, but I feel that if the different positions aren't explained, and if any errors in them aren't corrected, toxic discussions could form every time related topics are mentioned in future posts.
In another discussion that touched related topics, I wrote at least two comments that I still endorse and think contain correct information and reasoning, but that I realised were unnecessarily angry in tone. Even worse, I realised my brain had switched on its "political debate" mode as I was writing a third one. All around, that discussion felt remarkably more similar to the kind of discussion one can see on an average site than to the level of discussion I usually see here, and I believe an important part of that is that there wasn't a widespread attempt to understand why people had different beliefs about the subject, and to figure out where the mistakes were.
Replies from: Vladimir_Nesov↑ comment by Vladimir_Nesov · 2020-10-06T15:47:22.280Z · LW(p) · GW(p)
The risk is of inciting a discussion that's easy to exploit for a demagogue (whether they participate in the discussion or quote it long after the fact). You don't have to personally be the demagogue, though many people get their inner demagogues awakened by an appropriate choice of topic. This indirectly creates a risk of motivated reasoning to self-censor the vulnerable aspects of the discussion. There's also a conflict about acceptability and harm of self-censoring of any kind, though discussing this at a sufficient level of abstraction might be valuable.
My reply in the grandparent is half-motivated by noticing that I probably self-censored too much in the original comment on this post. When it's just my own comment, however noncontroversial I expect its thesis to be, it's not yet particularly exploitable. If it eventually turns into a recurring topic of discussion with well-written highly upvoted posts, or ideologically charged highly debated posts, that might be a problem.
(To be clear, I don't know if the concern I'm discussing is close to what steven0461 was alluding to. Both the relevant aspect of truth and the harm that its discussion could cause might be different.)
Replies from: andrea-mulazzani↑ comment by Emiya (andrea-mulazzani) · 2020-10-06T17:01:34.079Z · LW(p) · GW(p)
I see. My current aim is to provide knowledge and reasoning that would actually lower the chances of such discussions happening, moving the subject of climate change away from ideology and political opinions.
I'll try to think of ways to further reduce the likelihood of exploitable discussions and demagoguing happening in my post. Knowing what I plan to write, I don't think such discussions would easily be created even if I didn't, though.
As for my attempt ending up increasing the likelihood of future posts, and those leading to harmful discussions... I think that would require people so determined to argue about this, and to ignore all the points I'd try to make, that the current lack of posts on the subject wouldn't be a sufficient barrier to stop them from arguing about it now.
Lastly, the site seems to me to have been designed with very effective barriers against such things spiralling out of control enough to cause non-trivial damage; though, since you have been on this site a lot longer than me, I feel I should value your intuition on the subject more than mine.
All things considered, weighing the risks of leaving the situation as it is against the benefits good reasoning on the subject could provide, it feels to me that what I should do is write my post and try to minimise the chances of the discussion on it turning out badly.
Replies from: Vladimir_Nesov↑ comment by Vladimir_Nesov · 2020-10-06T17:23:17.489Z · LW(p) · GW(p)
I'm not claiming that this is a likely scenario (it might be, but that's not the point). It's about the meaning, not the truth [LW(p) · GW(p)]. The question is what kind of hazards specifically steven0461 might be referring to, regardless of whether such hazards are likely to occur in reality ("has the potential to cause great harm" is also not a claim about high credence, only about great harm).
Personally I feel the forum finds the topic uninteresting, so that it's hard to spark a continuing discussion, even if someone decides to write a lot of good posts on it. I also don't expect a nontrivial amount of toxic debate. But that's the nature of risks, they can be a cause for concern even when unlikely to materialize.
↑ comment by steven0461 · 2020-10-06T17:27:35.304Z · LW(p) · GW(p)
I mostly agree with Vladimir's comments. My wording may have been over-dramatic. I've been fascinated with these topics and have thought and read a lot about them, and my conclusions have been mostly in the direction of not feeling as much concern, but I think if a narrative like that became legibly a "rationalist meme" like how the many worlds interpretation of quantum mechanics is a "rationalist meme", it could be strategically quite harmful, and at any rate I don't care about it as a subject of activism. On the other hand, I don't want people to be wrong. I've been going back and forth on whether to write a Megapost, but I also have the thing where writing multiple sentences is like pulling teeth; let me know if you have a solution to that one.
Replies from: andrea-mulazzani↑ comment by Emiya (andrea-mulazzani) · 2020-10-06T19:05:04.412Z · LW(p) · GW(p)
I agree with your evaluation of the strategic harm this meme would cause if spread. I will have to be careful not to spread this narrative when I write about this subject; it's not a risk I had already considered.
The likelihood of this narrative spreading doesn't feel any lower to me if I don't write my post, or if I hadn't written this question, though.
I posted this question specifically because I had noticed, on several occasions, comments that would support that narrative (especially if taken out of context), partly because they weren't thoroughly explaining the thought processes behind them, and I think I've seen a number of conversations below average for users of this site, so others could get that idea as well. But after hearing the reasoning in these comments, "not caring about climate change" is no longer how I see the viewpoint of the community, and I have a model of it that's a lot less negative (in terms of the utility values I assign to it). I still feel I can provide an improvement, though.
I'd be very interested in knowing how not to write a Megapost half the time I comment, instead. I can't help but obsess over whether I've been sufficiently explanatory and precise; writing this took me thirty-eight minutes by the clock.
comment by niplav · 2020-10-05T22:18:02.146Z · LW(p) · GW(p)
I would really like to see estimates of the cost of climate change, along with their probabilities. A paper I found is this one, but it is not quite up to my standards. It states that 1 billion humans will die counterfactually due to climate change.
Also, the probabilities given for human extinction from climate change are quite low in comparison to other risks (8% for a 10% human population decline from climate disaster, conditional on such a decline occurring, and 1% (probably even less) for a 95% decline by 2100).
Current belief: either something (nuclear war, biological catastrophe, unaligned AI, something else entirely) gets humanity before climate change does (27%); or humanity gets through and grows insanely quickly (Roodman 2020) (33%); or neither happens and we get basically the status quo with more droughts, famines, poverty, small-scale wars etc. due to climate change, which cause several hundred million to a few billion deaths over the next centuries but don't drive humanity extinct (17%); or something else entirely (23%). (Scenarios fit very loosely onto reality; probabilities are intuition after calibration training.)
Replies from: Jay