What risks concern you which don't seem to have been seriously considered by the community?
post by plex (ete) · 2020-10-28T18:27:39.066Z · LW · GW · 2 comments
This is a question post.
There are a few things I'm worried about which I have not seen discussed much, and it makes me wonder what we're collectively missing.
This seems like a question which has likely been asked before, but my Google-fu did not find it.
You don't need to make a watertight case for something being important in order to share your concern; brief explanations are fine if the high bar of writing something detailed would put you off posting.
Answers
I'm somewhat worried about this virus-immune bacterium outcompeting regular life because it can drop all the anti-viral adaptations.
It's a conceptually simple find/replace on functionally identical codons, which should make the bacterium immune to all viruses barring something like 60,000 specific viral mutations happening at once.
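To make the find/replace framing concrete, here is a minimal sketch of synonymous recoding. The specific codon swaps are my illustrative assumptions (loosely modelled on the recoded E. coli work, which retired two serine codons and a stop codon), not the exact scheme used:

```python
# Minimal sketch of genome-wide synonymous recoding: every occurrence of a
# "retired" codon is replaced by a synonym encoding the same amino acid.
# The proteins are unchanged, but once the retired codons are gone the cell
# can delete the tRNAs/release factors that read them -- and any virus whose
# genome still uses those codons can no longer be translated by the host.

RECODING = {
    "TCG": "AGC",  # serine -> serine (synonymous swap, chosen for illustration)
    "TCA": "AGT",  # serine -> serine
    "TAG": "TAA",  # amber stop -> ochre stop
}

def recode(orf: str) -> str:
    """Replace retired codons with synonyms, reading codon-by-codon in frame."""
    codons = (orf[i:i + 3] for i in range(0, len(orf), 3))
    return "".join(RECODING.get(c, c) for c in codons)

gene = "ATGTCGTTATCATAG"   # Met-Ser-Leu-Ser-stop
print(recode(gene))        # ATGAGCTTAAGTTAA: same protein, retired codons gone
```

(In a real genome this is far harder than a string replace: overlapping genes, regulatory elements, and the effects of codon choice on expression all have to be handled, which is part of why the foresight point below matters.)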
Viruses cause massive selection pressure:
"The rate of viral infection in the oceans stands at 1 × 10^23 infections per second, and these infections remove 20–40% of all bacterial cells each day." - https://www.nature.com/articles/nrmicro2644 (could not find good figures for land, plausibly they are a fair bit lower, but still likely high enough to be a huge deal)
Without them, I expect evolution to come up with all sorts of ways to make this bacterium much better at all the other parts of life.
A big part of my model is that much of our species diversity exists because the more successful a species is, the bigger a target it is for viral infection. Removing that feedback loop entirely, while at the same time giving one species a huge boost by letting it drop all the systems it evolved to stop viruses from killing it, seems very risky.
This is fundamentally different from anything evolution has ever cooked up or reasonably could, since it removes a set of the basic codons in a way that requires foresight: replacing each of the huge number of low-use codons has no value on its own, and removing the ability to process those codons (i.e. deleting the relevant tRNAs) is reliably fatal until every instance has been replaced.
To clarify: I don't think this is an x-risk, but it could wreck the biosphere in a way which would cause all sorts of problems.
They do claim to be trying to avoid it being able to survive in the wild:
For safety, they also changed genes to make the bacterium dependent on a synthetic amino acid supplied in its nutrient broth. That synthetic molecule does not exist in nature, so the bacterium would die if it ever escaped the lab.
Which is only mildly reassuring if this thing is going to be grown at scale, as the article suggests: there is (I think?) potential for that modification to be reversed by mutation, given enough attempts, since a modification that kills the cell unless it encounters a certain amino acid should be selected against whenever that amino acid is scarce.
↑ comment by plex (ete) · 2022-05-18T22:54:01.492Z · LW(p) · GW(p)
Followed up on the containment procedure, and the tests seem inadequate to bet the biosphere on:
[...] several experiments involving 100 billion or more cells and lasting up to 20 days did not reveal a single microbe capable of surviving in the absence of the artificial supplement.
Sorry, I'm not too familiar with the community, so I'm not sure whether this question is about AI alignment in particular or risks more broadly. Assuming the latter: I think the most overlooked problem is politics. I worry about rich and powerful sociopaths being able to do evil without consequences or even without being detected (except by the victims, of course). We probably can't do much about the existence of sociopaths themselves, but I think we can and should think about the best ways to increase transparency and reduce inequality. For what it's worth, I'm a negative utilitarian.
↑ comment by plex (ete) · 2020-10-28T19:15:54.025Z · LW(p) · GW(p)
The latter is correct, non-AI risks are welcome.
↑ comment by steven0461 · 2020-10-28T19:08:20.310Z · LW(p) · GW(p)
Reducing long-term risks from malevolent actors [EA · GW] is relevant here.
↑ comment by Vanilla_cabs · 2020-10-28T21:51:52.147Z · LW(p) · GW(p)
I worry about rich and powerful sociopaths being able to do evil without consequences or even without being detected (except by the victims, of course).
Many methods used to avoid detection by general population also work on the victims, including:
- hiding the evil deed or casting doubt on its existence
- removing knowledge of alternatives (silencing/redacting information about past and present alternatives), ending present alternatives
- demonizing alternatives
- guilt-tripping victims
- gaslighting
I'm worried about persuasion tools and the deterioration of collective epistemology they would likely bring. (I guess it's the deterioration I'm worried about, and persuasion tools are one way it could happen.) I hope to write a post about this soon.
I'm worried that much or most of the risk we're facing over the next 100 or so years comes from technologies that are not even on our radar. We do not seem to have a great track record for predicting which advancements are coming, and we seem to be at least as bad at predicting how they will be used or which further advancements they will enable. It seems likely to me that AI will make discovery faster and possibly cheaper.
It makes sense to me that we focus on problems that are already on our radar (AI alignment, synthetic biology), and some of our efforts to mitigate those risks might transfer to whatever else we find ourselves up against. And some people do seem to be worried about risks from emerging tech in a very broad sense (Bostrom and others at FHI come to mind). But I'm not sure we're taking seriously enough the problem of dealing with entirely unforeseen technological risks.
↑ comment by steven0461 · 2020-10-30T21:22:45.713Z · LW(p) · GW(p)
On the other hand, to my knowledge, we haven't thought of any important new technological risks in the past few decades, which is evidence against many such risks existing.
That we run out of easily usable cheap energy sources and our civilization reverts to a kind of permanent static feudal/subsistence structure at best, if it doesn't collapse outright.
I think this is ignored for a number of reasons.
On the right, there is an assumption that fossil fuels are not going to run out, that fuel for nuclear reactors is basically infinite, etc., so there is no problem other than the "hoax" of global warming. I think this is wrong because a) GW is not a hoax, according to several months I spent looking into the issue, b) fossil fuels are going to run out fairly soon in terms of fuels that take less energy to extract than they produce, and c) nuclear fuels would run out about as fast as fossil fuels if used at scale. The only exception is that we could last a couple of hundred years using breeder reactors, but who wants 5,000 sources of nuclear bomb raw material spread around the planet? And faith in technological progress is overdone; nuclear fusion still seems a long way away, and the state of the art in batteries has only been getting better at about 7% per year.
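(For scale on that last figure, a quick compounding check, using only the 7%/year number above:)

```python
import math

rate = 0.07  # ~7%/year battery improvement, as cited above

doubling_years = math.log(2) / math.log(1 + rate)
print(f"doubling time: {doubling_years:.1f} years")  # ~10.2 years
print(f"after 30 years: {(1 + rate) ** 30:.1f}x")    # ~7.6x
```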
On the left there is the belief that renewables will solve the problem. I have so far spent a couple of months trying to put together a picture of what a renewable economy would look like. So far it does not look like it adds up. The fundamental problems are threefold: 1) the low density of renewables makes the infrastructure extremely resource-, energy- and cost-intensive; 2) batteries are nowhere near good enough for many key requirements (air travel, shipping, heavy transport and seasonal energy storage); 3) we don't have a solution for other processes (concrete and steel manufacture). There are a lot of unproven ideas, but if you try to put together a solution from proven technologies it does not add up so far.
It is very difficult, perhaps impossible, even to match existing economic output with renewables. When you take into account locked-in population growth and assume the rest of the world catches up to first-world living standards, energy consumption becomes a huge multiple of current levels, and the goal becomes absurdly out of reach.
Most proposed solutions have as a vital step "then a miracle occurs"*. Our whole civilization is based on cheap energy. Without that, all our cleverness is in vain.
*As one example I was recently informed that my analysis had not taken into account the ability to mine uranium from asteroids and was thus faulty.
↑ comment by plex (ete) · 2020-10-29T14:25:53.065Z · LW(p) · GW(p)
I've run into people arguing this a few times, but so far no one continues the conversation when I start pulling out papers with recent EROEI of solar and the like (e.g. https://sci-hub.do/10.1016/j.ecolecon.2020.106726 which is the most recent relevant paper I could find, and says "EROIs of wind and solar photovoltaics, which can provide the vast majority of electricity and indeed of all energy in the future, are generally high (≥ 10) and increasing.").
Perhaps you will break the streak!
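(For anyone unfamiliar with the term: EROEI is energy returned on energy invested, so the fraction of gross output left over for society is 1 - 1/EROEI. A quick illustration, with assumed values:)

```python
# Net energy for a given EROEI: producing 1 unit of gross energy
# consumes 1/EROEI of a unit, leaving 1 - 1/EROEI for everything else.
for eroei in (2, 5, 10, 20):  # illustrative values; the paper reports >= 10
    net = 1 - 1 / eroei
    print(f"EROEI {eroei:>2}: {net:.0%} of gross output is net energy")
# EROEI 10 -> 90% net; the "energy cliff" only bites at low single digits.
```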
I am curious about the details of your model and the sources you're drawing from.
In particular, my understanding is that the curves of solar improvement are very positive, and solar is already at a stage where it can provide enough energy to keep civilisation running, though that will require a large investment. Batteries also seem to be within reach of being viable, with a big push that will come as the grid becomes less stable and there is economic pressure to smooth out power.
I have not looked into it in detail, but Musk seems to think electric planes are possible in the near future. If that's the case, I imagine shipping would also be possible? Maybe significantly more expensive because batteries cost more, but I don't think civilisation collapses if you bring shipping costs up by some moderately large but not extreme factor?
Looking over the cement manufacturing process, it seems that energy is just needed for heat? If that's the case, what would stop electricity replacing fossil fuels as the energy source?
and assume the rest of the world catches up to first world living standards
This seems like an unreasonable assumption? It is probably correct that we can't bring everyone up to USA level consumption, but that does not mean that civilisation will collapse, just that we won't be able to fix inequality at current technology levels.
Replies from: waveman, ChristianKl, Grothor
↑ comment by waveman · 2020-10-30T06:30:51.498Z · LW(p) · GW(p)
So many people have pointed me at that article that I am thinking about doing a copypasta.
tl;dr Overall, they are talking about solving a small part of the problem, in places that are unusually well favoured, using unrealistic assumptions.
Problem #1 is that it is almost entirely focused on electricity which is only roughly 25% of the problem.
Problem #2 is that it fails to take into account in its calculations the total system cost of delivering energy across the whole economy. It is not just a matter of solar cells. You need all the infrastructure, transmission lines (including for all that redundancy you need now), storage, etc.
Problem #3 is that it makes unrealistic assumptions. For example the use of pumped hydro for seasonal storage is just ludicrous. Do the math.
Problem #4 is that it only considers the problem for regions that are especially favoured: "regions with high solar and/or high wind resources".
I have done a ppt but I am revising it over the next weeks in response to comments. I will post it here when done.
> Musk seems to think
Argument from authority. In my ppt I only go with proven technologies; this rules out this kind of thing.
> the cement manufacturing process
The issue is that the process emits lots of CO2. For a renewable solution you need to expend large amounts of energy removing the CO2 from the air and finding a way to store it. There are ideas for how to store it, but none proven at this scale (billions of tons).
> assume the rest of the world catches up to first world living standards
I briefly outlined two scenarios above: the first is to match existing energy use; the second takes into account population growth and the economic growth that is happening in LDCs. The first is very difficult at best; the second seems not at all possible barring a miracle. I do not see how you are going to stop the locked-in population growth, and economic growth in LDCs is proceeding apace.
Multiplying the factors of population, reduced disparities, and modest growth in more developed countries results in an increase of 10-209 fold in energy consumption. Some people try to argue that we can have high economic growth without more energy, but cross-sectionally and temporally this would be very novel. Living standards and energy use are highly correlated. The one apparent exception, first-world countries recently, is just a result of outsourcing manufacturing to LDCs. When you take into account the energy embedded in imports, the richer countries are continuing to grow energy use rapidly (and effective CO2 emissions as well).
↑ comment by plex (ete) · 2020-10-30T11:57:22.741Z · LW(p) · GW(p)
Problem #1 is that it is almost entirely focused on electricity which is only roughly 25% of the problem.
What is the other 75% of the problem which can't be solved with electricity?
#2 <infrastructure>
I can see this raising the cost substantially, but given that only 8-9% of GDP is spent on energy, we can maybe eat that and survive?
#3 <pumped hydro for seasonal>
That does sound like a bad assumption, and lowers my opinion of any paper which makes it.
I have done a ppt but I am revising it over the next weeks in response to comments. I will post it here when done.
Looking forward to it.
For a renewable solution you need to expend large amounts of energy removing the CO2 from the air and finding a way to store it.
If the point of renewables is to stop climate change, yes. If the point is to keep civilisation running at all, no, you can just eat the CO2.
I do not see how you are going to stop the locked-in population growth, and economic growth in LDCs is proceeding apace.
Population growth, agreed. But, if energy costs start seriously rising, economic growth will naturally slow or reverse, no?
↑ comment by ChristianKl · 2020-10-30T16:17:06.779Z · LW(p) · GW(p)
I have not looked into it in detail, but Musk seems to think electric planes are possible in the near future. If that's the case, I imagine shipping would also be possible?
A plane has to store enough battery energy for a few hours of flight, while an ocean tanker needs enough fuel for weeks.
Hydrogen fuel cells seem to be the better solution for ships.
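A rough back-of-envelope makes the gap vivid (all figures here are my ballpark assumptions, not from either comment):

```python
# Why battery ships are harder than battery planes: specific energy.
MJ_PER_KG_FUEL_OIL = 40.0   # assumed specific energy of marine fuel oil
MJ_PER_KG_LI_ION = 0.9      # assumed ~250 Wh/kg for current Li-ion packs

fuel_load_tonnes = 3_000    # assumed fuel for a long container-ship voyage
fuel_energy_mj = fuel_load_tonnes * 1_000 * MJ_PER_KG_FUEL_OIL

# Credit the electric drivetrain with roughly double a diesel's efficiency:
battery_energy_mj = fuel_energy_mj / 2
battery_tonnes = battery_energy_mj / MJ_PER_KG_LI_ION / 1_000

print(f"~{battery_tonnes:,.0f} tonnes of batteries")  # ~67,000 tonnes
```

Under these assumptions the battery pack alone masses on the order of the ship's cargo capacity, which is why denser fuels like hydrogen look more plausible for long ocean crossings.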
↑ comment by Richard Korzekwa (Grothor) · 2020-10-29T15:40:53.526Z · LW(p) · GW(p)
It is probably correct that we can't bring everyone up to USA level consumption
Do you mean USA levels of consumption in the economic sense, or just energy consumption? If the former, this seems like a really big deal to me. But I'm guessing it's not the case. Right now we do many things in energy-inefficient ways because energy is so inexpensive.
Replies from: waveman, ete
↑ comment by waveman · 2020-10-30T06:33:37.317Z · LW(p) · GW(p)
> USA levels of consumption in the economic sense or just energy consumption?
Basically this is a false distinction. (I did say "first world" originally, not the US, which does seem to be somewhat profligate as a result of lower energy prices there; others tend to have high taxes on oil and coal.)
Copied from my comment above:
> Some people try to argue that we can have high economic growth without more energy, but cross-sectionally and temporally this would be very novel. Living standards and energy use are highly correlated. The one apparent exception, first-world countries recently, is just a result of outsourcing manufacturing to LDCs. When you take into account the energy embedded in imports, the richer countries are continuing to grow energy use rapidly (and effective CO2 emissions as well).
↑ comment by Emiya (andrea-mulazzani) · 2020-10-31T17:43:20.974Z · LW(p) · GW(p)
Standards of living abruptly stop correlating with economic growth after a certain point. The whole first world has been far past that point for a while now; economic growth only correlates with standard of living when it suddenly tanks in an economic crisis. Also, Europe has the same living standards as the USA and a per-capita carbon footprint about 1/4 the size. Part of that might be due to Europe moving a lot of polluting industries outside its borders, but that doesn't seem even close to explaining the whole gap.
↑ comment by plex (ete) · 2020-10-29T16:39:17.273Z · LW(p) · GW(p)
I mean the former, in terms of general economic wellbeing. It is a big deal, and obviously bad, if we can't bring everyone up to a decent level of economic prosperity, but it is not fatal to civilisation. We are already at current levels of inequality, and we still have a civilisation.
Replies from: waveman, Grothor
↑ comment by waveman · 2020-10-30T06:44:42.564Z · LW(p) · GW(p)
The arguments for a possible collapse [I am on the fence] are roughly
1. Many civilizations in the past collapsed when deprived of their source of energy. A smooth transition to a lower level of energy use is not the norm, though it has happened, e.g. Byzantium.
2. Complex systems tend to operate close to optimum, which makes for fragility. Turn the electricity off in NYC for two weeks and see what happens, for example. More on this in books like "The Collapse of Complex Societies" by Joseph A. Tainter.
3. Our civilization is global thus the collapse would likely be global.
↑ comment by Richard Korzekwa (Grothor) · 2020-10-29T16:59:11.249Z · LW(p) · GW(p)
It is plausible to me that this would be fatal to our civilization, in the long run. Eventually we need to stop being biological humans living on the surface of Earth, and it is not clear to me that we can move past that without much higher productivity than the present-day US.
Replies from: ete
↑ comment by plex (ete) · 2020-10-29T17:21:30.644Z · LW(p) · GW(p)
I agree that if technological development productivity were held at a low level indefinitely, that could be fatal, but that is a fairly different claim from the one waveman is making - which is that in the nearish term we will be unable to maintain our civilisation.
I am also hopeful that we can reach technological escape velocity with current or even fewer people with reasonable economic wellbeing.
Replies from: waveman, Grothor
↑ comment by waveman · 2020-10-30T06:37:43.295Z · LW(p) · GW(p)
I am also hopeful
I don't really think you can make an argument that a renewable economy is viable based on hopium-type arguments. As with the Club of Rome work, you would have to assume a massive increase in the rate of progress for this to work. In reality the problem seems to be the reverse: productivity increases seem to have slowed considerably.
There is a whole discussion about this both in the popular press and among economists: https://time.com/4464743/productivity-decline/ and https://www.intereconomics.eu/contents/year/2017/number/1/article/the-global-productivity-slowdown-diagnosis-causes-and-remedies.html
It is one thing even to assume present rates of improvement will continue; it is another to assume a dramatic turnaround against the current trend.
↑ comment by Richard Korzekwa (Grothor) · 2020-10-29T18:56:33.942Z · LW(p) · GW(p)
I do not mean technological development productivity, I mean economic productivity (how much stuff we're making, how many services we're providing).
↑ comment by ChristianKl · 2020-10-30T16:14:45.470Z · LW(p) · GW(p)
c) nuclear fuels would run out about as fast as fossil fuels if used at scale.
A decade ago everyone talked about peak oil, and we are now at a moment where increased oil production has reduced oil prices so much that facilities are getting shut down.
There are possibilities to extract uranium from seawater, and if there were higher demand for uranium, a lot of funding would go into making that process efficient.
↑ comment by algon33 · 2020-10-30T12:42:24.024Z · LW(p) · GW(p)
What about thorium? A back-of-the-envelope calculation suggests thorium reactors could supply us with energy for 100-500 years. I got this from a few sources. First, I used the figure of 170 GW-days produced per metric tonne of fuel (Fort St Vrain HTR) and the availability of fuel (500-2500 ktonnes according to Wikipedia) to estimate 10-50 years out of thorium reactors if we keep using 15 TW of energy. And that's not even accounting for breeder reactors, which can produce their own fuel. If we do go with the theoretical maximum, then we should multiply this figure by 50. I'm basing that estimate of the (probably peak) fuel efficiency of thorium on what Carlo Rubbia of CERN said (see the Wikipedia article above): 1 tonne of thorium can provide 200 times more power than 1 tonne of uranium. Since uranium produces ~45 GW-days per metric tonne of fuel, we get the estimate of 50 times. That gives a figure of 500-2500 15TW-years.
Supposing that we really need four or five times the amount of energy we actually use leaves us with an upper bound of ten times the naive estimate (50x from breeding, divided by ~5x demand). So I'd estimate thorium could provide 100-500 75TW-years.
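A short script reproducing the arithmetic above, using only the figures given in this comment (all of them rough):

```python
# Reproduce the back-of-envelope thorium estimate (figures from the comment).
GWD_PER_TONNE = 170            # GW-days per tonne, Fort St Vrain HTR
RESERVES_KT = (500, 2_500)     # thorium availability per Wikipedia, ktonnes
DEMAND_GW = 15_000             # 15 TW of energy use
DAYS_PER_YEAR = 365.25

for kt in RESERVES_KT:
    years = kt * 1_000 * GWD_PER_TONNE / DEMAND_GW / DAYS_PER_YEAR
    print(f"{kt} kt: ~{years:.0f} years at 15 TW")
# -> ~16 and ~78 years, i.e. the "10-50 years" ballpark above.

# With breeding (Rubbia's ~200x-uranium claim, i.e. ~50x the figure above),
# the rounded 10-50 year range becomes 500-2500 15TW-years; at five times
# the demand (75 TW) that is 100-500 75TW-years, as estimated above.
```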
Time travel. As I understand it, you don't need to hugely stretch general relativity for closed timelike curves to become possible. If adding a closed timelike curve to the universe adds an extra constraint on the initial conditions of the universe, and makes most possibilities inconsistent, does that morally amount to probably destroying the world? Does it create weird hyper-optimization toward consistency?
I'm pretty sure we can leave this problem to future beings with extremely advanced technology, and more likely than not there are physical reasons why it's not an issue, but I think about it from time to time.
↑ comment by gbear605 · 2020-10-28T21:03:50.072Z · LW(p) · GW(p)
Could you expand on "you don't need to hugely stretch general relativity for closed timelike curves to become possible"? After all, the only thing keeping everyone from flying off of the planet is a minus sign in different equations relating to gravity, but we don't worry about that as a risk. Are the changes to general relativity similar to that, or more like "relativity allows it but it has some huge barrier that we can't currently surpass"?
Replies from: steven0461↑ comment by steven0461 · 2020-10-28T21:14:45.945Z · LW(p) · GW(p)
It's been a while since I looked into it, but my impression was something like "general relativity allows it if you use some sort of exotic matter in a way that isn't clearly possible but isn't clearly crazy". I could imagine that intelligent agents could create such conditions even if nature can't. The Internet Encyclopedia of Philosophy has a decent overview of time travel in general relativity.
There are a few things that concern me. CRISPR proliferation with gene drive tech is a near and looming threat. It won't be long before large numbers of individuals will be able to design and deploy genetically modified organisms with minimal regulation or oversight. The ways in which this can go wrong are limited only by your imagination.
In order of magnitude:
Psychological and political consequences of climate change leading to a significantly larger likelihood of botching AI and killing all present and future humans.
I can safely assert that if the expected consequences of climate change do show up, our society will become a lot dumber and a lot more likely to screw this up.
Imagine mankind being as close to AGI as we will be in ten years, during a political situation similar to the Cold War. Things are likely to get at least that bad if most powers feel they have to contend for resources against each other, not to mention that we are likely to get a lot of nationalist governments when the global situation gets that chaotic and people start to panic. From there to "if we don't improve our AIs, the hated neighbouring enemy will, and will crush us for sure" is a pretty brief step. Not to mention "but we do have to fix this mess somehow; I'm sure you have already been careful enough and we are ready to do this", or "I'm sure we have been cautious enough; people are dying as we speak, we have to do this now".
Environmental and social consequences of climate change managing to collapse the existing civilisation, by putting too much stress on the vulnerable systems needed to keep it alive (especially water and food) and such stress going into a positive feedback loop, with the result of permanent death for any present human who hasn't already received cryonic preservation and hidden their body/brain somewhere safe.
From what I've seen, the models of climate change consequences are overly optimistic, simply because we have no real data on what actually happens if so many things change at once in the environment. These models describe consequences like one third of all animal and plant species going extinct, and one billion climate refugees. These are disasters an order of magnitude greater than anything we have directly observed so far, and it's a safe bet such big changes would produce a lot of other effects.
Industrial overproduction and overconsumption of resources.
To actually get a society that won't crash horribly from resource exhaustion before we get AGI right (with consequences similar to the above point), it's urgent that we dial back industrial production a lot.
We are overproducing so much and wasting so many resources and outputting so much unnecessary pollution that it's ridiculous.
If you actually read the IPCC's report, their suggested solutions are, very simplified: mostly don't waste, replace everything you can with cleaner energy sources, and invest heavily in trying to reabsorb excess greenhouse gasses. Most of the analyses I saw, instead, claimed the climate crisis was unsolvable because renewables couldn't possibly provide all the energy that we need RIGHT NOW, without really considering how much we are using compared to what we need.
All of this excess production and pollution isn't even doing anything positive; it's not improving our living standards in any way. It's just a waste, and a way for a small percentage of very rich people/corporations to make an increased profit (which doesn't trickle down in any way to the general populace, as wealth concentration and quality of life clearly show).
But if you mention that we need to use fewer resources and decrease industrial production, some people just seem to assume that you are a Luddite or that you want everyone to go back to the caves. I think there is something similar to cheering for the technology+industry team going on, and that's a pretty dangerous bias.
It seems like the nanotech we get soon isn't grey-goo-based but protein-folding-based. Risks from having solved protein folding and being able to custom-design proteins for specific ends seem not to be talked about.
I'm afraid that we're technologically developing too slowly and are going to lose the race to extraterrestrial civilizations that will either proactively destroy us on Earth or limit our expansion. One of the issues with this risk is that solving it runs directly counter to the typical solutions for AI-risk and other We're Developing Too Quickly For Our Own Good-style existential risks.
To prevent misaligned AI we should develop technology more slowly and steadily; MIRI would rather we develop AI 50 years from now than tomorrow, to give them more time to solve the alignment problem. From my point of view, those 50 years of slowed development may be what makes or breaks our potential future conflict with aliens.
As technology level increases, smaller and smaller advantages in technological development time will necessarily lead to one-sided trounces. The military of the future will be able to absolutely crush the military from just a year earlier. So as time goes on it becomes more and more imperative that we increase our speed of technological development to make sure we never get trounced.
Risks from weaponizing space, and from multiple countries having tungsten rods in orbit that are as destructive as nuclear weapons, seem to be underappreciated.
2 comments
Comments sorted by top scores.
comment by Foyle (robert-lynn) · 2020-10-29T12:52:17.815Z · LW(p) · GW(p)
delete
Replies from: RedMan
↑ comment by RedMan · 2020-10-30T18:33:38.745Z · LW(p) · GW(p)
Instead of uploading humans to create a large mess of AIs, let's connect humans together as soon as it's safe to do so (maybe at first only the elderly and bedridden, eventually anyone who can wear a hat) then add machines and maybe even animals (sup elephants and dolphins) to create a single gigantic worldbrain. As computer simulations of brain tissue get better, the AI will go from being mostly human to mostly artificial. The death of a fully integrated human body wouldn't cause an interruption in that human's consciousness, because most of it would be distributed across the entire worldbrain.
I believe that extant technology could be used to do this, and I actually wrote up a technical proposal that I didn't disseminate (it wasn't great and I didn't see anyone being persuaded by it, so I trashed it). The technical risk is mostly in testing and in some assumptions about the way the brain works that I view as 'plausible' given the state of the art, but far from 'proven'.