It would be helpful to see a calculation with your rates, the installed cost of batteries, cost of the space taken up, losses in the batteries and converter, any cost of maintenance, lifetime of batteries, and cost (or benefit) of disposal.
If you have 3 days' worth of storage, even if you completely discharge it in 3 days and completely charge it in the next 3 days, you would only go through about 60 cycles per year. In reality, you might get 10 full cycles per year. With interest rates and per-year depreciation, typically you would only look out around 10 years, so you might get ~100 discounted full cycles. That's why it makes more sense to calculate it based on capital cost, as I have done above. If you're interested in digging deeper, you can get free off-grid modeling software, such as the original version of HOMER (for the new versions you have to pay).
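As a rough illustration of the "capital cost per discounted cycle" framing, here's a minimal sketch. The 10 cycles/year and 10-year horizon are from the comment above; the $700/kWh installed cost (a figure mentioned elsewhere in this thread) and the 8% discount rate are my assumptions, and a lower discount rate would get closer to the ~100 cycles quoted:

```python
# Back-of-envelope cost per kWh cycled through multi-day battery storage.
capital_cost = 700.0   # $/kWh of storage capacity installed (assumed)
discount_rate = 0.08   # per year (assumed)
cycles_per_year = 10   # full cycles/year for multi-day storage (from the comment)
years = 10             # analysis horizon (from the comment)

# Present-value-weighted count of full cycles over the horizon
discounted_cycles = sum(cycles_per_year / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
cost_per_kwh_cycled = capital_cost / discounted_cycles
print(f"~{discounted_cycles:.0f} discounted cycles, "
      f"~${cost_per_kwh_cycled:.0f} per kWh delivered")
```

At roughly $10 per kWh delivered, this is around two orders of magnitude above typical grid electricity prices, which is the point of the comment.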
Even now at $1000/kWh retail it's almost cost-effective here to buy batteries to time-shift energy from solar generation to time of consumption. At $700/kWh it would definitely be cost-effective to do daily load-shifting, with the grid as a backup only for heavily cloudy days.
Please write out the calculation.
Have there been some recent advances in compressed air energy storage? The information I read 2-3 years ago did not look promising at any scale.
Aboveground compressed air energy storage (tanks) is a little cheaper than chemical batteries. But belowground large compressed air energy storage is much cheaper for days of storage, with estimates around $1 to $10 per kilowatt hour. Current large installations are in particularly favorable geology, but we already store huge amounts of natural gas seasonally in saline aquifers. So we can basically do the same thing with compressed air, though the cycling needs to be more frequent.
That does sound like an excessive markup. But my point is even with the wholesale price, chemical batteries are nowhere near cost-effective for medium-term (days) electrical storage. Instead we should be doing pumped hydropower, compressed air energy storage, or building thermal energy storage (and eventually some utilization of vehicle battery storage because the battery is already paid for for the transport function). I talk about this more in my second 80k podcast.
Yes, but the rest of my comment focused on why I don't think defection from just the electric grid is close to economical with the same reliability.
But with what reliability? If you don't mind going without power (or dramatically curtailed power) a few weeks a year, then you could dramatically reduce the battery size, but most people in high income countries don't want to make that trade-off.
And so are batteries.
Lithium-ion batteries have gotten a lot cheaper, but batteries in general have not. Lithium-ion is just now starting to become competitive with lead-acid for non-mobile applications. It's not clear that batteries in general will get significantly cheaper.
It's going to make sense for a lot of houses to go over to solar + batteries. And if batteries are too expensive for the longest stretch of cloudy days you might have, at least here a natural gas generator compares favorably.
In your climate, defection from the natural gas and electric grid is very far from being economical, because the peak energy demand for the year is dominated by heating, and solar peaks in the summer, so you would need to have extreme oversizing of the panels to provide sufficient energy in the winter. But if you have a climate that has a good match between solar output and energy demand, it gets better (or if you only defect from the electric grid). Still, even if batteries got 3 times cheaper to say $60 per kilowatt hour, and you needed to store 3 days of electricity, that would be about $4300 per kilowatt capital cost, which is much more expensive than large gas power plants + electrical transmission and distribution. Another big issue is that reliability would not be as high as with the central grid in developed countries (though it very well could be more reliable than the grid in a low income country).
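For concreteness, a minimal sketch of where the ~$4300/kW figure comes from (per kW of average load, which is how the comment frames it):

```python
battery_price = 60.0       # $/kWh, the hypothetical "3x cheaper" battery
storage_hours = 3 * 24     # 3 days of storage per kW of average load
capital_cost_per_kw = battery_price * storage_hours
print(f"${capital_cost_per_kw:.0f} per kW")   # $4320, i.e. ~$4300/kW
```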
While a power station could be up to 63% efficient, for a home generator maybe I'm looking at something like the 23% efficient Generac 7171, rated for 9 kW on natural gas at full load. Or maybe something smaller, since this is probably in addition to batteries and only has to match the house's average consumption. This turns my $0.06/kWh into $0.24/kWh, plus the cost of the generator and maintenance.
Yes, you would only want around 1 kW electrical, especially because the only hope to make this economical when you count the capital cost and maintenance is to utilize a lot of the waste heat (cogeneration), ideally both for heating and for cooling (through an absorption cycle, trigeneration). But though I don't think it works economically for a household (even in your favorable case of low natural gas prices and high electricity prices), you can have an economical cogeneration/trigeneration installation for a large apartment building, and certainly for college campuses.
Stress during the day takes years off people's lives. Is there any evidence that stress during dreams (not necessarily nightmares) has a similar effect? Then that could be a significant benefit of lucid dreaming to reduce stress.
So this seems like very strong evidence for 2%+ productivity growth already from AI, which should similarly raise GDP.
If you actually take all the reports here seriously and extrapolate average gains, you get a lot more than 2%. Davidad estimates 8% in general.
The labour fraction of GDP is about 60% in the US now, and not all labour is cognitive tasks, and not all cognitive tasks have immediate payoff. Furthermore, people could use the time savings to work fewer hours, rather than get more done. So I would guess the productivity in cognitive tasks should be divided by something like 4 to get actual increase in GDP.
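A sketch of that division, with illustrative factors of my own choosing (only the 8% estimate and the 60% labour share come from the discussion above) that happen to multiply to roughly 1/4:

```python
cognitive_gain = 0.08      # ~8% productivity gain on cognitive tasks (davidad's estimate)
labor_share = 0.60         # labour fraction of US GDP (from the comment)
cognitive_fraction = 0.6   # fraction of labour that is cognitive tasks (assumed)
realized_fraction = 0.7    # gains with immediate payoff, not taken as leisure (assumed)

gdp_gain = cognitive_gain * labor_share * cognitive_fraction * realized_fraction
print(f"Implied GDP gain: {gdp_gain:.1%}")   # ~2.0%
```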
Asking an ASI to leave a hole in a Dyson Shell, so that Earth could get some sunlight not transformed to infrared, would cost It 4.5e-10 of Its income.
Interestingly, if the ASI did this, Earth would still be in trouble because it would get the same amount of solar radiation, but by default it would also receive a similar amount of infrared from the Dyson swarm. Perhaps the infrared could be directed away from Earth, or perhaps an infrared shield or some other radiation management system could be placed above Earth. Similarly, even if the Dyson swarm were outside Earth's orbit, Earth would by default still get a lot of infrared from it. Still, it would not cost the ASI very much more of its income to actually spare Earth.
Why does the chart not include energy? Prepared meals in grocery stores cost more, so their increased prevalence would be part of the explanation. Also, grains got more expensive in the last 20 years partly due to increased use in biofuels.
As I mentioned, the mass scaling was lower than the 3rd power (also because the designs went from fixed to variable RPM and blade pitch, which reduces loading), so if it were lower than 2.4, that would mean larger wind turbines would use slightly lower mass per energy produced. But the main reason for large turbines is lower construction and maintenance labour per energy produced (this is especially true for offshore turbines where maintenance is very expensive).
You could build one windmill per Autofac, but the power available from a windmill scales as the fifth power of the height, so it probably makes sense for a group of Autofacs to build one giant windmill to serve them all.
The swept area of a wind turbine scales as the second power of the height (assuming constant aspect ratios), and the velocity of wind increases as roughly the 1/7 power of height. Since the power goes with the third power of the velocity, that means overall power ~ height^2.4. The problem is that the amount of material required scales roughly with the 3rd power of the height; this would be exactly the case with constant aspect ratios. The actual scale-up of wind turbines over the last few decades has not scaled that fast, partly because of higher-strength materials and partly because of optimization. Anyway, I agree there are economies of scale relative to micro wind turbines, but they aren't that large from a material perspective (they are mostly driven by labour savings).
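A quick sketch of the idealized scaling in the comment above (constant aspect ratios, 1/7-power wind shear):

```python
# Power ~ (swept area) * (wind speed)^3 ~ h^2 * (h^(1/7))^3 = h^(2 + 3/7)
power_exponent = 2 + 3 / 7   # ~2.43
mass_exponent = 3            # material ~ h^3 at constant aspect ratios

# Material per unit power scales as h^(3 - 2.43) = h^0.57: in this
# idealized case, bigger turbines need somewhat MORE mass per kWh,
# which is why labour savings drive the move to large turbines.
print(f"power ~ h^{power_exponent:.2f}")
print(f"mass per unit power ~ h^{mass_exponent - power_exponent:.2f}")
```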
Data centers running large numbers of AI chips will obviously run them as many hours as possible, as they are rapidly depreciating and expensive assets. Hence, each H100 will require an increase in peak power grid capacity, meaning new power plants.
My comment here explains how the US could free up greater than 20% of current electricity generation for AI, and my comment here explains how the US could produce more than 20% extra electricity with current power plants. Yes, duty cycle is an issue, but backup generators (e.g. at hospitals) could come on during peak demand if the price is high enough to ensure that the chips could run continuously.
If you pair solar with compressed air energy storage, you can inexpensively (unlike chemical batteries) get to around 75% utilization of your AI chips (several days of storage), but I’m not sure if that’s enough, so natural gas would be good for the other ~25% (windpower is also anticorrelated with solar both diurnally and seasonally, but you might not have good resources nearby).
Natural gas is a fact question. I have multiple sources who confirmed Leopold’s claims here, so I am 90% confident that if we wanted to do this with natural gas we could do that. I am 99%+ sure we need to get our permitting act together, and would even without AI as a consideration…
A key consideration is that if there is not time to build green energy including fission, and we must choose, then natural gas (IIUC) is superior to oil and obviously vastly superior to coal.
My other comment outlined how >20% of US electricity could be freed up quickly by conservation driven by high electricity prices. The other way the US could get >20% of current US electricity for AI without building new power plants is running the ones we have more. This can be done quickly for natural gas by taking it away from other uses (the high price will drive conservation). There are not that many other uses for coal, but agricultural residues or wood could potentially be used to co-fire in coal power plants. If people didn’t mind spending a lot of money on electricity, petroleum distillates could be diverted to some natural gas power plants.
How are we getting the power? Most obvious way is to displace less productive industrial uses but we won’t let that happen. We must build new power. Natural gas. 100 GW will get pretty wild but still doable with natural gas.
If we let the price of electricity go up, we would naturally get conservation across residential, commercial, and industrial users. There are precedents for this, such as Juneau, Alaska losing access to its hydropower plant, electricity getting ~6 times as expensive, and people reducing consumption by 25%. Now of course people will complain, and then they would support much more building, but we don't have to do the building first to get 20% of current electricity production for AI.
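The Juneau episode implies a price elasticity of demand of roughly -0.16, assuming a constant-elasticity response (my assumption, not from the comment):

```python
import math

price_ratio = 6.0     # electricity got ~6x as expensive (from the comment)
demand_ratio = 0.75   # consumption fell by 25% (from the comment)

elasticity = math.log(demand_ratio) / math.log(price_ratio)
print(f"Implied price elasticity: {elasticity:.2f}")   # ~ -0.16
```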
For those thinking about carbon, doing it in America with natural gas emits less carbon than doing it in the UAE where presumably you are using oil. Emissions are fungible. If you say ‘but think of our climate commitments’ and say that it matters where the emissions happen, you are at best confusing the map for the territory.
Though there are instances in the Middle East of using oil for electric power, this only happens because of massive subsidies. The true cost is extremely expensive electricity, so I think the UAE would be using natural gas.
Thanks for digging into the data! I agree that the rational response should be if you are predisposed to a problem to actively address the problem. But I still think a common response would be one of fatalism and stress. Have you looked into other potential sources of the nocebo effect? Maybe people being misdiagnosed with diseases that they don't actually have?
You might say that the persistence of witch doctors is weak evidence of the placebo effect. But I would guess that the nocebo effect (believing something is going to hurt you) would be stronger. This is because stress takes years off people's lives. The Secret of Our Success cited a study of the Chinese belief that birth year affects diseases and lifespan: Chinese people living in the US whose birth year was associated with cancer lived ~four years less than those born in other years.
I did have some probability mass on AI boxing being relevant. And I still have some probability mass that there will be sudden recursive self-improvement. But I also had significant probability mass on AI being economically important, and therefore very visible. And with an acceleration of progress, I thought many people would be concerned about it. I don't know that I would have predicted a particular ChatGPT moment (I probably would have guessed some large AI accident), but the point is that we should have been ready for a case where the public/governments became concerned about AI. I think the fact that there were some AI governance efforts before ChatGPT was due in large part to the people saying there could be slow takeoff, like Paul.
I'm surprised no one has mentioned Paul's long support (e.g.) of continuous progress meaning slow takeoff. Of course there's Hanson as well.
Interesting - I was thinking it was going to be about the analogy with collapse of civilization and how far we might fall. Because I am concerned that if we have a loss of industrial civilization, we might not be able to figure out how to go back to subsistence farming, or even hunting and gathering (Secret of Our Success), so we may fall to extinction. But I think there are ways of not pulling up the ladder behind us in this case as well (planning for meeting basic needs in low tech ways).
I don't have a strong opinion because I think there's huge uncertainty in what is healthy. But for instance, my intuition is that a plant-based meat with very similar nutritional characteristics to animal meat would be about as healthy (or unhealthy) as the meat itself. The plant-based meat would be ultra-processed. But one could think of the animal meat as ultra-processed plants, so I guess one could argue that this is the reason animal meat is unhealthy?
To me "generally avoid processed foods" would be kinda like saying "generally avoid breathing in gasses/particulates that are different from typical earth atmosphere near sea level".
People have been breathing a lot of smoke in the last million years or so, so one might think that we would have evolved to tolerate it, but it's still really bad for us. Though there are certainly lots of ways to go wrong deviating from what we are adapted to, our current unnatural environment is far better for our life expectancy than the natural one. As pointed out in other comments, some food processing can be better for us.
Kuhlemann argues that human overpopulation is the best example of an “unsexy” global catastrophic risk, but this is not taken seriously by the vast majority of global catastrophic risk scholars.
I think the reason overpopulation is generally not taken seriously by the GCR community is that they don't believe it would be catastrophic. Some believe that there would be a small reduction in per capita income, but greater total utility. Others argue that having more population would actually raise per capita income and could be key to maintaining long-term innovation.
This is a tricky thing to define, because by some definitions we are already in the 5 year count-down on a slow takeoff.
Some people advocate for using GDP, so the takeoff begins when you can see the AI signal in the noise (which we can't yet).
Nuclear triad aside, there's the fact that the Arctic is more than 1000 miles away from the nearest US land (about 1700 miles away from Montana, 3000 miles away from Texas), that Siberia is already roughly as close.
Well, there’s Alaska, but yes, part of Russia is only ~55 miles away from Alaska, so the overall point stands that Russia having a greater presence in the Arctic doesn't change things very much.
And of course, the fact the Arctic is made of, well, ice, that melts more and more as the climate warms, and thus not the best place to build a missile base on.
That's not what is being proposed - the proposal is building more bases in ports on land where the water doesn't freeze as much because of climate change.
If negative effects are worse than expected, it can't be reversed.
I agree that MCB can be reversed faster, but still being able to reverse in a few years is pretty responsive. There are strong interactions with other GCRs. For instance, here's a paper that argues that if we have a catastrophe like an extreme pandemic that disrupts our ability to do solar radiation management (SRM), then we could have a double catastrophe of rapid warming and the pandemic. So this would push towards more long-term SRM, such as space systems. However, there are also interactions with abrupt sunlight reduction scenarios such as nuclear winter. In this case, we would want to be able to turn off the cooling quickly. And having SRM that can be turned off quickly in the case of nuclear winter could make us more resilient to nuclear winter than just reducing CO2 emissions.
What about Wait But Why?
Nice summary! My subjective experience participating as an expert was that I was able to convince quite a few people to update towards greater risk by giving them some considerations that they had not thought of (and also by clearing up misinterpretations of the questions). But I guess in the scheme of things, it was not that much overall change.
What I wanted was a way to quantify what fraction of human cognition has been superseded by the most general-purpose AI at any given time. My impression is that that has risen from under 1% a decade ago, to somewhere around 10% in 2022, with a growth rate that looks faster than linear. I've failed so far at translating those impressions into solid evidence.
This is similar to my question of what percent of tasks AI is superhuman at. Then I was thinking if we have some idea what percent of tasks AI will become superhuman at in the next generation (e.g. GPT5), and how many tasks the AI would need to be superhuman at in order to take over the world, we might be able to get some estimate of the risk of the next generation.
I agree that indoor combustion producing small particles that go deep into the lungs is a major problem, and there should be prevention/mitigation. But on the dust specifically, I was hoping to see a cost-benefit analysis. Since most household dust is composed of relatively large particles, they typically do not penetrate beyond the nose and throat, and so are more of an annoyance than something that threatens your life. So, if one doesn't have particular risk factors such as peeling lead paint or allergies, I am skeptical that measures such as regular dusting (how frequently are you recommending?), not wearing shoes in the house, or choosing hardwood floors over carpet (giving up benefits such as sound absorption) would be cost-effective when you value people's time.
Recall that GPT2030 could do 1.8 million years of work[8] across parallel copies, where each copy is run at 5x human speed. This means we could simulate 1.8 million agents working for a year each in 2.4 months.
You point out that human intervention might be required every few hours, but with different time zones, we could at least have the GPT working twice as many hours a week as humans, so that would imply ~1 month above. As for the speed now, you say about the same to three times as fast for thinking. You point out that it also does writing, but it is verbose. However, for solving problems like that coding interview, it does appear to be an order of magnitude faster already (and this is my experience solving physical engineering problems).
AI having scope-sensitive preferences for which not killing humans is a meaningful cost
Could you say more about what you mean? If the AI has no discount rate, leaving Earth to the humans may require kindness of only ~1/trillion (within a few orders of magnitude). However, if the AI does have a significant discount rate, then delays could be costly to it. Still, the AI could make much more progress in building a Dyson swarm from the Moon/Mercury/asteroids, with their lower gravity and no atmosphere allowing the AI to launch material very quickly. My very rough estimate indicates sparing Earth might only delay the AI a month from taking over the universe. That could require a lot of kindness if it has a very high discount rate. So maybe training should emphasize the superiority of low discount rates?
I think "50% you die" is more motivating to people than "90% you die" because in the former, people are likely to be able to increase the absolute chance of survival more, because at 90%, extinction is overdetermined.
When asked on Lex's podcast to give advice to high school students, Eliezer's response was "don't expect to live long."
Not to belittle the perceived risk if one believes in 90% chance of doom in the next decade, but even if one has a 1% chance of an indefinite lifespan, the expected lifespan of teenagers now is much higher than previous generations.
Right, both ChatGPT and Bing chat recognize it as a riddle/joke. So I don't think this is correct:
If you ask GPT- "what's brown and sticky?", then it will reply "a stick", even though a stick isn't actually sticky.
Very useful post and discussion! Let's ignore the issue that someone in capabilities research might be underestimating the risk and assume they have appropriately assessed the risk. Let's also simplify to two outcomes: bliss expanding in our lightcone, or extinction (no value). Let's also assume that very low values of risk are possible but we have to wait a long time. It would be very interesting to me to hear how low different people (maybe with a poll) would want the probability of extinction to be before activating the AGI. Below are my super rough guesses:
1x10^-10: strong longtermist
1x10^-5: weak longtermist
1x10^-2 = 1%: average person (values a few centuries?)
1x10^-1 = 10%: person-affecting view: currently alive people will get to live indefinitely if successful
30%: selfish researcher
90%: fame/power loving older selfish researcher
I was surprised that my estimate was not more different for a selfish person. With climate change, if an altruistic person-affecting individual thinks the carbon tax should be $100 per ton carbon, a selfish person should act as if the carbon tax is about 10 billion times lower - ten orders of magnitude different, versus ~one order for AGI. So it appears that AGI is a different case in that the risk is more internalized to the actors. Most of the variance for AGI appears to be from how longtermist one is versus whether one is selfish or altruistic.
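The ten-orders-of-magnitude figure follows from a crude model in which a selfish actor internalizes only their own ~1/(world population) share of global climate damages (an assumption of mine that ignores how damages are actually distributed):

```python
altruistic_tax = 100.0     # $/ton carbon, person-affecting value (from the comment)
world_population = 8e9     # rough current world population

selfish_tax = altruistic_tax / world_population   # own share of global damages
print(f"Selfish-equivalent tax: ${selfish_tax:.1e}/ton")  # ~1.2e-08, i.e. ~1e10x lower
```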
Denkenberger posted two papers he wrote in regards to a 150Tg nuclear exchange scenario (worst case scenario, total targeting of cities). As far as I can tell, although the developed world doesn't come close to famine and there is theoretically enough food to feed everyone on Earth
To clarify, the world would have enough food if trade continues and if we massively scale up resilient foods. Trade continuing is very uncertain, and making it likely that we scale up resilient foods would require significantly more planning and piloting.
For the one paper, it is too early to tell. For the other, there just has not been very much engagement. Mainly the public debate has been between the Robock team, which is highly confident that full-scale nuclear war would cause nuclear winter, and the Los Alamos team, which is highly confident that it would not. We find the truth is likely somewhere in between. I talked about this in one of my 80k podcasts. Our analysis is quite similar to Luisa Rodriguez's analysis that cubefox links to below.
Thanks, Peter. That draft assumes global cooperation, which is likely too optimistic, so we have submitted another draft that also analyzes the case of breakdown of trade (hopefully public soon). We also have this paper that looks at the US specifically and takes into account food storage (and uncertainty of whether nuclear war would result in nuclear winter).
Great post! I've been mentioning for years that volunteering can be an effective way of making a contribution. Though many people think of volunteering as for a specific organization, I don't think it has to be, so a hobby could be an example. I think there are not enough volunteer opportunities in EA, and we've worked hard at ALLFED on our volunteer program. Not only have we had dozens of volunteers skill up, but they have also made significant contributions, often co-authoring journal articles and becoming full time staff. Thanks for the shout out! I'm actually still volunteering for ALLFED (and donating).
I'm probably a bit more concerned about monkeypox than you are, mainly because it has an alarmingly long incubation period (up to 14 days) and then a punishingly long infectious period (3-4 weeks).
So with doubling every 10.5 days, that would seem to mean a high R0 - what's your estimate? And really because some people are still being cautious about COVID, the true R0 (with normal behavior) would be even higher than what is measured now.
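One rough way to back out R from the doubling time, assuming simple exponential growth and a generation time of ~3 weeks (my assumption, loosely based on the long incubation and infectious periods quoted above):

```python
import math

doubling_time = 10.5      # days (from the comment)
generation_time = 21.0    # days (assumed)

growth_rate = math.log(2) / doubling_time       # per day
r_estimate = math.exp(growth_rate * generation_time)
print(f"Rough effective R: {r_estimate:.1f}")   # ~4 (two doublings per generation)
```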
I would say that is basically right. AC exhaust is about as humid as indoor air. The fraction of the summer cooling load due to infiltration really does depend on how tight your building construction is. With the numbers Jeff was assuming for a very old house, infiltration would be a much larger percentage. There are some other sources of heat in a house that come with humidity, such as people and showers, but overall it is much less humidity than bringing in outdoor air (there is also heat conduction through the walls, electricity use of lighting and appliances, etc.). So that might mean it would take you from a 25% efficiency loss (ignoring humidity) up to a 35% efficiency loss, which is still a big deal. But I'm not sure if 85°F in California typically corresponds to 50% relative humidity.
If you want to geek out on this you can use a psychrometric chart. For instance, if outdoor air is 85°F and 50% relative humidity (RH), that's an enthalpy of about 35 BTU/lb of dry air. Typical exit air conditions on the cool side of an air conditioner are ~50°F and 100% RH, so ~20 BTU/lb of dry air. The dehumidification portion would be going to 85°F and ~30% RH, or ~29 BTU/lb of dry air, so ~40% of the heat removed is in the form of condensing water (latent). This means you would take the sensible part and multiply by about 1.7 to get the total load on the air conditioner. If you were not drawing in outdoor air, the latent load would be much lower. So overall I think you're right that in CA the humidity correction is not as big as the other factors.
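The arithmetic above, as a sketch (the enthalpies are the chart readings quoted in the comment):

```python
# Enthalpies in BTU per lb of dry air, read off a psychrometric chart.
h_outdoor = 35.0       # 85°F, 50% RH outdoor air
h_coil_exit = 20.0     # ~50°F, 100% RH on the cool side of the air conditioner
h_dehumidified = 29.0  # 85°F, ~30% RH: coil-exit moisture content at outdoor temp

latent = h_outdoor - h_dehumidified      # heat removed by condensing water
sensible = h_dehumidified - h_coil_exit  # heat removed by cooling the air
total = h_outdoor - h_coil_exit

print(f"Latent fraction: {latent / total:.0%}")              # ~40%
print(f"Total/sensible multiplier: {total / sensible:.2f}")  # ~1.67
```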
The thermal time constant of a building is around a day, so you should really be running each of these tests for more than a day (and correcting for differences in ambient conditions). Basically, the control should exceed the average ambient temp because of solar and internal (e.g. electricity consumption) gains. And see my other comment about doing something about humidity removal. Then we might actually have something rigorous (based on doing an experiment with fairly expensive equipment, I still had error bars around +/-1°C, so I don't think you have very much confidence at this point).
I must admit I was surprised by the statistics here. It is true that if you only use the air conditioner a few days a year, the energy efficiency is not important. However, the cooling capacity is important, and I think many people above are using "efficiency" to mean cooling capacity.

Anyway, let's say the incremental cost of going from one hose to two hoses is $30. From working on Department of Energy energy efficiency rules, typically the marginal markup on an efficient product is less than the markup on the product overall (meaning that the incremental cost of just adding a hose is less than the $20 of buying one separately). It is true that with a smaller area for the air to come into the device through a hose, the velocity has to be higher, so the fan blades need to be made bigger (typically one motor powers two different fan blades on two sides, at least for window units). But then you could save money on the housing because the port is smaller, and the incremental cost of motors is low.

So if the air conditioner cost $200 to start with, that would be a 15% incremental cost. Then let's say the cooling capacity increases by 25% (I would say it actually does matter that a T-shirt was used, which would let in room air instead of just outdoor air, so the gain would probably be higher than this). What this means is that the two-hose unit actually has greater cooling capacity per dollar, so you should choose a smaller two-hose unit even if you don't care about energy use at all. Strictly this is only true with no economies of scale, which is not a great assumption, but I think it holds overall. Another case where this would break down is if a person were plugging and unplugging many times, but I don't think that's the typical person.

So I suspect what is going on is that people don't realize that the cooling capacity of the one-hose unit is reduced more than its cost, so they should just be getting a smaller-capacity two-hose unit (at lower initial cost and energy cost).
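A sketch of the capacity-per-dollar comparison using the comment's assumed numbers:

```python
base_price = 200.0      # one-hose unit price (assumed in the comment)
hose_increment = 30.0   # incremental cost of the second hose (assumed)
capacity_gain = 1.25    # two-hose capacity relative to one-hose (assumed, likely low)

one_hose_value = 1.0 / base_price                                # capacity per dollar
two_hose_value = capacity_gain / (base_price + hose_increment)   # capacity per dollar
print(f"Two-hose advantage: {two_hose_value / one_hose_value:.2f}x "
      f"capacity per dollar")   # ~1.09x
```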
There is a broader question here of whether there should be energy efficiency regulations. If people were perfectly rational and had perfect information, we would not need them. But not only are the incremental costs of energy efficiency regulations found to be economically beneficial by the US Department of Energy (basically a good return on investment), but a retrospective study found that the actual incremental cost of meeting the efficiency regulations was about an order of magnitude lower than predicted by the Department of Energy! So I think there's a very strong case for energy efficiency regulations.
I overlooked a crucial consideration raised by denkenberger here that reduces the efficiency loss ~2x.
Thanks - it looks like you are referring to the net infiltration flow rate impact on the building. But there was also the consideration of humidity, and I did not see any humidity measurements in the data, so we are not able to resolve that one. Humidity sensors are fairly cheap, but notoriously unreliable. But one could actually measure the amount of water condensed pretty accurately to get an idea of how much of the cooling of the air conditioner is going to condensing water versus cooling air (sensibly).
What is your estimate of the Metaculus question "Will there be a positive transition to a world with radically smarter-than-human artificial intelligence?" It sounds like it is much lower than the community prediction of 55%. Do you think this is because the community has significant probability mass on CAIS, ems, or Paul-like scenarios? What probability mass do you put on those (and are there others)?
Yes, 0.35 ACH is for the whole house. Most houses do not have active ventilation systems, so that's all you would get for the bedroom. But it is true that if you are worried about CO2, you should have higher ACH in bedrooms. This recommendation is not just about CO2, though, but also things like formaldehyde, and it is roughly the amount that houses get on average. I have seen studies showing that avoiding the cost of sick building syndrome is well worth the higher ventilation rates, so probably more houses should have active ventilation. But if you don't have active ventilation in a house, I think 0.35 ACH is a reasonable average. Apartment buildings will have active ventilation and higher occupant density, so the ACH will generally be higher, as you point out.
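To see why bedroom CO2 can be the binding constraint, here's a rough steady-state estimate under assumptions of mine: a closed-door 30 m³ bedroom getting the whole-house air change rate, and one sleeping adult emitting ~0.015 m³/h of CO2:

```python
bedroom_volume = 30.0    # m^3 (assumed: ~12 m^2 floor area, 2.5 m ceiling)
ach = 0.35               # air changes per hour (the whole-house figure above)
co2_generation = 0.015   # m^3/h for one sleeping adult (rough assumption)
outdoor_co2 = 420.0      # ppm

airflow = bedroom_volume * ach                 # m^3/h of outdoor air
delta_ppm = co2_generation / airflow * 1e6     # steady-state concentration rise
print(f"Steady-state bedroom CO2: ~{outdoor_co2 + delta_ppm:.0f} ppm")
# ~1850 ppm, well above the ~1000 ppm often used as a comfort benchmark
```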
Yes - it is quite leaky - the rule of thumb from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) for low-rise residential is more like 0.3 ACH. This would make your filtration look a lot better.