How to Think About Climate Models and How to Improve Them
post by clans · 2022-12-07T19:37:14.298Z
This is a link post for https://locationtbd.home.blog/2022/12/05/how-do-climate-models-work/
I wrote an analysis of the skill of past climate models and addressed some critical constraints that current techniques are bumping up against. I point out that it is probably time to turn to the market to do a lot of this work, particularly with regard to the risk in flood- and drought-prone areas, which are often under-developed.
On current constraints:
...
Though the IPCC continues to put out assessment reports with more sophisticated models, the increase in complexity has been hard fought. In the 90s the average resolution was 300 km and today it is between 50-100 km, but each halving of grid spacing requires roughly an order of magnitude more compute. Weather modeling is already a significant use of high-performance computing (HPC) systems globally, and today models are being developed that simulate at the 1 km scale for ~10 days. These models are capable of representing complex interactions between vegetation and soil carbon, marine ecosystems and ocean currents, and human activities, which can then be piped into larger models to predict local climate instabilities and extremes over longer periods.
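To see why each halving of grid spacing costs roughly an order of magnitude, here is a back-of-the-envelope sketch (my own simplification, assuming cost scales with the number of horizontal cells times the number of time steps, with the time step shrinking linearly with grid spacing):

```python
# Rough scaling of climate-model compute with horizontal grid resolution.
# Assumes cost ~ (horizontal cells) x (time steps), with the time step
# shrinking linearly as the grid is refined (a CFL-style constraint).
# This is an illustrative toy calculation, not a real model benchmark.

def relative_cost(resolution_km: float, baseline_km: float = 300.0) -> float:
    """Compute cost relative to a baseline-resolution model."""
    refinement = baseline_km / resolution_km
    return refinement**2 * refinement   # more cells in 2D, plus shorter time steps

for res in [300, 150, 100, 50, 10, 1]:
    print(f"{res:>4} km grid: ~{relative_cost(res):,.0f}x the compute of a 300 km grid")
```

Halving the spacing gives a factor of eight under these assumptions, which is close to the "order of magnitude per halving" figure above.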
I think the demand for weather simulations is a good case study here. Regional weather models now run at roughly 1-10 km resolutions, which are needed to predict convection and rainfall, and they have improved rapidly since the 90s. Predictions of weather events on the timescale of days are extremely valuable, particularly for energy markets, and the US market has grown at around a 9% CAGR over the past decade or so. NOAA has found that 3-6% of variability in US GDP can be attributed to weather, while the average number of $1bn weather events in the US from 2008-2015 doubled compared to the previous 35 years.
Besides exploring novel architectures for HPC, the trick to circumventing the computational constraints of next-generation climate models is leveraging the massive influx of high-fidelity data from earth-observation satellites, which are in the middle of a renaissance due to both exponential launch cost reductions and electronics miniaturization. I wrote about it in this post. The volume of this Earth Observation (EO) data now doubles in less than two years, yet much of it goes unused due to a lack of continuity and validation. Data is messy.
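To get a feel for what a sub-two-year doubling time implies over a decade, here is a quick sketch (the starting archive size and the exact doubling period are illustrative assumptions, not figures from the post):

```python
# Projected EO data volume under a ~2-year doubling time.
# The starting volume (in petabytes) is an assumed placeholder.

start_year, start_volume_pb = 2022, 100.0   # assumed archive size in PB
doubling_years = 2.0                        # "doubles in less than two years"

for year in range(start_year, start_year + 11, 2):
    volume = start_volume_pb * 2 ** ((year - start_year) / doubling_years)
    print(f"{year}: ~{volume:,.0f} PB")
```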
Using these massive data sets to feed high-fidelity simulations on the order of 1-10 km is a flywheel for overcoming the hurdle of computational constraints. Much as with local weather simulations, there is a large and growing market for these simulations in the context of extreme weather events. EO data can be used for high-sensitivity simulation of coastal sea level rise, flooding in delta areas or agriculturally heavy regions in both developed and developing countries, and prediction of phytoplankton blooms and marine ecosystem health. These are all massively economically disruptive events, on the order of billions of dollars. Plus, the satellite data arrives effectively in real time, so errors can be fed back into the models with low latency. Satellite-informed models give water vapor, soil moisture, evapotranspiration rates, surface water, ice, and snow quantities at fidelities that improve by the month – these data will allow us to industrially monitor flood and drought risks. Satellite data companies like Descartes Labs are even supporting anti-deforestation efforts through real-time monitoring products. This data is accessible and it powers valuable engines – the cost of the compute can become an afterthought.
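As a minimal sketch of what "industrially monitoring flood risk" from such data could look like, the toy screen below combines two satellite-informed fields on a shared grid; the thresholds, grid shape, and synthetic inputs are all hypothetical placeholders rather than any particular provider's product:

```python
import numpy as np

# Toy flood-risk screen combining two satellite-informed fields on a shared
# lat/lon grid: soil moisture saturation (0-1) and forecast 24h precipitation
# (mm). The data here are randomly generated stand-ins; thresholds are
# illustrative assumptions only.

rng = np.random.default_rng(0)
soil_moisture = rng.uniform(0.0, 1.0, size=(180, 360))      # fraction of saturation
precip_24h_mm = rng.gamma(shape=2.0, scale=10.0, size=(180, 360))

# Flag cells that are already near saturation AND expect heavy rain:
# saturated soil absorbs little additional water, so runoff and flood risk spike.
flood_risk = (soil_moisture > 0.8) & (precip_24h_mm > 50.0)

print(f"{flood_risk.mean():.1%} of grid cells flagged for elevated flood risk")
```

In practice the inputs would come from validated satellite products updated in near real time, which is exactly the low-latency feedback loop described above.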
...
Read the rest via the link.