Alex Lawsen On Forecasting AI Progress

post by Michaël Trazzi (mtrazzi) · 2022-09-06

This is a link post for https://theinsideview.ai/alex

Contents

  On the Metaculus AGI Forecasts
    Why You Cannot Just Update All The Way
    The Metaculus Drops Were Not Caused By Newcomers
  On Using Growth Models To Forecast AGI
    Business As Usual Does Not Require Burden Of Proof
    Growth Models Are Not Sufficient To Forecast AI Progress

Alex Lawsen is an advisor at 80,000 Hours who has released an Introduction to Forecasting series. We discuss common pitfalls when forecasting AI progress, why you cannot just update all the way (a topic from my latest episode with Connor Leahy), and how to develop your own inside views about AI Alignment.

Below are some highlighted quotes from our conversation (available on YouTube, Spotify, Google Podcasts, and Apple Podcasts). For the full context of each quote, you can refer to the accompanying transcript.

On the Metaculus AGI Forecasts

Why You Cannot Just Update All The Way

"There are some situations where all of the positive evidence you get is going to be in the same direction, and then the negative evidence you get is nothing happens. And so, ideally, what you do in this case is every day that nothing happens, you make a tiny update in one direction. And then every few weeks or every few months, something big happens and you make an update in the other direction. And if that is the case, maybe what you'll see is people just... they forget to do the small downwards updates and then they do the big updates every time something happens. And I think if you do the Connor thing of seeing... Well, I'm not too sure this is the Connor thing. But if you see four updates and they're all in the same direction and then you go like, 'Oh, man, everything's going the same direction. I need to be really confident stuff going that direction.' Then every day something doesn't happen, your downwards update needs to be pretty big. If you're expecting massive progress, then a week going by and nothing happening is actually big evidence for you."

The Metaculus Drops Were Not Caused By Newcomers

"One hypothesis you might have which I think a friend of mine falsified, is 'a whole bunch of people saw these results. These results were all over Twitter, it was impressive. Chinchilla was impressive, PaLM was impressive'. So, you might think, 'Oh, well, a bunch of new people who haven't made timelines forecasts before are going to jump on this Metaculus question and they're going to make predictions.' And so, you can test this, right. You can look at how the median changed among predictors who had already predicted on the question and that median dropped too."

On Using Growth Models To Forecast AGI

Business As Usual Does Not Require Burden Of Proof

"I think there was a class of skepticism about safety or skepticism about AGI, which goes something like this, 'In general, you should use reference classes to determine your forecasts.' What this means roughly translated, is you should predict things to carry on roughly how they are. And then people say, 'Things carrying on roughly how they are doesn’t look like we get AI takeover and everyone dropping dead' so you should have a very high burden of proof for the step by step arguments, logical arguments, in order to claim we are going to get something wild like AGI in the next few decades. And I think a really strong response to this line of argument is to say, 'What do you mean everything continues as normal means we don’t get anything weird?' 'Everything continues as normal' means we should look at curves and different things and expect them to carry on smoothly. And if you look at curves and a bunch of different things and expect them to carry on smoothly, you get really weird behavior pretty quickly."

Growth Models Are Not Sufficient To Forecast AI Progress

"Curve fitting to economic growth models is not sufficient reason to believe that on its own. You can then look at the development of AGI and predict that happens by 2050 and then you can say, 'Wow, economic stuff’s going to go wild after that point.' But then the reason you’re saying that is because of a combination of facts, including actually having a gears level model of what’s happening... The growth models are, in my view, sufficient to say you should look at the next few decades carefully and see what’s going on, but are not sufficient on their own to allow you to confidently predict what will happen."
