What Are The Preconditions/Prerequisites for Asymptotic Analysis?
post by DragonGod · 2023-02-03T21:26:48.987Z · LW · GW · No comments
This is a question post.
Under what conditions is describing the behaviour of a (physical or abstract) process using asymptotic apparatus sensible/productive/reasonable?
Kinds of replies that I would find especially valuable:
- Formal properties that a function (or collection thereof) must satisfy for applying asymptotic analysis to be sensible
- Heuristics for when applying asymptotic analysis to real world processes is productive
- Examples of real world processes satisfying the above
- Examples of processes/functions that might seem amenable to asymptotic analysis, but which it would actually be a mistake to consider through an asymptotic lens
Answers
Answer by Taran:

Strictly speaking, asymptotic analysis is not very demanding: if you have a function f(n) that you can bound above in the limit as a function of n, you can do asymptotic analysis to it. In practice I mostly see asymptotic analysis used to evaluate counterfactuals: you have some function or process that's well-behaved for n inputs, and you want to know if it will still be well-enough behaved if you had 10n inputs instead, without actually doing the experiment. You're rendering ten characters on the screen in your video game -- could you get away with rendering 20, or would you run out of graphics card memory? Your web site is serving 100 requests per second with low latency -- if you suddenly had 1000 requests per second instead, would latency still be low? Would the site still be available at all? Can we make a large language model with the exact same architecture as GPT-3, but a book-sized context window? Asymptotic analysis lets you answer questions like that without having to do experiments or think very hard -- so long as you understand f correctly.
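To make that concrete, here is a minimal sketch of this kind of counterfactual extrapolation: measure a cost at a small input size, assume a growth function, and project to a larger size. The numbers and the O(n log n) assumption are illustrative, not taken from any real system:

```python
# A sketch of extrapolating cost under an assumed complexity class,
# instead of running the larger experiment. All numbers are hypothetical.
import math

def extrapolate_cost(cost_small, n_small, n_large, growth):
    """Scale a measured cost from n_small to n_large, assuming the given
    asymptotic growth function describes the process."""
    return cost_small * growth(n_large) / growth(n_small)

# Example: a site handles 100 requests/sec at 5 ms each; if per-request
# work grows like n log n in concurrent load, what happens at 1000 rps?
latency_at_1000 = extrapolate_cost(
    cost_small=5.0,                      # ms per request at n = 100
    n_small=100,
    n_large=1000,
    growth=lambda n: n * math.log(n),    # assumed O(n log n) behaviour
)
print(f"Projected latency: {latency_at_1000:.1f} ms")  # 75.0 ms
```

The answer you get is only as good as the growth assumption, which is exactly the "so long as you understand f correctly" caveat above.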
When I'm reviewing software designs, I do this kind of analysis a lot. There, it's often useful to distinguish between average-case and worst-case analyses: when you're processing 100 million records in a big data analysis job you don't care that much about the variance in processing any individual record, but when you're rendering frames in a video game you work hard to make sure that every single frame gets done in less than 16 milliseconds, or whatever your budget is, even if that means your code is slower on average.
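A toy simulation (with illustrative numbers of my own, not from the post) of why the two analyses diverge: the same cost distribution looks fine to a batch job and terrible to a renderer:

```python
# Average-case vs worst-case: usually-fast items with occasional slow ones.
import random

random.seed(0)
# Simulated per-item costs in ms: mostly 1 ms, occasionally 40 ms.
costs = [random.choice([1, 1, 1, 1, 40]) for _ in range(100_000)]

mean_cost = sum(costs) / len(costs)
worst_cost = max(costs)
print(f"mean = {mean_cost:.1f} ms, worst = {worst_cost} ms")
# A 16 ms frame budget is comfortable on average (~8.8 ms) but
# blown by the worst case (40 ms) -- fine for batch jobs, not for games.
```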
This makes it sound like a computer science thing, and for me it mostly is, but you can do the same thing to any scaling process. For example, in some cultures, when toasting before drinking, it's considered polite for each person to toast each other person, making eye contact with them, before anyone drinks for real. If you're in a drinking party of n people, how many toasts should there be and how long should we expect them to take? Well, clearly there are about n²/2 pairs of people, but you can do toasts in parallel, so with good coordination you should expect the toasts to take time proportional to n...
...except that usually these toasts are done in a circle, with everyone holding still, so there's an additional hard-to-model constraint around the shared toasting space in the circle. That is to me a prototypical example of the way that asymptotic analysis can go wrong: our model of the toasting time as a function of the number of people was fine as far as it went, but it didn't capture all the relevant parts of the environment, so we got different scaling behavior than we expected.
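A quick check of the idealized toast arithmetic (ignoring the circle constraint just described): every pair toasts once, giving n(n-1)/2 toasts, and a round-robin schedule packs them into O(n) parallel rounds:

```python
# Toasts at a party of n people: quadratically many pairs, but only
# linearly many rounds if toasts happen in parallel (circle ignored).
def toast_counts(n):
    pairs = n * (n - 1) // 2             # every pair toasts once: ~n^2/2
    rounds = n - 1 if n % 2 == 0 else n  # round-robin scheduling: O(n)
    return pairs, rounds

for n in (4, 10, 50):
    pairs, rounds = toast_counts(n)
    print(f"{n} people: {pairs} toasts, {rounds} parallel rounds")
```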
(The other famous use of asymptotic analysis is in hardness proofs and complexity theory, e.g. in cryptography, but those aren't exactly "real world processes" even though cryptography is very real).
Answer by tailcalled:

Asymptotic analysis tends to be useful when you expect the function to take large inputs, but don't necessarily know how large they will be. For instance, knowing that the asymptotic complexity of sorting is O(n log n) isn't all that useful when you are sorting a list of 7 elements; it's mainly useful when considering sorting lists of 1000000000 elements (where you can infer that e.g. an O(n²) algorithm would be too expensive but O(n log n) is not).
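A rough back-of-envelope comparison (constant factors omitted) showing why the distinction only bites at large n:

```python
# Operation-count estimates for n log n vs n^2 at small and large n.
import math

for n in (7, 1_000_000_000):
    n_log_n = n * math.log2(n)
    n_squared = n ** 2
    print(f"n = {n:>13,}: n log n ~ {n_log_n:.3g}, n^2 ~ {n_squared:.3g}")
# At n = 7 both counts are tiny, so constants dominate; at n = 10^9,
# n^2 is roughly 30 million times larger than n log n.
```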
I guess it is also sometimes useful as a sort of "interpolation strategy". Like if you want to know the properties of a function, you can first asymptotically characterize its limiting properties, and that covers a good chunk of the information about the function, leaving you only with a bounded area of uncertainty. Then you can analyze how it shifts between its asymptotes in the middle.
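As a small illustration of that strategy (with a function of my own choosing, not from the answer), one can pin down the asymptote first and then study where the function deviates from it:

```python
# Interpolation strategy: the asymptote captures the tail behaviour,
# leaving only a bounded middle region to analyze directly.
import math

f = lambda x: math.sqrt(x * x + 1)   # asymptote: f(x) -> x as x -> infinity
for x in (0.1, 1, 10, 100):
    print(f"x = {x:>6}: f(x) = {f(x):.4f}, gap to asymptote = {f(x) - x:.4f}")
# The gap is large near 0 and vanishes for large x, so the remaining
# uncertainty about f is confined to a bounded region in the middle.
```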