testingthewaters's Shortform
post by testingthewaters · 2025-02-10T02:06:40.503Z · 3 comments
comment by testingthewaters · 2025-02-10T02:06:40.501Z
Note to self: If you think you know where your unknown unknowns sit in your ontology, you don't. That's what makes them unknown unknowns.
If you think that you have a complete picture of some system, you can still find yourself surprised by unknown unknowns. That's what makes them unknown unknowns.
If your internal logic has almost complete predictive power, plus or minus a tiny bit of error, your logical system (but mostly not your observations) can still be completely overthrown by unknown unknowns. That's what makes them unknown unknowns.
You can respect unknown unknowns, but you can't plan around them. That's... You get it by now.
Therefore I respectfully submit that anyone who presents me with a foolproof and worked-out plan of the next ten/hundred/thousand/million years has failed to take into account some unknown unknowns.
↑ comment by CapResearcher · 2025-02-10T15:13:41.845Z
I could feel myself instinctively disliking this argument, and I think I figured out why.
Even though the argument is obviously true, and here it is used to argue for something I agree with, I've historically mostly seen it used to argue against things I agree with: specifically, to argue for disregarding experts, or that nuclear power should never be built, no matter how safe it looks. That explains my gut reaction, but not whether it's a good argument.
Thinking it through, my real problem with the argument is this: while it's technically true, it doesn't help locate a useful crux or resolve a disagreement. It naturally leads to a situation where one party estimates the unknown unknowns to be much larger than the other party does, and that difference becomes the crux. To make things worse, one party often doesn't want to argue for their estimate of the size of the unknown unknowns. But we need to estimate the sizes of unknown unknowns; otherwise I could troll people with "tic-tac-toe will never be solved because of unknown unknowns".
I therefore feel better about arguments for why unknown unknowns may be large than about arguments that merely establish a positive probability of unknown unknowns. For example, society has historically been extremely chaotic when viewed at large time scales, and we have numerous examples of similar past predictions failing because of unknown unknowns. So I assign a tiny prior probability to anyone being able to accurately predict what society will look like far into the future.
↑ comment by testingthewaters · 2025-02-10T15:19:59.948Z
Yeah, definitely. My main gripe, the place where I see people disregarding unknown unknowns, is similar to yours: people who present definite, worked-out pictures of the future.