Posts

The Utility Function of a Prepper 2021-07-21T02:26:19.214Z
Paper Review: Mathematical Truth 2021-05-30T20:25:09.525Z
Against Being Against Growth 2021-05-29T02:18:37.660Z
Get your gun license 2021-05-21T12:44:58.055Z
Under Pressure 2021-05-14T22:04:06.017Z

Comments

Comment by Alex Hollow on The Utility Function of a Prepper · 2021-07-22T15:07:47.018Z · LW · GW

I'd be careful with thinking of prepping as a binary "do/don't prep" distinction. If you live somewhere where a civil war happens every 2-3 years, the expected value of something that only has value in a civil war scenario is much higher than if one happens every 150 years or so. However, that doesn't mean you should "prep" in one case and not the other, just that some actions that would be worth it if civil wars were frequent are not worth it if civil wars are infrequent. Water may be useful in both, but training your friends in wilderness survival or whatever, maybe less so.
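The frequency comparison above can be sketched as a quick expected-value calculation. This is a minimal illustration with made-up numbers (the item value, cost, and horizon are hypothetical, not from the comment):

```python
# Expected value of a prep item over a 10-year horizon, under two
# assumed civil-war frequencies. All numbers are illustrative only.

def expected_value(annual_event_prob, value_if_event, cost, years=10):
    # Probability of at least one event occurring over the horizon.
    p_any = 1 - (1 - annual_event_prob) ** years
    return p_any * value_if_event - cost

# Civil war every ~2.5 years vs. every ~150 years.
frequent = expected_value(annual_event_prob=1 / 2.5, value_if_event=1000, cost=200)
rare = expected_value(annual_event_prob=1 / 150, value_if_event=1000, cost=200)

print(frequent)  # positive: worth buying if wars are frequent
print(rare)      # negative: not worth it if wars are rare
```

The same item flips from positive to negative expected value purely because of the base rate, which is the point: "prep" isn't binary, each action has its own break-even frequency.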

Comment by Alex Hollow on The Utility Function of a Prepper · 2021-07-21T14:36:00.915Z · LW · GW

I don't think I understand the question. If something is expensive but will definitely (let's say with 99% certainty) save my life (which I think is the sort of thing you are describing as expensive_but_must), I would buy it at almost any cost. 

Comment by Alex Hollow on The Utility Function of a Prepper · 2021-07-21T11:26:24.860Z · LW · GW

"buying a bunker is not frequent that much anymore"

Are you sure? The second doom boom is here, and people are buying bunkers again.

The difference between bunkers and water is not just the cost, but the probability of needing one - there are many non-nuclear-war cases for wanting water on hand. So water has a higher probability of being useful, and a lower cost.

Comment by Alex Hollow on The Utility Function of a Prepper · 2021-07-21T11:23:51.101Z · LW · GW

Most places have water, but how close is it to where you live? If you don't have a way of storing a significant amount of water, and live far enough from your local water source that you would have to drive, there is a benefit to having enough water storage so that you can transport a reasonable amount of water per trip.

But I agree that having a water filter on hand is useful in cases where you have access to water, but you aren't sure whether it's safe to drink or not.

Comment by Alex Hollow on Agency and the unreliable autonomous car · 2021-07-07T19:13:55.340Z · LW · GW

I think your definition of perfect model is a bit off - a circuit diagram of a computer is definitely not a perfect model of the computer! The computer itself has much more state and complexity, such as the temperature of its various components, which affect the computer but are not captured by the model.

Containing a copy of your source code is a weird definition of a model. All programs contain their source code - does a program that prints its source code have more of a model of itself than other programs, which are also just made of their source code? Human brains are not capable of holding a perfect model of a human brain, plus more, because even the best-case encoding requires using one atom of the brain to model each atom of the brain, leaving no capacity for anything else.
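For contrast, here is a minimal Python quine - a program whose output is exactly its own source text. Even this "self-containing" program only reproduces its text; it doesn't model its own execution, temperature, or anything else about itself as a running process:

```python
# A classic two-line quine: the string s is a template for the whole
# program, and printing (s % s) substitutes s's own repr into itself.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints the two source lines verbatim, which is about as close as a program gets to "containing a copy of itself" - and it still falls well short of a perfect model in the sense discussed above.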

The key word is "perfect" - to fit a model of a thing inside that thing, the model must contain less information than the thing does.

Comment by Alex Hollow on Agency and the unreliable autonomous car · 2021-07-07T16:28:57.108Z · LW · GW

This is a really good illustration of the 5 and 10 problem. I read the linked description, but didn't fully understand it until you walked through this example in more depth. Thanks!

One possible typo:  You used 5a and 5b in most of the article, but in the original list of steps they are called 6 and 7.

Comment by Alex Hollow on Paper Review: Mathematical Truth · 2021-06-04T12:08:12.546Z · LW · GW

This is definitely not a "big problem" in that we can use math regardless of what the outcome is. 

It sounds like you're arguing that semantic uniformity doesn't matter, because we can change what "exists" means. But once you change what "exists" means, you will likely run into epistemological issues. If your mathematical objects aren't physical entities capable of interacting with the world, how can you have knowledge that is causally related to those entities? That's the dilemma of the argument above - it seems possible to get semantic uniformity at the expense of epistemological uniformity, or vice versa, but having both together is difficult.

Comment by Alex Hollow on Paper Review: Mathematical Truth · 2021-06-04T12:05:41.057Z · LW · GW

I'm not super up-to-date on fictionalism, but I think I have a reasonable response to this.

When we are talking about fictional worlds, we understand that we have entered a new form of reasoning. In cases of fictional worlds, all parties usually understand that we are not talking about the standard predicate "exists"; we are talking about some other predicate, "fictionally-exists". You can detect this because if you ask people "do those three Jedi really exist?", they will probably say no.

However, with math, it's less clear that we are talking fictionally or talking only about propositions within an axiomatic system. We could swap out the "exists" predicate with something like "mathematically-exists" (within some specific axiom system), but it's less clear what the motivation is compared to fictional cases. People talk as if 2+2 really does equal 4, not just that it's useful to pretend that it's true.

Comment by Alex Hollow on Paper Review: Mathematical Truth · 2021-06-04T11:59:09.592Z · LW · GW

Hi! I really appreciate this reply, and I stewed on it for a bit. I think the crux of our disagreement comes down to definitions of things, and that we mostly agree except for definitions of some words.

Knowledge - I think knowledge has to be correct to be knowledge, otherwise you just think you have knowledge. It seems like we disagree here, and you think that knowledge just means a belief that is likely to be true (and for the right reason?). It's unclear to me how you would cash out "accurate map" for things that you can't physically observe, like math, but I think I get the gist of your definition. Also, side note: justified true belief is not a widely held view in modern philosophy; most theories of knowledge go for justified true belief plus something else.

Real - We both agree it doesn't matter for our day-to-day lives whether math is real or not. (It may matter for patent law, if it decides whether math is treated as an invention or a discovery!) I think it would be nice to know whether math is real or not, and I try to understand the logical form of sentences I utter to know what fact about the world would make them true or false. So when you say I "don't have to worry about" whether numbers are real, I agree - their reality or non-reality is not causing me any problem; I'm just curious.

I also view epistemic uniformity as pretty important, because we should have the same standards of knowledge across all fields. You seem to think that mathematical knowledge doesn't exist, because mathematical "knowledge" is just what we have derived within a system. I can agree with that! The Benacerraf paper presents a big problem for realism, which you seem to buy - and you're willing to put up with losing semantic uniformity for it. 

I think our difference comes down to how much we want semantic uniformity in a theory of mathematical truth.

Comment by Alex Hollow on Update 2021-05-31 · 2021-05-31T13:58:57.001Z · LW · GW

Hi - looks like you did a relative link (https://www.lesswrong.com/capital-gains-in-agi-big.png) but you want this absolute link instead: https://www.jefftk.com/capital-gains-in-agi-big.png

Comment by Alex Hollow on Paper Review: Mathematical Truth · 2021-05-31T12:34:07.133Z · LW · GW

Benacerraf convinced me that either mathematical sentences have different logical forms than non-mathematical sentences or that mathematical knowledge has a different form than non-mathematical knowledge. It sounds like your view is that mathematical sentences have different forms (they all have an implicit "within some mathematical system that is relevant, it is provable that..." before them), and also that mathematical knowledge is different (not real knowledge, just exists in a system). In other words, it sounds like you just think that epistemic uniformity and semantic uniformity are not important features of a theory of mathematical truth. That comes down to personal aesthetics and meta-beliefs about how theories should look, so I will just talk about what I think you're saying in this comment.

I think what you're trying to say is that mathematical statements are not true or false in an absolute sense, only true or false within a proof system, and their truth or falsehood is based entirely on whether they can be derived within that system.

If that's true, math is just a map, and maps are neither true nor false. If math is just a map, then there is no such thing as objective mathematical truth. So it sounds like you agree that knowledge about any mathematical object is impossible. But when you say that "Epistemic uniformity simply states that math is a useful model", I think that's a little different from what I intended it to mean. Epistemic uniformity says that evaluating the truth-value of a mathematical statement should be a process similar to evaluating the truth-value of any other statement.

The issue here is that our non-mathematical statements aren't only internally true or false; they are actually true or false. If you asked someone to justify sentence (1) and they handed you a proof about New York and London, consistent with a set of city-axioms, you would probably be pretty confused. Epistemic uniformity says that a theory of mathematical truth should look broadly like the rest of our theories of truth - why should math be special?

I'll object slightly to using the phrase "discoverable experimentally" to describe proving a theorem and thinking up numbers, but let's talk about those examples. To me, it sounds like that is doing work within a system of math to determine whether a claim is consistent with its axioms. There is some tension between saying that math is just a tool and thinking that you can do experiments on it to discover facts about the world - experimenting on the tool only tells us about the tool. Doing math (under the intuitionist paradigm) tells us whether something is provable within a mathematical system, but it has no bearing on whether it is true outside of our minds.

(Side note about intuitionism: 

I think it's important to prevent talking past each other by checking definitions, so I'd like to clarify what you mean by intuitionism. In the definition I'm aware of, intuitionism says roughly that math exists entirely in minds, and the corresponding account of mathematical truth is that a statement is true if someone has a mental construction proving it to be true. Please let me know if this is not what you meant!

My main objection to intuitionism is that it makes a lot of math time-dependent (e.g. 2+2 didn't equal 4 until someone proved it for the first time). Under an intuitionist account of mathematical truth, you can make sentence (2) true by finding three examples that fit. But then that statement's truth or falsehood is independent of whether the mathematical fact is really true or false (intuitionists usually don't think universal mathematical truths exist). It seems to me that math is a real thing in the universe: it was real before humans comprehended it, and it will remain real after humans are gone. That view is incompatible with intuitionism.)
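To make the time-dependence concrete: on the intuitionist account, a statement like 2 + 2 = 4 only becomes true once someone carries out a construction proving it. Such a construction is tiny - here is what it looks like in a modern proof assistant (Lean, shown purely as an illustration of what "a mental construction proving it" amounts to):

```lean
-- Both sides compute to the numeral 4, so reflexivity closes the goal.
example : 2 + 2 = 4 := rfl
```

The intuitionist claim is that before anyone had performed some construction like this, the statement had no truth value - which is exactly the consequence that seems wrong if math is real independently of minds.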

(Another note - can you be a bit more specific about the contradiction you think is avoided by giving up Platonism? I think you still don't have epistemic and semantic uniformity with an intuitionist/combinatorial theory of math.)

Comment by Alex Hollow on Why don't long running conversations happen on LessWrong? · 2021-05-30T23:19:05.015Z · LW · GW

I think one issue is that comment trees are just not the ideal format for conversation. It's pretty common that someone will make a comment with four different claims, and then ten different comments of which two make similar objections to one, and then a couple other objections, and those will be responded to, and it's all very ad-hoc. Structure doesn't spontaneously emerge, and having to scan through a whole disordered tree to understand the current state of the argument makes it hard for bystanders to join.

Having a section for posts-for-the-month (or "Ongoing Discussions"?) would help people that want long-running discussions so they would all be in the same place, over only a few threads. But the comment thread discussion format is not great. 

This would be a large engineering effort, but support for argument maps could help record the structure of an argument and help discussion continue: each new node could be a new post, and a visual of the current state of an argument could keep things organized in a way comment trees can't.

Comment by Alex Hollow on How counting neutrons explains nuclear waste · 2021-05-30T21:12:50.912Z · LW · GW

I really enjoyed reading this, and learning about the symmetries between electron shells and nucleon shells. It wasn't clear to me why the nuclear waste in concrete is safe to hug - from a quick Google, apparently all matter blocks gamma radiation, and concrete is made out of matter. Concrete is typically used because it is cheap and dense, and often "heavy concrete" made with dense industrial waste materials (fly ash, slag, etc.) is used for radiation shielding (source).

Comment by Alex Hollow on Against Being Against Growth · 2021-05-29T21:08:57.462Z · LW · GW

I think it still applies relatively well? If you are trying to limit growth to avoid x-risk, and anyone else is purely maximizing growth, they will grow relative to you, and your growth-limited, x-risk-avoiding faction will become irrelevant in size as the other factions grow. The point about requiring universal buy-in still applies: if you want to slow growth to avoid x-risk, you need everyone to buy in, or you will quickly become irrelevant compared to people maximizing growth.

Comment by Alex Hollow on Against Being Against Growth · 2021-05-29T12:20:58.504Z · LW · GW

I guess it depends on how you think about profit maximization. If eliminating competition increases profit, wouldn't a profit-maximizer want to eliminate competition as well?

Comment by Alex Hollow on Sabien on "work-life" balance · 2021-05-24T12:45:22.342Z · LW · GW

If you don't like doing all the non-programming work that running your own company entails, you might prefer an annoying job that still only requires programming over starting your own company.

Comment by Alex Hollow on Get your gun license · 2021-05-21T15:15:26.959Z · LW · GW

Yeah, exactly! This whole post is meant to suggest a gun license as a cheap way of buying a specific kind of optionality.

Comment by Alex Hollow on Get your gun license · 2021-05-21T15:15:04.179Z · LW · GW

A lot of them have exceptions for items produced before the ban, like the Massachusetts "Assault Weapons Ban", which allows only AR-15 lowers produced before the ban to be used without reduced ergonomics. Also, having a license gives you the option to buy a gun before such a ban goes into effect, if it covers items you think you would want later! If you see a ban coming and have to wait a year for your license, you have no option to get whatever is being banned before it goes through.

Also, owning guns has lots of downsides - storage cost, danger, being a theft target, etc. Reducing the time from wanting a gun to (legally) getting one from over a year to a day gets you most of the way to the goal (which is having the option of buying a gun) without any of the downsides (except the $110). So it may be a better deal.

Comment by Alex Hollow on Get your gun license · 2021-05-21T15:12:51.770Z · LW · GW

I edited my post to make it Massachusetts specific - good point on gun accessibility variability. I didn't get into the reasons for owning a gun, because that is something I am personally less sure about and also varies widely from person to person. The main point is meant to be that a license is a cheap way to buy optionality, and I think that holds, although I may try to find some more general examples of times when you might want a gun.

Comment by Alex Hollow on Get your gun license · 2021-05-21T15:09:54.584Z · LW · GW

Great point - edited to update with this. In Massachusetts, you need a license to own a gun or ammo, and that is the same license as a concealed carry license. I definitely over-generalized above. Thanks for pointing that out!

Comment by Alex Hollow on Get your gun license · 2021-05-21T12:56:25.251Z · LW · GW

I didn't say anything about actually owning a gun in this post, only purchasing the right to buy a gun later! I think actually owning a gun has more potential downsides than having the right to own a gun.