# What are the axioms of rationality?

post by Yoav Ravid · 2018-12-25T06:47:54.363Z · LW · GW · 8 comments

This is a question post.

I'm new here (this is my first post). I just started to get serious about rationality, and one of the questions that immediately came to my mind is "What are the axioms of rationality?". I looked it up a bit and didn't find a post (even on this site) that lists them (and I'm quite sure there are some).

So this is intended as a discussion, and I'll make a post with the conclusions afterward.

Curious to see your replies! (Feedback on how I asked the question is welcome too.)

Thanks :)

## Answers

There are a variety of axiom systems which justify *mostly* similar notions of rationality, and a few posts explore these axiom systems. Sniffnoy summarized Savage's Axioms [LW · GW]. I summarized some approaches [LW · GW] and why I think it may be possible to do better. I wrote in detail about complete class theorems [LW · GW]. I also took a look at consequences of the Jeffrey-Bolker axioms [LW · GW]. (Jeffrey-Bolker and complete class are my two favorite ways to axiomatize things, and they have some very different consequences!)

As many others are emphasizing, these axiomatic approaches don't really summarize rationality-as-practiced, although they *are* highly connected. Actually, I think people are kind of downplaying the connection. Although de-biasing moves such as de-anchoring [LW · GW] aren't usually justified by direct appeal to rationality axioms, it is possible to flesh out that connection, and doing this with enough things will likely improve your decision-theoretic thinking.

Still:

1) The fact that there are many alternative axiom systems, and that we can judge them for various good/bad features, illustrates that one set of axioms doesn't capture the whole of rationality (at least, not yet).

2) The fact that not even the sequences deal much with these axioms shows that they need not be central to a practice of rationality. Thoroughly understanding probability and expected utility *as calculations*, and understanding that there *are* strong arguments for these calculations in particular, is more important.
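
Probability and expected utility "as calculations" can be made concrete in a few lines. This is a minimal sketch; the actions, probabilities, and utilities are invented for illustration:

```python
# Expected utility as a plain calculation: each action leads to outcomes
# with known probabilities, and the agent picks the action whose
# probability-weighted utility is highest.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

actions = {
    "safe bet":  [(1.0, 50)],             # guaranteed 50 utility
    "risky bet": [(0.5, 120), (0.5, 0)],  # coin flip: 120 or nothing
}

best = max(actions, key=lambda a: expected_utility(actions[a]))
print("choose:", best)  # risky bet, since 0.5 * 120 = 60 > 50
```

The axiomatic results referenced above are what license this calculation: they show that preferences satisfying the axioms behave *as if* maximizing some such sum.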

“Rationality” isn’t the kind of thing that has “axioms”.

Reading the Sequences may help you.

## ↑ comment by Yoav Ravid · 2018-12-26T06:51:49.932Z · LW(p) · GW(p)

This is something I thought of: that rationality shouldn't have any axioms. But then I wasn't really sure that it doesn't (because it felt to me that it does), so I asked the question to see what people think.

For example, how do you view the phrase "what could be destroyed by the truth should be"? Is it an axiom?

Replies from: SaidAchmiz

## ↑ comment by Said Achmiz (SaidAchmiz) · 2018-12-26T08:27:39.518Z · LW(p) · GW(p)

> for example, how do you view this phrase "what could be destroyed by the truth should be"? is it an axiom?

No. It’s an expression of one of what Eliezer calls the “virtues of rationality”. But the virtues *aren’t* axioms—which, incidentally, is what the twelfth and final virtue is all about.

> this is something i thought of, that rationality shouldn't have any axioms

It not only *shouldn’t*, it *can’t*—as I said, *rationality just isn’t the sort of thing* that has “axioms”. (Namely, it’s not a *formal system*.)

## 8 comments

Comments sorted by top scores.

## comment by Mindey · 2018-12-26T13:24:28.369Z · LW(p) · GW(p)

Rationality has no axioms, just heuristics and rules for different environments. In other words, rationality is a solution to a problem (optimality of thinking and deciding) within a domain, but because of the diversity of domains, it can't be reduced to a single specific set of axioms. I suppose the best one can do, given an arbitrary domain, is to say: maybe try exploring.

## comment by Elo · 2018-12-25T08:57:09.388Z · LW(p) · GW(p)

You might like to look up the 12 virtues as written by EY. http://yudkowsky.net/rational/virtues/

Replies from: Yoav Ravid

## ↑ comment by Yoav Ravid · 2018-12-25T14:47:32.172Z · LW(p) · GW(p)

Thanks, I actually read it already and (though I really liked it) it didn't answer my question. Can you elaborate on where you find answers to my question in there? :)

One thing in there that does seem to me like an axiom is "what could be destroyed by the truth should be".

## comment by avturchin · 2018-12-27T21:08:31.603Z · LW(p) · GW(p)

It would be interesting to attempt to create a list of such axioms.

First is a *meta-axiom*: rationality can be presented as a finite set of finite rules. Or, in other words: the best winning meta-strategy exists, and it can be presented as a finite set of rules.

Note that this may not be true. For example, the best winning strategy in chess is AlphaZero, which is a neural net, not a set of rules for how to win. Granted, any neural net could probably be presented as an enormously large set of billions of rules, but that is not what we mean here: we mean that rationality, as a set of thinking rules, is small enough for a single person to know explicitly. This is why there are 12 virtues of rationality, not 120,000.

The next step is to add the VNM rationality axioms: https://en.wikipedia.org/wiki/Von_Neumann%E2%80%93Morgenstern_utility_theorem
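
For reference, the four VNM axioms constrain a preference relation $\preceq$ over lotteries; this is a lightly paraphrased statement of the standard formulation:

```latex
\begin{itemize}
  \item \textbf{Completeness:} for all lotteries $L, M$: either $L \preceq M$ or $M \preceq L$.
  \item \textbf{Transitivity:} if $L \preceq M$ and $M \preceq N$, then $L \preceq N$.
  \item \textbf{Continuity:} if $L \preceq M \preceq N$, there exists $p \in [0,1]$
        such that $pL + (1-p)N \sim M$.
  \item \textbf{Independence:} $L \preceq M$ if and only if
        $pL + (1-p)N \preceq pM + (1-p)N$ for every lottery $N$ and $p \in (0,1]$.
\end{itemize}
```

The theorem then says that any agent satisfying these four axioms behaves as if maximizing the expected value of some utility function.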

Another axiom should say something about Bayes' theorem as the main epistemic principle, plus another about Kolmogorov complexity.
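
A toy Bayesian update illustrates the epistemic rule being gestured at. The hypotheses and numbers are invented for illustration: two hypotheses about a coin, updated on observing heads.

```python
# Bayes' theorem as a calculation: P(H | E) = P(E | H) * P(H) / P(E),
# where P(E) is the sum of the unnormalized terms over all hypotheses.

priors = {"fair": 0.5, "biased": 0.5}            # P(H)
likelihood_heads = {"fair": 0.5, "biased": 0.9}  # P(heads | H)

def update(priors, likelihoods):
    """Return the posterior distribution P(H | evidence)."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())  # P(evidence)
    return {h: v / total for h, v in unnormalized.items()}

posterior = update(priors, likelihood_heads)
print(posterior)  # the "biased" hypothesis gains probability mass
```

Kolmogorov complexity would then enter as a constraint on the priors themselves (simpler hypotheses get more prior mass), which is where the Occam connection below comes from.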

Occam's razor could then be a theorem following from the above.

Plus there should be a block about decision theory, something like UDT.