What are the axioms of rationality?

post by Yoav Ravid · 2018-12-25T06:47:54.363Z · score: 1 (None votes) · LW · GW · 8 comments

This is a question post.


I'm new here (this is my first post). I just started to get serious about rationality, and one of the questions that immediately came to my mind is "What are the axioms of rationality?". I looked it up a bit, and didn't find a post (even on this site) that lists them (and I'm quite sure there are some).

So this is intended as a discussion, and I'll make a post with the conclusions afterward.

Curious to see your replies! (Feedback on how I asked the question is welcome as well.)

Thanks :)

answer by abramdemski · 2018-12-30T22:26:23.159Z · score: 18 (None votes)

There are a variety of axiom systems which justify mostly similar notions of rationality, and a few posts explore these axiom systems. Sniffnoy summarized Savage's Axioms. I summarized some approaches and why I think it may be possible to do better. I wrote in detail about complete class theorems. I also took a look at consequences of the Jeffrey-Bolker axioms. (Jeffrey-Bolker and complete class are my two favorite ways to axiomatize things, and they have some very different consequences!)

As many others are emphasizing, these axiomatic approaches don't really summarize rationality-as-practiced, although they are highly connected. Actually, I think people are kind of downplaying the connection. Although de-biasing moves such as de-anchoring aren't usually justified by direct appeal to rationality axioms, it is possible to flesh out that connection, and doing this with enough things will likely improve your decision-theoretic thinking.


1) The fact that there are many alternative axiom systems, and that we can judge them for various good/bad features, illustrates that one set of axioms doesn't capture the whole of rationality (at least, not yet).

2) The fact that not even the Sequences deal much with these axioms shows that they need not be central to a practice of rationality. Thoroughly understanding probability and expected utility as calculations, and understanding that there are strong arguments for these calculations in particular, is more important.
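To make the "probability and expected utility as calculations" point concrete, here is a minimal sketch. The actions, probabilities, and utilities below are invented purely for illustration:

```python
# Expected utility as a plain calculation: weight each outcome's utility
# by its probability, then pick the action that maximizes the sum.
# All actions, probabilities, and utilities here are made up for this example.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs whose probabilities sum to 1."""
    return sum(p * u for p, u in outcomes)

actions = {
    "take_umbrella":  [(0.3, 5), (0.7, 8)],     # (P(rain), utility), (P(dry), utility)
    "leave_umbrella": [(0.3, -10), (0.7, 10)],
}

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # take_umbrella (EU 7.1 vs 4.0)
```

The de-biasing connection mentioned above can be seen here: an anchoring-prone gut decision is replaced by an explicit weighted sum that anyone can check.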

answer by Said Achmiz · 2018-12-25T15:41:28.552Z · score: 16 (None votes)

“Rationality” isn’t the kind of thing that has “axioms”.

Reading the Sequences may help you.


comment by Mindey · 2018-12-26T13:24:28.369Z · score: 2 (None votes) · LW · GW

Rationality has no axioms, just heuristics and rules for different environments. In other words, rationality is a solution to a problem (optimality of thinking and deciding) within a domain, but because of the diversity of domains, it can't be reduced to a single specific set of axioms. I suppose the best one can do, given an arbitrary domain, is to say: maybe try exploring.

comment by Elo · 2018-12-25T08:57:09.388Z · score: 2 (None votes) · LW · GW

You might like to look up the 12 virtues as written by EY. http://yudkowsky.net/rational/virtues/

comment by Yoav Ravid · 2018-12-25T14:47:32.172Z · score: 1 (None votes) · LW · GW

Thanks, I actually read it already, and (though I really liked it) it didn't answer my question. Can you elaborate on where you find answers to my question in there? :)

One thing in there that does seem to me like an axiom is "what could be destroyed by the truth should be".

comment by avturchin · 2018-12-27T21:08:31.603Z · score: 0 (None votes) · LW · GW

It would be interesting to attempt to create a list of such axioms.

First is a meta-axiom: rationality can be presented as a finite set of finite rules. Or, in other words: the best winning meta-strategy exists, and it can be presented as a finite set of rules.

Note that this may not be true. For example, the best winning strategy in chess is AlphaZero, which is a neural net, not a set of rules for how to win. Granted, any neural net could probably be presented as an enormously large set of billions of rules, but that is not what we mean here: we mean that rationality, as a set of thinking rules, is small enough for a single person to know it explicitly. This is why there are 12 virtues of rationality, not 120,000.

The next step is to add the VNM rationality axioms: https://en.wikipedia.org/wiki/Von_Neumann%E2%80%93Morgenstern_utility_theorem
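For reference, the four VNM axioms can be stated roughly as follows (a paraphrase of the standard presentation; the notation here is my own, with lotteries $L, M, N$ and weak preference $\preceq$):

```latex
\begin{align*}
&\text{Completeness:}  && L \preceq M \ \text{or}\ M \preceq L \\
&\text{Transitivity:}  && L \preceq M \ \text{and}\ M \preceq N \implies L \preceq N \\
&\text{Continuity:}    && L \preceq M \preceq N \implies \exists\, p \in [0,1]:\ pL + (1-p)N \sim M \\
&\text{Independence:}  && L \preceq M \iff pL + (1-p)N \preceq pM + (1-p)N \quad \forall N,\ p \in (0,1]
\end{align*}
```

The theorem then says an agent satisfying these four axioms behaves as if maximizing the expected value of some utility function.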

Another axiom should say something about Bayes' theorem as the main epistemic principle, plus another about Kolmogorov complexity.
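As a toy illustration of Bayes' theorem as an updating rule (the hypothesis, numbers, and function names below are invented for this sketch):

```python
# Toy Bayesian update: posterior probability of a hypothesis H given evidence E.
# All names and numbers here are illustrative, not from the original comment.

def bayes_posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]"""
    marginal = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / marginal

# A rare hypothesis (1% prior) with fairly diagnostic evidence:
posterior = bayes_posterior(prior=0.01, p_e_given_h=0.9, p_e_given_not_h=0.05)
print(round(posterior, 3))  # the evidence raises 1% to roughly 15%
```

The point of treating this as an axiom is that every belief update, not just textbook examples, is supposed to follow this arithmetic.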

Occam's razor could then be a theorem following from the above.

Plus there should be a block about decision theory, something like UDT.