Which LW / rationalist blog posts aren't covered by my books & courses?
post by iarwain1 · 2015-08-04T22:55:16.414Z · LW · GW · Legacy · 10 comments
I've read a few of the Sequences (probably about 50-100 individual posts), but I've only occasionally come away with insights and perspectives that I hadn't already thought of or read elsewhere. I've read a bunch of the popular books on cognitive science and decision theory, including everything on the CFAR popular books list. I'm also about to start an undergrad in statistics with a minor (or possibly a second major) in philosophy.
My question is: Are there specific LW posts / Sequences / other rationalist blog posts that I should read that won't be covered by standard statistics and philosophy courses, or by the books on CFAR's popular reading lists?
10 comments
Comments sorted by top scores.
comment by ScottL · 2015-08-05T02:40:46.790Z · LW(p) · GW(p)
That’s actually a tough question, because Eliezer and others tend to use new names for existing ideas (for example, 'the fallacy of the grey' instead of 'the continuum fallacy'), so I am not entirely sure which concepts have been covered elsewhere. Also, I think a lot of the value of Less Wrong posts comes from their getting you to think about an idea that you might not otherwise have thought to look into deeply, even if technically it was covered in the books you have read. For example, I would never have looked into Kent Berridge's work on wanting and liking if I hadn’t read lukeprog’s post on it.
The list below contains some of the concepts that I don't think are covered elsewhere. You can also go through the wikis, since you should know which topics you have already learnt:
- Friendly artificial intelligence – a superintelligence (i.e., a really powerful optimization process) that produces good, beneficial outcomes rather than harmful ones.
- Decision theories – theories invented by researchers associated with MIRI and LW: Timeless Decision Theory (TDT), Updateless Decision Theory (UDT), and Ambient Decision Theory (ADT, a variant of UDT).
- Affective death spiral - positive attributes of a theory, person, or organization combine with the Halo effect in a feedback loop, resulting in the subject of the affective death spiral being held in higher and higher regard.
- Chronophone – a parable meant to convey the idea that it’s really hard to get somewhere when you don't already know your destination. If there were some simple cognitive policy you could follow to spark moral and technological revolutions, without your home culture having advance knowledge of the destination, you could execute that cognitive policy today.
- Free will (explanation) – our algorithm's ability to determine our actions. People often get confused over free will because they picture themselves as being restrained by physics rather than part of it. Yudkowsky calls this view Requiredism, but most people just view it as essentially Compatibilism.
- Politics is the Mind-Killer – Politics is not a good area for rational debate. It is often about status and power plays where arguments are soldiers rather than tools to get closer to the truth.
- The map is not the territory – the idea that our perception of the world is being generated by our brain and can be considered as a 'map' of reality written in neural patterns. Reality exists outside our mind but we can construct models of this 'territory' based on what we glimpse through our senses.
- Probability is in the Mind - Probabilities express uncertainty, and it is only agents who can be uncertain. A blank map does not correspond to a blank territory. Ignorance is in the mind.
- Adaptation executors - Individual organisms are best thought of as adaptation-executers rather than as fitness-maximizers. Our taste buds do not find lettuce delicious and cheeseburgers distasteful once we are fed a diet too high in calories and too low in micronutrients. Taste buds are adapted to an ancestral environment in which calories, not micronutrients, were the limiting factor. Evolution operates on too slow a timescale to re-adapt to new conditions (such as a modern diet).
- Cached thought – is an answer that was arrived at by recalling a previously-computed conclusion, rather than performing the reasoning from scratch.
- Applause light - an empty statement which evokes positive affect without providing new information.
- Belief as attire – an example of an improper belief promoted by identification with a group or other signaling concerns, not by how well it reflects the territory.
- Belief as cheering - People can bind themselves as a group by believing "crazy" things together. Then among outsiders they could show the same pride in their crazy belief as they would show wearing "crazy" group clothes among outsiders. The belief is more like a banner saying "GO BLUES". It isn't a statement of fact, or an attempt to persuade; it doesn't have to be convincing—it's a cheer.
- Belief in belief (I believe this is Dennett's idea) - Where it is difficult to believe a thing, it is often much easier to believe that you ought to believe it. Were you to really believe and not just believe in belief, the consequences of error would be much more severe. When someone makes up excuses in advance, it would seem to require that belief, and belief in belief, have become unsynchronized.
- Counter man syndrome - wherein a person behind a counter comes to believe that they know things they don't know because, after all, they're the person behind the counter. So they can't just answer a question with "I don't know"... and thus they make something up, without really paying attention to the fact that they're making it up. Pretty soon, they don't know the difference between the facts and their made-up stories.
- Crisis of faith - a combined technique for recognizing and eradicating whole systems of mutually-supporting false beliefs. The technique involves systematic application of introspection, with the express intent to check the reliability of beliefs independently of the other beliefs that support them in the mind. The technique might be useful for the victims of affective death spirals, or any other systematic confusions, especially those supported by anti-epistemology.
- Making Beliefs Pay Rent - Every question of belief should flow from a question of anticipation, and that question of anticipation should be the centre of the inquiry. Every guess of belief should begin by flowing to a specific guess of anticipation, and should continue to pay rent in future anticipations. If a belief turns deadbeat, evict it.
- Least convenient possible world – is a technique for enforcing intellectual honesty, to be used when arguing against an idea. The essence of the technique is to assume that all the specific details will align with the idea against which you are arguing, i.e. to consider the idea in the context of a least convenient possible world, where every circumstance is colluding against your objections and counterarguments. This approach ensures that your objections are strong enough, running minimal risk of being rationalizations for your position.
- Rationalist taboo - a technique for fighting muddles in discussions. By prohibiting the use of a certain word and all the words synonymous to it, people are forced to elucidate the specific contextual meaning they want to express, thus removing ambiguity otherwise present in a single word. Mainstream philosophy has a parallel procedure called "unpacking" where doubtful terms need to be expanded out.
- Semantic stopsign – is a meaningless generic explanation that creates an illusion of giving an answer, without actually explaining anything.
↑ comment by ScottL · 2015-08-05T02:41:25.932Z · LW(p) · GW(p)
Some more:
- Shut up and multiply – the ability to trust the math even when it feels wrong.
- Most of science is actually done by induction - To come up with something worth testing, a scientist needs to do lots of sound induction first or borrow an idea from someone who already used induction. This is because induction is the only way to reliably find candidate hypotheses which deserve attention. Examples of bad ways to find hypotheses include finding something interesting or surprising to believe in and then pinning all your hopes on that thing turning out to be true.
- Twelve virtues of rationality
- Curiosity – the burning itch
- Relinquishment – “That which can be destroyed by the truth should be.” -P. C. Hodgell
- Lightness – follow the evidence wherever it leads
- Evenness – resist selective skepticism; use reason, not rationalization
- Argument – do not avoid arguing; strive for exact honesty; fairness does not mean balancing yourself evenly between propositions
- Empiricism – knowledge is rooted in empiricism and its fruit is prediction; argue what experiences to anticipate, not which beliefs to profess
- Simplicity – is virtuous in belief, design, planning, and justification; ideally: nothing left to take away, not nothing left to add
- Humility – take actions, anticipate errors; do not boast of modesty; no one achieves perfection
- Perfectionism – seek the answer that is perfectly right – do not settle for less
- Precision – the narrowest statements slice deepest; don’t walk but dance to the truth
- Scholarship – absorb the powers of science
- The void (the nameless virtue) – “More than anything, you must think of carrying your map through to reflecting the territory.”
- Oops - Theories must be bold and expose themselves to falsification; be willing to commit the heroic sacrifice of giving up your own ideas when confronted with contrary evidence; play nice in your arguments; try not to deceive yourself; and other fuzzy verbalisms. It is better to say oops quickly when you realize a mistake. The alternative is stretching out the battle with yourself over years.
- Explaining vs. explaining away – Explaining something does not subtract from its beauty; in fact, it heightens it. Through understanding it, you gain greater awareness of it. Through understanding it, you are more likely to notice its similarities and interrelationships with other things. Through understanding it, you become able to see it not only on one level, but on multiple levels. In regards to the delusions which people are emotionally attached to, that which can be destroyed by the truth should be.
- Ugh field - Pavlovian conditioning can cause humans to unconsciously flinch from even thinking about a serious personal problem they have. We call it an "ugh field". The ugh field forms a self-shadowing blind spot covering an area desperately in need of optimization.
- Privileging the question - questions that someone has unjustifiably brought to your attention in the same way that a privileged hypothesis unjustifiably gets brought to your attention. Examples are: should gay marriage be legal? Should Congress pass stricter gun control laws? Should immigration policy be tightened or relaxed? The problem with privileged questions is that you only have so much attention to spare. Attention paid to a question that has been privileged funges against attention you could be paying to better questions. Even worse, it may not feel from the inside like anything is wrong: you can apply all of the epistemic rationality in the world to answering a question like "should Congress pass stricter gun control laws?" and never once ask yourself where that question came from and whether there are better questions you could be answering instead.
- Something to protect - The Art must have a purpose other than itself, or it collapses into infinite recursion.
- Take joy in the merely real – If you believe that science coming to know about something places it into the dull catalogue of common things, then you're going to be disappointed in pretty much everything eventually: either it will turn out not to exist, or even worse, it will turn out to be real. Another way to think about it is that if the magical and mythical were commonplace, they would be merely real. If dragons were common, but zebras were a rare legendary creature, then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality. If we cannot take joy in the merely real, our lives shall be empty indeed.
- Complexity of value - the thesis that human values have high Kolmogorov complexity and so cannot be summed up or compressed into a few simple rules. It includes the idea of fragility of value which is the thesis that losing even a small part of the rules that make up our values could lead to results that most of us would now consider as unacceptable.
- Egan's law - "It all adds up to normality." — Greg Egan. The purpose of a theory is to add up to observed reality, rather than something else. Science sets out to answer the question "What adds up to normality?", and the answer turns out to be "Quantum mechanics adds up to normality." A weaker extension of this principle applies to ethical and meta-ethical debates, which generally ought to end up explaining why you shouldn't eat babies, rather than why you should.
- Emotion - Contrary to the stereotype, rationality doesn't mean denying emotion. When emotion is appropriate to the reality of the situation, it should be embraced; only when emotion isn't appropriate should it be suppressed.
- Litany of Gendlin – “What is true is already so. Owning up to it doesn't make it worse. Not being open about it doesn't make it go away. And because it's true, it is what is there to be interacted with. Anything untrue isn't there to be lived. People can stand what is true, for they are already enduring it.” —Eugene Gendlin
- Litany of Tarski – “If the box contains a diamond, I desire to believe that the box contains a diamond; If the box does not contain a diamond, I desire to believe that the box does not contain a diamond; Let me not become attached to beliefs I may not want.” —The Meditation on Curiosity
- Magic - What seems to humans like a simple explanation, sometimes isn't at all. In our own naturalistic, reductionist universe, there is always a simpler explanation. Any complicated thing that happens, happens because there is some physical mechanism behind it, even if you don't know the mechanism yourself (which is most of the time). There is no magic.
- Words can be wrong – There are many ways that words can be wrong; it is for this reason that we should avoid arguing by definition. Instead, to facilitate communication we can taboo and reduce: we can replace the symbol with the substance and talk about facts and anticipations, not definitions.
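The "shut up and multiply" entry above is about trusting an explicit expected-value calculation over gut feel. A minimal sketch of what that calculation looks like (the numbers and the `expected_value` helper are purely hypothetical illustrations, not from the post):

```python
# "Shut up and multiply": when intuition and arithmetic disagree about
# which option is better, trust the arithmetic. All numbers are made up.

def expected_value(outcomes):
    """Sum of probability * value over (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# Option A: a certain, modest benefit.
option_a = [(1.0, 40.0)]          # guaranteed 40 units of good
# Option B: a tiny chance of a very large benefit.
option_b = [(0.001, 50000.0)]     # 0.1% chance of 50,000 units

ev_a = expected_value(option_a)   # 40.0
ev_b = expected_value(option_b)   # about 50.0

# B "feels" worse (it almost always pays nothing), but the
# multiplication says it has the higher expected value.
best = "B" if ev_b > ev_a else "A"
```

The point of the exercise is not the toy numbers but the habit: write the probabilities down and multiply, rather than letting the vividness of the certain option decide.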
comment by IffThen · 2015-08-06T21:40:57.106Z · LW(p) · GW(p)
Slightly off from what you asked, but the CFAR list looks suboptimal. I would add The Invisible Gorilla (And Other Ways Our Intuitions Deceive Us) by Christopher Chabris and Daniel Simons. It is more thorough and more generally applicable than Predictably Irrational.
If people have other recommendations for books that are better than (or highly complementary to) the books on the CFAR lists, I would be interested in hearing them.
comment by [deleted] · 2015-08-06T03:30:46.889Z · LW(p) · GW(p)
Some career advice: skip the philosophy minor. Minors are only worth it if they're related to an area you're considering for grad school. Biology or math minors give you a lot of options (although not math in your case, obviously, given your statistics major). If you want to take those classes as electives and just end up with the minor, fine, but don't go in planning on extra coursework because you think the minor will help you; it won't.
↑ comment by [deleted] · 2015-08-07T07:55:33.586Z · LW(p) · GW(p)
Minors are only worth it if they're related to an area you're considering for grad school.
Philosophy will be fun until you realize that 80% or so of what you are studying is dusty old stuff from times when things like Bayesianism, quantum mechanics, and evolution were unknown or ignored. Skipping all that is culturally ignorant, but sometimes you are probably better off ignoring the artificial boundaries and hurdles that our culture sets up and going straight to the interesting things.
comment by [deleted] · 2015-08-05T01:13:32.805Z · LW(p) · GW(p)
I'd recommend reading the academic books that CFAR lists; that will get you further than reading more blog posts or more popular books.
↑ comment by Vaniver · 2015-08-05T13:51:35.890Z · LW(p) · GW(p)
That's missing the point of this question, which is "if I read all the academic books on CFAR's list, are there blog posts that contain things I will not have seen in the books?"