A summary of every "Highlights from the Sequences" post

post by Akash (akash-wasil) · 2022-07-15T23:01:04.392Z · LW · GW · 7 comments

Contents

  1
  2
  3
    Thinking Better on Purpose
      The lens that sees its flaws
      What do we mean by “Rationality”?
      Humans are not automatically strategic
      Use the try harder, Luke
      Your Strength as a Rationalist
      The meditation on curiosity
      The importance of saying “Oops”
      The martial art of rationality
      The twelve virtues of rationality
    Pitfalls of Human Cognition
      The Bottom Line
      Rationalization
      You can Face Reality
      Is that your true rejection?
      Avoiding your beliefs’ real weak points
      Belief as Attire
      Dark side epistemology
      Cached Thoughts
      The Fallacy of Gray
      Lonely Dissent
      Positive Bias: Look into the Dark
      Knowing about biases can hurt people
    The Laws Governing Belief
      Making beliefs pay (in anticipated experiences)
      What is evidence?
      Scientific Evidence, Legal Evidence, Rational Evidence
      How much evidence does it take?
      Absence of Evidence is Evidence of Absence
      Conservation of Expected Evidence
      Argument Screens Off Authority
      The Second Law of Thermodynamics, and Engines of Cognition
      Toolbox-thinking and Law-thinking
      Local validity as a key to sanity and civilization
    Science Isn't Enough
      Hindsight devalues science
      Science doesn’t trust your rationality
      When science can’t help
      No safe defense, not even science
    Connecting Words to Reality
      Taboo your words
      Dissolving the question
      Say not “complexity”
      Mind projection fallacy
      How an algorithm works from the inside
      37 ways that words can be wrong
      Expecting short inferential distances
      Illusion of transparency: Why no one understands you
    Why We Fight
      Something to protect
      The gift we give to tomorrow
      On Caring
      Tsuyoku Naritai! (I Want To Become Stronger)
      A sense that more is possible

1

I recently finished reading Highlights from the Sequences [? · GW], 49 essays from The Sequences [? · GW] that were compiled by the LessWrong team. 

Since moving to Berkeley several months ago, I’ve heard many people talking about posts from The Sequences. A lot of my friends and colleagues commonly reference biases, have respect for Bayes’ Rule, and say things like “absence of evidence is evidence of absence!”

So, I was impressed that the Highlights were not merely a refresher of things I had already absorbed through the social waters. There was plenty of new material, and there were also plenty of moments when a concept became much crisper in my head. It’s one thing to know that dissent is hard, and it’s another thing to internalize that lonely dissent doesn’t feel like going to school dressed in black— it feels like going to school wearing a clown suit [? · GW].

2

As I read, I wrote a few sentences summarizing each post. I mostly did this to improve my own comprehension/memory.

You should treat the summaries as "here's what Akash took away from this post" as opposed to "here's an actual summary of what Eliezer said."

Note that the summaries are not meant to replace the posts. Read them here [? · GW].

3

Here are my notes on each post, in order. I also plan to post a reflection on some of my favorite posts.

Thinking Better on Purpose

The lens that sees its flaws [? · GW]

What do we mean by “Rationality”? [? · GW]

Humans are not automatically strategic [? · GW]

Use the try harder, Luke [? · GW]

Your Strength as a Rationalist [? · GW]

The meditation on curiosity [? · GW]

The importance of saying “Oops” [? · GW]

The martial art of rationality [? · GW]

The twelve virtues of rationality [? · GW]

Pitfalls of Human Cognition

The Bottom Line [? · GW]

Rationalization [? · GW]

You can Face Reality [? · GW]

Is that your true rejection? [? · GW]

Avoiding your beliefs’ real weak points [? · GW]

Belief as Attire [? · GW]

Dark side epistemology [? · GW]

Cached Thoughts [? · GW]

The Fallacy of Gray [? · GW]

Lonely Dissent [? · GW]

Positive Bias: Look into the Dark [? · GW]

Knowing about biases can hurt people [? · GW]

The Laws Governing Belief

Making beliefs pay (in anticipated experiences) [? · GW]

What is evidence? [? · GW]

Scientific Evidence, Legal Evidence, Rational Evidence [? · GW]

How much evidence does it take? [? · GW]

Absence of Evidence is Evidence of Absence [? · GW]

Conservation of Expected Evidence [? · GW]

Argument Screens Off Authority [? · GW]

An Intuitive Explanation of Bayes’s Theorem (Bayes Rule Guide)

The Second Law of Thermodynamics, and Engines of Cognition [? · GW]

  1. There is a relationship between information-processing (your uncertainty about the state of a system) and thermodynamic entropy (roughly, the disorder of its particles). Total entropy cannot decrease: if your information-theoretic uncertainty about a system drops by X, then thermodynamic entropy elsewhere must rise by at least X. (A rough numerical sketch of this bookkeeping follows the list.)
  2. If you knew a cup of water was 72 degrees, and then you learned the positions and velocities of all its particles, that would decrease the thermodynamic entropy of the water. So the fact that we learned about the water makes the water colder. What???
  3. Also, if we could do this, we could build strange new kinds of refrigerators and convert warm water into ice cubes while extracting electricity (rather than consuming it). Huh?
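
Here is a minimal numerical sketch of the bookkeeping in point 1, assuming Landauer-style accounting (at least k·ln 2 of thermodynamic entropy exported per bit of information gained). The glass size, molecule count, and bits-per-molecule figures are illustrative assumptions, not numbers from the post:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def min_entropy_cost(bits_learned: float) -> float:
    """Minimum thermodynamic entropy (J/K) that must be generated elsewhere
    when an observer gains `bits_learned` bits of information about a system
    (Landauer-style bookkeeping: k_B * ln(2) per bit)."""
    return bits_learned * K_B * math.log(2)

def min_heat_dissipated(bits_learned: float, temperature_k: float) -> float:
    """Minimum heat (J) that must be dumped into an environment at
    `temperature_k` kelvin to pay for learning `bits_learned` bits."""
    return min_entropy_cost(bits_learned) * temperature_k

# Illustrative (assumed) numbers: a 250 mL glass of water holds roughly 8e24
# molecules, and pinning down each one's position and velocity to useful
# precision costs at least on the order of a hundred bits.
bits = 8e24 * 100
print(f"entropy generated elsewhere >= {min_entropy_cost(bits):.2e} J/K")
print(f"heat dumped at 295 K        >= {min_heat_dissipated(bits, 295.0):.2e} J")
```

Under these assumptions the exported entropy comfortably exceeds the (order of 10^3 J/K) thermodynamic entropy the glass of water had to begin with, which is the sense in which the books always balance.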

Toolbox-thinking and Law-thinking [? · GW]

Local validity as a key to sanity and civilization [? · GW]

Science Isn't Enough

Hindsight devalues science [? · GW]

Science doesn’t trust your rationality [? · GW]

When science can’t help [? · GW]

No safe defense, not even science [? · GW]

Connecting Words to Reality

Taboo your words [? · GW]

Dissolving the question [? · GW]

Say not “complexity” [? · GW]

Mind projection fallacy [? · GW]

How an algorithm works from the inside [? · GW]

37 ways that words can be wrong [? · GW]

Expecting short inferential distances [? · GW]

Illusion of transparency: Why no one understands you [? · GW]

Why We Fight

Something to protect [? · GW]

The gift we give to tomorrow [? · GW]

On Caring [? · GW]

Tsuyoku Naritai! (I Want To Become Stronger) [? · GW]

A sense that more is possible [? · GW]

7 comments


comment by gjm · 2022-07-17T18:23:09.880Z · LW(p) · GW(p)

On the Second Law of Thermodynamics one: Eliezer does something a bit naughty in this one which has the effect of making it look weirder than it is. He says: suppose you have a glass of water, and suppose somehow you magically know the positions and velocities of all the molecules in it. And then he says: "Does that make its thermodynamic entropy zero? Is the water colder, because we know more about it?", and answers yes. But those are two questions, not one question; and while the answer to the first question may be yes (I am not an expert on thermodynamics and I haven't given sufficient thought to what happens to the thermodynamic definition of entropy in this exotic situation), the answer to the second question is not.

The water contains the same amount of heat energy as if you didn't know all that information. It is not colder.

What is different from if you didn't know all that information is that you have, in some sense, the ability to turn the water into ice plus electricity. You can make it cold and get the extracted energy in usable form, whereas without the information you couldn't do that and in fact would have to supply energy on net to make the water cold.

("In some sense" because of course if you gave me a glass of water and a big printout specifying the states of all its molecules I couldn't in fact use that to turn the water into ice plus electrical energy. But in principle, given fancy enough machinery, perhaps I could.)

Another way of looking at it: if I have a glass of room-temperature water and complete information about how its molecules are moving, it's rather like having a mixture of ice and steam containing the same amount of thermal energy, as if the molecules had been physically separated into fast and slow rather than merely itemized in a list of positions and velocities.
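
To put rough numbers on that analogy (a back-of-the-envelope sketch using textbook latent-heat figures; the 1 kg of 22 °C water is an assumed example, not something from the comment):

```python
# What mixture of ice (0 C) and steam (100 C) carries the same thermal energy
# as 1 kg of liquid water at 22 C? (All energies measured relative to ice at 0 C.)
C_WATER = 4.186e3   # specific heat of liquid water, J/(kg*K)
L_FUSION = 3.34e5   # latent heat of fusion, J/kg
L_VAPOR = 2.257e6   # latent heat of vaporization, J/kg

mass_kg = 1.0       # assumed glass of room-temperature water
temp_c = 22.0

# Liquid water at 22 C, relative to ice at 0 C: melt it, then warm it.
energy_water = mass_kg * (L_FUSION + C_WATER * temp_c)

# Steam at 100 C, relative to ice at 0 C: melt, warm to 100 C, then vaporize.
energy_steam_per_kg = L_FUSION + C_WATER * 100.0 + L_VAPOR

steam_fraction = energy_water / energy_steam_per_kg
print(f"energy-equivalent mixture: {steam_fraction:.0%} steam, "
      f"{1 - steam_fraction:.0%} ice")
# Prints roughly "14% steam, 86% ice" -- the sorted version of the same energy.
```

This is only an energy-bookkeeping analogy, of course; as noted above, the water's actual heat content is unchanged by the information.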

Replies from: JBlack
comment by JBlack · 2022-07-18T10:59:46.841Z · LW(p) · GW(p)

Another difference from separated hot & cold reservoirs is that the time horizon for making use of the information is on the order of nanoseconds before the information becomes useless. Even setting aside quantum messiness and assuming perfect billiard-ball molecules, just a few stray thermal photons from outside and a few collisions will scramble the speeds and angles hopelessly.

As far as temperature goes, it is really undefined, since the energy in the water is not thermal from the point of view of the extremely well-informed observer. It has essentially zero entropy, like the kinetic energy of a car or that of a static magnetic field. If you go ahead and try to define it using statistical mechanics anyway, you get a division-by-zero error: temperature is the marginal ratio of energy to entropy, and the entropy is an unchanging zero regardless of energy.
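
For concreteness, the standard statistical-mechanics relation being gestured at (a textbook definition, nothing specific to this thread) is

$$\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V},$$

so if the fully informed observer assigns S(E) = 0 for every value of E, the derivative vanishes and T comes out as 1/0, i.e. undefined.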

Replies from: gjm
comment by gjm · 2022-07-18T11:37:28.716Z · LW(p) · GW(p)

I think that last bit only applies if we suppose that you are equipped not only with a complete specification of the state of the molecules but with a constantly instantly updating such specification. Otherwise, if you put more energy in then the entropy will increase too and you can say T = dE/dS just fine even though the initial entropy is zero. (But you make a good point about the energy being not-thermal from our near-omniscient viewpoint.)

Replies from: JBlack
comment by JBlack · 2022-07-20T09:09:17.333Z · LW(p) · GW(p)

If you just have a snapshot state (even with an ideal model of internal interactions from that state) then any thermal contact with the outside will almost instantly raise entropy to near maximum regardless of whether energy is added or removed or on balance unchanged. I don't think it makes sense to talk about temperature there either, since the entropy is not a function of energy and does not co-vary with it in any smooth way.

comment by qazzquimby (torendarby@gmail.com) · 2022-08-09T00:02:22.596Z · LW(p) · GW(p)

Wow, I wish I had searched before beginning my own summary project. [LW · GW]

The projects aren't quite interchangeable though. Mine are significantly longer than these, but are intended to be acceptable replacements for the full text, for less patient readers.

comment by Henry Prowbell · 2022-07-18T13:26:14.597Z · LW(p) · GW(p)

Does anybody know if the Highlights From The Sequences are compiled in ebook format anywhere?

Something that takes 7 hours to read, I want to send to my Kindle and read in a comfy chair.

And maybe even have audio versions on a single podcast feed to listen to on my commute.

(Yes, I can print out the list of highlighted posts and skip to those chapters of the full ebook manually, but I'm thinking about user experience, the impact of trivial inconveniences, and what would make LessWrong even more awesome.)

comment by Charbel-Raphaël (charbel-raphael-segerie) · 2022-07-17T10:54:09.770Z · LW(p) · GW(p)

Great summary. I read the full Sequences 4 years ago, but this was a nice refresher.

I also recommend going to the concept lists from time to time, focusing on 4-5 random tags, and trying to generate or remember some thoughts about them.