Posts

Entropy and Temperature 2014-12-17T08:04:38.208Z · score: 28 (28 votes)

Comments

Comment by spxtr on January 2016 Media Thread · 2016-01-02T21:19:44.343Z · score: 2 (2 votes) · LW · GW

Visual Information Theory. I was already comfortable with information theory and this was still informative. This blogger's other posts are similarly high-quality.

Comment by spxtr on January 2016 Media Thread · 2016-01-02T21:11:33.712Z · score: 1 (1 votes) · LW · GW

Ghost - Meliora

Clean production, catchy melodies, interesting lyrics, and so on. It doesn't get old, either. My favorite songs are Cirice and He Is. I would have mistaken the latter for a catchy Christian metal song if not for the lyrics. I mean, I guess it still is Christian metal, just not the usual way around.

Comment by spxtr on January 2016 Media Thread · 2016-01-02T21:02:06.073Z · score: 4 (4 votes) · LW · GW

In the end, it is just another Abrams movie: slick, SFX-heavy, and as substantial & satisfying as movie theater popcorn.

Yep.

You might want to add a spoiler note at the top, though.

Comment by spxtr on July 2015 Media Thread · 2015-07-03T04:15:28.150Z · score: 1 (1 votes) · LW · GW

God is an Astronaut put out a new album recently called Helios | Erebus. Some songs hit harder than All is Violent, but it's otherwise similar. I highly recommend.

Comment by spxtr on Agency is bugs and uncertainty · 2015-06-08T04:20:46.113Z · score: 1 (1 votes) · LW · GW

It might be wishful thinking, but I feel like my smash experience improved my meatspace-agency as well.

Comment by spxtr on Agency is bugs and uncertainty · 2015-06-06T05:37:32.310Z · score: 13 (13 votes) · LW · GW

Story time! Shortly after Brawl came out, I got pretty good at it. I could beat all my friends without much effort, so I decided to enter a local tournament. In my first round I went up against the best player in my state, and I managed to hit him once, lightly, over the course of two games. We later became pretty good friends, and I practiced with him regularly.

At some point I completely eclipsed my non-competitive friends, to the extent that playing with them felt like a chore. All I had to do was put them in certain situations where I knew how they would react and then punish. It became a simple algorithm. Get behind them in shield, wait for the roll, punish. Put them in the air, jump after them, wait for the airdodge, punish. Throw them off the ledge, wait for the jump, punish. It felt like I was playing a CPU.

Meanwhile, I still couldn't reliably beat the best player from my state. One day, after he took off a particularly gruesome stock, I paused and, exasperated, asked for advice. We watched a replay and he showed me how I responded to certain situations in the same way every time, leading to a punish. My habits were less obvious than those of my friends, but they were still habits. He said, "you play like a robot, in a bad way."

So yeah. In that context, I've downgraded friends to CPUs because of their predictability, and been downgraded to a CPU by omega, because of my predictability.

Comment by spxtr on June 2015 Media Thread · 2015-06-02T05:12:34.088Z · score: 1 (1 votes) · LW · GW

Peste Noire - La Chaise-Dyable. French black metal.

Comment by spxtr on On immortality · 2015-04-11T03:15:10.511Z · score: 3 (3 votes) · LW · GW

An exact copy of me may be "me" from an identity perspective, but it is a separate entity from a utilitarian perspective. The death of one is still a tragedy, even if the other survives.

You should know this intuitively. If a rogue trolley is careening toward an unsuspecting birthday cake, you'll snatch it out of the way. You won't just say, "eh, in another time that cake will survive," and then watch it squish. Unless you're some sort of monster.

Comment by spxtr on Rationality: From AI to Zombies · 2015-03-16T07:49:42.190Z · score: 7 (7 votes) · LW · GW

I am impressed. The production quality on this is excellent, and the new introduction by Rob Bensinger is approachable for new readers. I will definitely be recommending this over the version on this site.

Comment by spxtr on Harry Potter and the Methods of Rationality discussion thread, March 2015, chapter 120 · 2015-03-13T02:23:31.867Z · score: 9 (9 votes) · LW · GW

I didn't want to tell it to you before because I thought it might prejudice your decision unfairly.

If Draco has had the last half-hour of his memory sealed off, then why does Harry say these words to him? Shouldn't Draco respond, "What decision?"

Unless it's a more nuanced memory charm, such that he only subconsciously remembers the conversation.

Comment by spxtr on [LINK] The Wrong Objections to the Many-Worlds Interpretation of Quantum Mechanics · 2015-02-19T20:54:01.518Z · score: 3 (3 votes) · LW · GW

If you have a different version of QM (perhaps what Ted Bunn has called a “disappearing-world” interpretation), it must somehow differ from MWI, presumably by either changing the above postulates or adding to them. And in that case, if your theory is well-posed, we can very readily test those proposed changes. In a dynamical-collapse theory, for example, the wave function does not simply evolve according to the Schrödinger equation; it occasionally collapses (duh) in a nonlinear and possibly stochastic fashion. And we can absolutely look for experimental signatures of that deviation, thereby testing the relative adequacy of MWI vs. your collapse theory.

He asserts that such an experiment exists. I would love it if he were to expand on this assertion.

Comment by spxtr on An alarming fact about the anti-aging community · 2015-02-18T20:28:47.923Z · score: 1 (1 votes) · LW · GW

I recommend reading the sequences, if you haven't already. In particular, the fun theory sequence discusses exactly these issues.

Comment by spxtr on An alarming fact about the anti-aging community · 2015-02-18T08:54:12.271Z · score: 0 (0 votes) · LW · GW

Welcome! LessWrong is generally anti-Death. See HPMoR or The Fable of the Dragon-Tyrant.

Comment by spxtr on Open thread, Feb. 16 - Feb. 22, 2015 · 2015-02-17T23:53:12.785Z · score: 1 (1 votes) · LW · GW

This is a little misleading. Feynman diagrams are simple, sure, but they represent difficult calculations that weren't understood at the time he invented them. There was certainly genius involved, not just perseverance.

Much more likely his IQ result was unreliable, as gwern thinks.

Comment by spxtr on Open thread, Feb. 16 - Feb. 22, 2015 · 2015-02-17T19:53:24.324Z · score: 13 (13 votes) · LW · GW

Feynman was younger than 15 when he took it, and very near this factoid in Gleick's bio, he recounts Feynman asking about very basic algebra (2^x=4) and wondering why anyone found it hard - the IQ is mentioned immediately before the section on 'grammar school', or middle school, implying that the 'school IQ test' was done well before he entered high school, putting him at much younger than 15. (15 is important because Feynman had mastered calculus by age 15, Gleick says, so he wouldn't be asking his father why algebra is useful at age >15.)

- Given that Feynman was born in 1918, this implies the IQ test was done around 1930 or earlier.
- Given that it was done by the New York City school district, this implies also that it was one of the 'ratio'-based IQ tests - utterly outdated and incorrect by modern standards.
- Finally, it's well known that IQ tests are very unreliable in childhood; kids can easily bounce around compared to their stable adult scores.

So, it was a bad test, which even under ideal circumstances is unreliable & prone to error, and administered in a mass fashion and likely not by a genuine psychometrician.

-- gwern

Comment by spxtr on Quotes Repository · 2015-02-10T04:59:04.125Z · score: 5 (5 votes) · LW · GW

The idea is that it's not specifically for quotes related to rationality or other LessWrong topics.

Comment by spxtr on Open Thread, Feb. 2 - Feb 8, 2015 · 2015-02-02T10:25:33.273Z · score: 1 (1 votes) · LW · GW

Then you put the tea and the water in thermal contact. Now, for every possible microstate of the glass of water, the combined system evolves to a single final microstate (only one, because you know the exact state of the tea).

After you put the glass of water in contact with the cup of tea, you will quickly become uncertain about the state of the tea. In order to still know the microstate, you need to be fed more information.

Comment by spxtr on The Role of Attractiveness in Mate Selection: Individual Variation · 2015-01-27T04:10:17.696Z · score: 1 (1 votes) · LW · GW

That's probably it. When fitting a line using MCMC you'll get an anticorrelated blob of probabilities for slope and intercept, and if you plot the one-deviation band from those fit parameters, you get something that looks like this. I'd guess this is a non-parametric analogue of that. Notice how both grow significantly at the edges of the plots.
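A quick numerical sketch of the effect (not from the original comment; the data, noise level, and the noise-resampling scheme standing in for MCMC draws are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1, 10, 20)                       # toy data, away from x = 0
y = 2.0 * x + 1.0 + rng.normal(0, 1.0, x.size)

# Refit on many noise-resampled datasets as a crude stand-in for MCMC draws.
slopes, intercepts = [], []
for _ in range(2000):
    m, b = np.polyfit(x, y + rng.normal(0, 1.0, x.size), 1)
    slopes.append(m)
    intercepts.append(b)
slopes, intercepts = np.array(slopes), np.array(intercepts)

# The blob of (slope, intercept) draws is anticorrelated...
print(np.corrcoef(slopes, intercepts)[0, 1])     # clearly negative

# ...so the one-deviation band of the fitted line is narrow in the middle
# of the data and grows toward the edges.
grid = np.array([1.0, 5.5, 10.0])
band = (slopes[:, None] * grid + intercepts[:, None]).std(axis=0)
print(band)                                      # middle value is the smallest
```

The anticorrelation comes from the data sitting away from x = 0: tilting the line up forces the intercept down to keep it passing through the data.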

Comment by spxtr on Open thread, Jan. 26 - Feb. 1, 2015 · 2015-01-27T03:40:51.318Z · score: 8 (8 votes) · LW · GW

Quantum mysticism written on a well-known and terrible MRA blog? -8 seems high. See the quantum sequence if you haven't already. It looks like advancedatheist and ZankerH got some buddies to upvote all of their comments, though. They all jumped by ~12 in the last couple hours.

For real, though, this is actually useless and deserves a very low score.

Comment by spxtr on What are the resolution limits of medical imaging? · 2015-01-26T00:58:15.722Z · score: 3 (3 votes) · LW · GW

Super-resolution microscopy is an interesting recent development that won the Nobel Prize in Chemistry last year. Here's another article on the subject. It has been used to image mouse brains, but only near the surface. It won't be able to view the interior of any brain, but it's still interesting.

Comment by spxtr on The Role of Attractiveness in Mate Selection: Individual Variation · 2015-01-25T21:16:52.413Z · score: 1 (1 votes) · LW · GW

What's the shaded area in the very first plot? Usually this area is one deviation around the fit line, but here it's clearly way too small to be that.

Comment by spxtr on Entropy and Temperature · 2015-01-22T20:00:24.680Z · score: 1 (1 votes) · LW · GW

You probably know this, but average energy per molecule is not temperature at low temperatures: quantum effects kick in and that definition fails. dS/dE never lets you down.

Comment by spxtr on Entropy and Temperature · 2015-01-22T04:07:26.470Z · score: 0 (0 votes) · LW · GW

It won't be ice. Ice has a regular crystal structure, and if you know the microstate you know that the water molecules aren't in that structure.

Comment by spxtr on Open thread, Jan. 19 - Jan. 25, 2015 · 2015-01-21T20:41:37.342Z · score: 0 (0 votes) · LW · GW

Expanding on the billiard ball example: let's say one part of the wall of the pool table adds some noise to the trajectory of the balls that bounce off that spot, but doesn't sap energy from them on average. After a while we won't know the exact positions of the balls at an arbitrary time given only their initial positions and momenta. That is, entropy has entered our system through that part of the wall. I know this language makes it sound like entropy is in the system, flowing about, but if we knew the exact shape of the wall at that spot then it wouldn't happen.

Even with this entropy entering our system, the energy remains constant. This is why total energy is a wonderful macrovariable for this system. Systems where this works are usually easily solved as a microcanonical ensemble. If, instead, that wall spot was at a fixed temperature, we would use the canonical ensemble.
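A hedged simulation of that noisy-wall picture (my own toy numbers, nothing from the thread): run many identical copies of a one-ball table, and let one wall scatter the ball into a random outgoing direction at the same speed. Energy stays exactly fixed in every copy, while our uncertainty about position — the entropy — grows from zero:

```python
import numpy as np

rng = np.random.default_rng(2)
M, L = 5000, 1.0                        # M identical copies of a one-ball table

pos = np.full((M, 2), 0.3)              # every copy starts in the SAME microstate
vel = np.tile([0.6, 0.45], (M, 1))
speed0 = float(np.linalg.norm(vel[0]))

dt = 0.01
for _ in range(3000):
    pos += vel * dt
    # ordinary mirror reflections at the top, bottom, and right walls
    for ax in (0, 1):
        hi = pos[:, ax] > L
        pos[hi, ax] = 2 * L - pos[hi, ax]
        vel[hi, ax] *= -1
    lo_y = pos[:, 1] < 0
    pos[lo_y, 1] *= -1
    vel[lo_y, 1] *= -1
    # the noisy patch at x = 0: reflect, but randomize the outgoing direction
    # while keeping the speed, so no energy enters or leaves
    noisy = pos[:, 0] < 0
    k = int(noisy.sum())
    if k:
        pos[noisy, 0] *= -1
        theta = rng.uniform(-np.pi / 2, np.pi / 2, k)
        vel[noisy, 0] = speed0 * np.cos(theta)
        vel[noisy, 1] = speed0 * np.sin(theta)

# Energy is identical across all copies...
print(np.ptp(np.sum(vel**2, axis=1)))   # essentially zero spread
# ...but we no longer know where the ball is: positions have spread out.
print(pos.std(axis=0))                  # order L, not 0
```

This is exactly why total energy works as a macrovariable here: the noise scrambles our knowledge of the microstate without touching the energy.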

Comment by spxtr on Open thread, Jan. 19 - Jan. 25, 2015 · 2015-01-21T06:32:11.886Z · score: 0 (0 votes) · LW · GW

Probability is in the Mind.

Comment by spxtr on Open thread, Jan. 19 - Jan. 25, 2015 · 2015-01-21T05:01:33.030Z · score: 1 (1 votes) · LW · GW

An easy toy system is a collection of perfect billiard balls on a perfect pool table, that is, one without rolling friction and where all collisions conserve energy. For a few billiard balls it would be quite easy to extract all of their energy as work if you know their initial positions and velocities. There are plenty of ways to do it, and it's fun to think of them. This means they are at zero temperature.

If you don't know the microstate, but you do know the sum of the square of their velocities, which is a constant in all collisions, you can still tell some things about the process. For instance, you can predict the average number of collisions with one wall and the corresponding energy, related to the pressure. If you stick your hand on the table for five seconds, what is the chance you get hit by a ball moving faster than some value that will cause pain? All these things are probabilistic.

In the limit of tiny billiard balls compared to pool table size, this is the ideal gas.
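A sketch of the wall-collision claim (all parameters invented for illustration): in the non-interacting limit we can count right-wall hits exactly from each ball's trajectory, and compare with the statistical estimate of about |v|T/2L hits per ball, which needs only the speeds, not the positions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, T = 1000, 1.0, 50.0   # balls, box length, observation time (made up)

# A microstate: positions in [0, L] and velocities. The statistical
# prediction below only needs the speeds.
x = rng.uniform(0, L, N)
v = rng.normal(0, 1.0, N)

# Statistical estimate: a ball of speed |v| returns to the right wall
# every 2L/|v|, so it hits it about |v|*T/(2L) times.
predicted = np.sum(np.abs(v)) * T / (2 * L)

# Exact count from the microstate: unfold the bouncing trajectory onto a
# line; right-wall hits are crossings of L (mod 2L) by the unfolded path.
u0, uT = x, x + v * T
lo, hi = np.minimum(u0, uT), np.maximum(u0, uT)
hits = np.floor((hi - L) / (2 * L)) - np.floor((lo - L) / (2 * L))

print(predicted, hits.sum())   # close agreement
```

With the full microstate the count is deterministic; with only aggregate knowledge, the same number comes out as an average over our uncertainty. That is the whole statistical-mechanics move in miniature.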

Comment by spxtr on Open thread, Jan. 19 - Jan. 25, 2015 · 2015-01-21T04:53:49.507Z · score: 0 (0 votes) · LW · GW

Entropy is in the mind in exactly the same sense that probability is in the mind. See the relevant Sequence post if you don't know what that means.

The usual ideal gas model is that collisions are perfectly elastic, so even if you do factor in collisions they don't actually change anything. Interactions such as van der Waals have been factored in. The ideal gas approximation should be quite close to the actual value for gases like Helium.

Comment by spxtr on Open thread, Jan. 19 - Jan. 25, 2015 · 2015-01-19T21:19:25.924Z · score: 3 (3 votes) · LW · GW

I agree with passive_fist, and my argument hasn't changed since last time.

If we learn that energy changes in some process, then we are wrong about the laws that the system is obeying. If we learn that entropy goes down, then we can still be right about the physical laws, as Jaynes shows.

Another way: if we know the laws, then energy is a function of the individual microstate and nothing else, while entropy is a function of our probability distribution over the microstates and nothing else.

Comment by spxtr on Open thread, Jan. 19 - Jan. 25, 2015 · 2015-01-19T06:52:16.718Z · score: 2 (2 votes) · LW · GW

I made a post about this a month or so ago. Yay!

Comment by spxtr on Some recent evidence against the Big Bang · 2015-01-08T04:32:19.735Z · score: 0 (0 votes) · LW · GW

Great! It will certainly be accepted for publication in a peer-reviewed journal. The author will most likely win a Nobel Prize for his work and be hired to work at the top institution of his choice.

Comment by spxtr on Some recent evidence against the Big Bang · 2015-01-07T08:42:17.080Z · score: 2 (4 votes) · LW · GW

At this point I have to stop and ask for your credentials in astronomy. The link you posted reeks strongly of crackpot, and it's most likely not worth my time to study. Maybe you've studied cosmology in detail and think differently? If you think the author is wrong about their pet theory of general relativity, why do you think they're right in their disproof of LCDM?

Comment by spxtr on Some recent evidence against the Big Bang · 2015-01-07T06:24:26.264Z · score: 2 (2 votes) · LW · GW

Astronomy is extremely difficult. We don't know the relevant fundamental physics, and we can't perform direct experiments on our subjects. We should expect numerous problems with any cosmological model that we propose at this point. The only people who are certain of their cosmologies are the religious.

You need to do a lot more work for this sort of post to be useful. Cherry-picking weak arguments spread across the entire field of astronomy isn't enough.

Comment by spxtr on January 2015 Media Thread · 2015-01-01T05:53:00.735Z · score: 2 (2 votes) · LW · GW

Best songs of 2014.

Rock:

Pop:

Post-rock:

Metal:

Comment by spxtr on Entropy and Temperature · 2014-12-23T07:45:45.737Z · score: 1 (3 votes) · LW · GW

I'd think that probability is either in the map or in the territory

Even if Copenhagen is right, I, as a rational agent, still ought to use mind-probabilities. It may be the case that the quantum world is truly probabilistic-in-the-territory, but that doesn't affect the fact that I don't know the state of any physical system precisely.

Comment by spxtr on Entropy and Temperature · 2014-12-23T07:42:57.709Z · score: 2 (2 votes) · LW · GW

If we inject some air into a box then close our eyes for a few seconds and shove in a partition, there is a finite chance of finding the nitrogen on one side and the oxygen on the other. Entropy can decrease, and that's allowed by the laws of physics. The internal energy had better not change, though. That's disallowed.

If energy changes, our underlying physical laws need to be reexamined. In the official dogma, these are firmly in the territory. If entropy goes down, our map was likely wrong.
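For a sense of scale, a back-of-the-envelope version (cruder than the nitrogen/oxygen sorting — just ask for all N molecules in the left half): the probability is 2^-N and the entropy drop is N·k_B·ln 2. Hypothetical numbers:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Probability that all N molecules land in the left half: P = 2**-N.
for N in (10, 100, 1e22):
    print(f"N = {N:g}: P = 10^{-N * math.log10(2):.4g}")

# The entropy drop for that fluctuation, N * k_B * ln 2, is small in SI
# units even for a macroscopic N -- the second law is statistical, not
# a hard constraint like energy conservation.
dS = 1e22 * k_B * math.log(2)
print(dS)   # about 0.096 J/K
```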

Comment by spxtr on Entropy and Temperature · 2014-12-19T22:37:04.460Z · score: 0 (0 votes) · LW · GW

The rule that all microstates that are consistent with a given macrostate are equally probable is a consequence of the maximum entropy principle. See this Jaynes paper.

Comment by spxtr on Entropy and Temperature · 2014-12-19T22:35:49.428Z · score: 2 (2 votes) · LW · GW

Omega tells us the state of the water at time T=0, when we put the beer into it. There are two ways of looking at what happens immediately after.

The first way is that the water doesn't flow heat into the beer, rather it does some work on it. If we know the state of the beer/water interface as well then we can calculate exactly what will happen. It will look like quick water molecules thumping into slow boundary molecules and doing work on them. This is why the concept of temperature is no longer necessary: if we know everything then we can just do mechanics. Unfortunately, we don't know everything about the full system, so this won't quite work.

Think about your uncertainty about the state of the water as you run time forward. It's initially zero, but the water is in contact with something that could be in any number of states (the beer), and so the entropy of the water is going to rise extremely quickly.

The water will initially be doing work on the beer, but after an extremely short time it will be flowing heat into it. One observer's work is another's heat, essentially.

Comment by spxtr on Entropy and Temperature · 2014-12-19T08:46:51.362Z · score: 0 (0 votes) · LW · GW

That doesn't follow.

Comment by spxtr on Entropy and Temperature · 2014-12-19T07:23:29.152Z · score: 2 (2 votes) · LW · GW

Say we have some electron in equal superposition of spin up and down, and we measure it.

In Copenhagen, the universe decides that it's up or down right then and there, with 50% probability. This isn't in the mind, since it's not a product of our ignorance. It can't be, because of Bell stuff.

In many-worlds, the universe does a deterministic thing and one branch measures spin up, one measures spin down. The probability is in my mind, because it's a product of my ignorance - I don't know what branch I'm in.

Comment by spxtr on Entropy and Temperature · 2014-12-19T07:09:37.621Z · score: 1 (1 votes) · LW · GW

We don't lose those things. Remember, this isn't my definition. This is the actual definition of temperature used by statistical physicists. Anything statistical physics predicts (all of the things you listed) is predicted by this definition.

You're right though. If you know the state of the molecules in the water then you don't need to think about temperature. That's a feature, not a bug.

Comment by spxtr on Entropy and Temperature · 2014-12-19T04:12:54.325Z · score: 2 (2 votes) · LW · GW

Energy is a feature of the territory because it's a function of position and momenta and other territory-things.

"Capacity to transfer heat" can mean a few things, and I'm not sure which you want. There's already heat capacity, which is how much actual energy is stored per degree of temperature. To find the total internal heat energy you just have to integrate this up to the current temperature. The usual example here is an iceberg which stores much more heat energy than a cup of coffee, and yet heat flows from the coffee to the iceberg. If you mean something more like "quantity that determines which direction heat will flow between two systems," then that's just the definition I presented :p

I actually have trouble defending "probability is in the mind" in some physics contexts without invoking many-worlds. If it turns out that many-worlds is wrong and Copenhagen, say, is right, then it will be useful to believe that for physical processes, probability is in the territory. I think. Not too sure about this.

Comment by spxtr on Entropy and Temperature · 2014-12-19T00:29:13.323Z · score: 3 (3 votes) · LW · GW

In the standard LW map/territory distinction, probability, entropy, and temperature are all features of the map. Positions, momenta, and thus energy are features of the territory.

I understand that this doesn't fit your metaphysics, but I think it should still be a useful concept. Probably.

Comment by spxtr on Entropy and Temperature · 2014-12-18T23:12:28.249Z · score: 0 (0 votes) · LW · GW

No reason. Fixed.

Comment by spxtr on Entropy and Temperature · 2014-12-18T19:46:20.904Z · score: 1 (1 votes) · LW · GW

Depends on your background in physics. Landau & Lifshitz Statistical Mechanics is probably the best, but you won't get much out of it if you haven't taken some physics courses.

Comment by spxtr on Entropy and Temperature · 2014-12-18T06:15:29.225Z · score: 0 (0 votes) · LW · GW

Good show!

Comment by spxtr on Entropy and Temperature · 2014-12-18T04:37:11.901Z · score: 2 (2 votes) · LW · GW

I posted some plots in the comment tree rooted by DanielFilan. I don't know what you used as the equation for entropy, but your final answer isn't right. You're right that temperature should be intensive, but the second equation you wrote for it is still extensive, because E is extensive :p

Comment by spxtr on Entropy and Temperature · 2014-12-18T04:28:13.349Z · score: 7 (7 votes) · LW · GW

Maxwell's demon, as criticized in your first link, isn't omniscient. It has to observe incoming particles, and the claim is that this process generates the entropy.

Comment by spxtr on Entropy and Temperature · 2014-12-18T03:49:52.020Z · score: 3 (3 votes) · LW · GW

I made a plot of the entropy and the (correct) energy. Every feature of these plots should make sense.

Note that the exponential turn-on in E(T) is a common feature to any gapped material. Semiconductors do this too :)

Comment by spxtr on Entropy and Temperature · 2014-12-17T20:36:00.016Z · score: 1 (1 votes) · LW · GW

S(E) is the log of the number of states in phase space that are consistent with energy E. Having energy E means that E/ε particles are excited, so we get (N choose E/ε) states. Now take the log :)
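In code, for a hypothetical N = 100 (my own sketch of the formula in the comment, in units where k_B = 1 and ε = 1):

```python
from math import comb, log

N, eps = 100, 1.0   # hypothetical numbers

def S(E):
    """Entropy (in units of k_B) with E/eps of the N particles excited."""
    return log(comb(N, round(E / eps)))

print(S(0))            # 0.0 -- a single state, nothing excited
print(S(N * eps))      # 0.0 -- a single state, everything excited
print(S(N * eps / 2))  # the maximum, close to N*log(2) ~ 69.3
```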

Comment by spxtr on Entropy and Temperature · 2014-12-17T19:16:31.124Z · score: 4 (4 votes) · LW · GW

Not quite, but close. It should be a + instead of a - in the denominator. Nice work, though.

You have the right formula for the entropy. Notice that it is nearly identical to the Bernoulli distribution entropy. That should make sense: there is only one state with energy 0 or Nε, so the entropy should go to 0 at those limits. Its maximum is at Nε/2. Past that point, adding energy to the system actually decreases entropy. This leads to a negative temperature!

But we can't actually reach that by raising its temperature. As we raise temperature to infinity, energy caps at Nε/2 (specific heat goes to 0). To put more energy in, we have to actually find some particles that are switched off and switch them on. We can't just put it in equilibrium with a hotter thing.
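A small sketch of the sign flip (my choice of units, k_B = 1 and ε = 1, with hypothetical N = 200): take T from 1/T = dS/dE with a finite difference, and it comes out negative once more than half the particles are excited:

```python
from math import comb, log

N, eps = 200, 1.0   # hypothetical two-level system, k_B = 1

def S(n):
    """Entropy with n of the N particles excited (energy E = n*eps)."""
    return log(comb(N, n))

def T(n):
    """Temperature from 1/T = dS/dE, via a centered finite difference."""
    dS_dE = (S(n + 1) - S(n - 1)) / (2 * eps)
    return 1.0 / dS_dE

print(T(20))    # positive: below half filling, entropy rises with energy
print(T(180))   # negative: past N/2, adding energy removes entropy
```

By the symmetry S(n) = S(N - n), the temperature at 180 excited particles is exactly minus the temperature at 20.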