Entropy and Temperature

post by spxtr · 2014-12-17T08:04:38.208Z · LW · GW · Legacy · 96 comments

Contents

  Entropy
  Temperature
  Second Law Trickery
  Homework

Eliezer Yudkowsky previously wrote (6 years ago!) about the second law of thermodynamics. Many commenters were skeptical about the statement, "if you know the positions and momenta of every particle in a glass of water, it is at absolute zero temperature," because they didn't know what temperature is. This is a common confusion.

Entropy

To specify the precise state of a classical system, you need to know its location in phase space. For a bunch of helium atoms whizzing around in a box, phase space is the position and momentum of each helium atom. For N atoms in the box, that means 6N numbers to completely specify the system.

Let's say you know the total energy of the gas, but nothing else. It will be the case that a fantastically huge number of points in phase space will be consistent with that energy.* In the absence of any more information, it is correct to assign a uniform distribution to this region of phase space. The entropy of a uniform distribution is the logarithm of the number of points, so that's that. If you also know the volume, then the number of points in phase space consistent with both the energy and volume is necessarily smaller, so the entropy is smaller.

This might be confusing to chemists, since they memorized a formula for the entropy of an ideal gas, and it's ostensibly objective. Someone with perfect knowledge of the system will calculate the same number on the right side of that equation, but to them, that number isn't the entropy. It's the entropy of the gas if you know nothing more than energy, volume, and number of particles.

Temperature

The existence of temperature follows from the zeroth and second laws of thermodynamics: thermal equilibrium is transitive, and entropy is maximum in equilibrium. Temperature is then defined as the thermodynamic quantity that is shared by systems in equilibrium.

If two systems are in equilibrium then they cannot increase entropy by flowing energy from one to the other. That means that if we flow a tiny bit of energy from one to the other (δU1 = -δU2), the entropy change in the first must be the opposite of the entropy change of the second (δS1 = -δS2), so that the total entropy (S1 + S2) doesn't change. For systems in equilibrium, this leads to (∂S1/∂U1) = (∂S2/∂U2). Define 1/T = (∂S/∂U), and we are done.
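
Written out as a small sketch in the same notation (with δU2 = -δU1 and the total entropy stationary at its maximum):

```latex
\delta S_{\mathrm{tot}}
  = \frac{\partial S_1}{\partial U_1}\,\delta U_1
  + \frac{\partial S_2}{\partial U_2}\,\delta U_2
  = \left(\frac{\partial S_1}{\partial U_1} - \frac{\partial S_2}{\partial U_2}\right)\delta U_1
  = 0
\quad\Longrightarrow\quad
\frac{\partial S_1}{\partial U_1} = \frac{\partial S_2}{\partial U_2} \equiv \frac{1}{T}.
```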

Temperature is sometimes taught as, "a measure of the average kinetic energy of the particles," because for an ideal gas U/N = (3/2) kBT. This is wrong as a definition, for the same reason that the ideal gas entropy isn't the definition of entropy.

Probability is in the mind. Entropy is a function of probabilities, so entropy is in the mind. Temperature is a derivative of entropy, so temperature is in the mind.

Second Law Trickery

With perfect knowledge of a system, it is possible to extract all of its energy as work. EY states it clearly:

So (again ignoring quantum effects for the moment), if you know the states of all the molecules in a glass of hot water, it is cold in a genuinely thermodynamic sense: you can take electricity out of it and leave behind an ice cube.

Someone who doesn't know the state of the water will observe a violation of the second law. This is allowed. Let that sink in for a minute. Jaynes calls it second law trickery, and I can't explain it better than he does, so I won't try:

A physical system always has more macroscopic degrees of freedom beyond what we control or observe, and by manipulating them a trickster can always make us see an apparent violation of the second law.

Therefore the correct statement of the second law is not that an entropy decrease is impossible in principle, or even improbable; rather that it cannot be achieved reproducibly by manipulating the macrovariables {X1, ..., Xn} that we have chosen to define our macrostate. Any attempt to write a stronger law than this will put one at the mercy of a trickster, who can produce a violation of it.

But recognizing this should increase rather than decrease our confidence in the future of the second law, because it means that if an experimenter ever sees an apparent violation, then instead of issuing a sensational announcement, it will be more prudent to search for that unobserved degree of freedom. That is, the connection of entropy with information works both ways; seeing an apparent decrease of entropy signifies ignorance of what were the relevant macrovariables.

Homework

I've actually given you enough information on statistical mechanics to calculate an interesting system. Say you have N particles, each fixed in place to a lattice. Each particle can be in one of two states, with energies 0 and ε. Calculate and plot the entropy if you know the total energy: S(E), and then the energy as a function of temperature: E(T). This is essentially a combinatorics problem, and you may assume that N is large, so use Stirling's approximation. What you will discover should make sense using the correct definitions of entropy and temperature.
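
If you want to check your answer numerically afterwards, here is a minimal sketch (not part of the homework; it assumes natural units with kB = 1, ε = 1, and an arbitrary large N):

```python
# Minimal numerical check of the two-level lattice homework.
# Assumptions: natural units (k_B = 1), epsilon = 1, and N large enough
# that Stirling's approximation ln(n!) ~= n ln n - n is reasonable.
import numpy as np

N = 1_000_000   # number of two-level particles (illustrative choice)
eps = 1.0       # excitation energy

def entropy(E):
    """S(E) = ln[N choose M], with M = E/eps, via Stirling's approximation."""
    M = E / eps
    return N * np.log(N) - M * np.log(M) - (N - M) * np.log(N - M)

def energy(T):
    """E(T) obtained by inverting 1/T = dS/dE for this system."""
    return N * eps / (np.exp(eps / T) + 1.0)

# S(E) peaks at E = N*eps/2; past that point dS/dE is negative (negative temperature).
E_vals = np.linspace(0.01, 0.99, 99) * N * eps
print("S is maximized near E/(N*eps) =", E_vals[np.argmax(entropy(E_vals))] / (N * eps))

# E(T) turns on exponentially at low T and saturates at N*eps/2 as T grows.
for T in (0.1, 1.0, 10.0, 1000.0):
    print(f"T = {T:7.1f}   E/(N*eps) = {energy(T) / (N * eps):.4f}")
```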


*: How many combinations of 10^23 numbers between 0 and 10 add up to 5×10^23?

96 comments


comment by Tyrrell_McAllister · 2014-12-17T18:51:14.463Z · LW(p) · GW(p)

This is a good article making a valuable point. But this —

Temperature is sometimes taught as, "a measure of the average kinetic energy of the particles," because for an ideal gas U/N = (3/2) kBT. This is wrong, for the same reason that the ideal gas entropy isn't the definition of entropy.

— is a confusing way to speak. There is such a thing as "the average kinetic energy of the particles", and one measure of this thing is called "temperature" in some contexts. There is nothing wrong with this as long as you are clear about what context you are in.

If you fall into the sun, your atoms will be strewn far and wide, and it won't be because of something "in the mind". There is a long and perfectly valid convention of calling the relevant feature of the sun its "temperature".

Replies from: B_For_Bandana, calef
comment by B_For_Bandana · 2014-12-17T20:33:07.831Z · LW(p) · GW(p)

An alternate phrasing (which I think makes it clearer) would be: "the distinction between mechanical and thermal energy is in the mind, and because we associate temperature with thermal but not mechanical energy, it follows that two observers of the same system can interpret it as having two different temperatures without inconsistency."

In other words, if you fall into the sun, your atoms will be strewn far and wide, yes, but your atoms will be equally strewn far and wide if you fall into an ice-cold mechanical woodchipper. The distinction between the types of energy used for the scattering process is what is subjective.

Replies from: Lumifer
comment by Lumifer · 2014-12-17T20:48:39.030Z · LW(p) · GW(p)

the distinction between mechanical and thermal energy is in the mind

The high-school definition of temperature as "a measure of the average kinetic energy of the particles" (see the grandparent comment) actually erases that distinction as it defines temperature through kinetic (mechanical) energy.

Replies from: B_For_Bandana, B_For_Bandana
comment by B_For_Bandana · 2014-12-17T21:28:06.137Z · LW(p) · GW(p)

I didn't read your comment carefully enough. Yes, we agree.

comment by B_For_Bandana · 2014-12-17T21:09:28.718Z · LW(p) · GW(p)

Right, but we don't think of a tennis ball falling in a vacuum as gaining thermal energy or rising in temperature. It is "only" gaining mechanical kinetic energy; a high school student would say that "this is not a thermal energy problem," even though the ball does have an average kinetic energy (kinetic energy, divided by 1 ball). But if temperature of something that we do think of as hot is just average kinetic energy, then there is a sense in which the entire universe is "not a thermal energy problem."

Replies from: Lumifer
comment by Lumifer · 2014-12-17T21:34:54.724Z · LW(p) · GW(p)

but we don't think of a tennis ball falling in a vacuum as gaining thermal energy or rising in temperature.

That's because temperature is a characteristic of a multi-particle system. One single particle has energy, a large set of many particles has temperature.

And still speaking of high-school physics, conversion between thermal and kinetic energy is trivially easy and happens all the time around us.

Replies from: jbay
comment by jbay · 2014-12-24T02:30:49.005Z · LW(p) · GW(p)

A tennis ball is a multi-particle system; however, all of the particles are accelerating more or less in unison while the ball free-falls. Nonetheless, it isn't usually considered to be increasing in temperature, because the entropy isn't increasing much as it falls.

comment by calef · 2014-12-17T23:51:38.806Z · LW(p) · GW(p)

I think more precisely, there is such a thing as "the average kinetic energy of the particles", and this agrees with the more general definition of temperature "1 / (derivative of entropy with respect to energy)" in very specific contexts.

That there is a more general definition of temperature which is always true is worth emphasizing.

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2014-12-18T17:35:22.919Z · LW(p) · GW(p)

Rather than 'in very specific contexts' I would say 'in any normal context'. Just because it's not universal doesn't mean it's not the overwhelmingly common case.

comment by shminux · 2014-12-18T00:34:52.080Z · LW(p) · GW(p)

if you know the states of all the molecules in a glass of hot water, it is cold in a genuinely thermodynamic sense: you can take electricity out of it and leave behind an ice cube.

I am not sure this is true as stated. An omniscient Maxwell demon that would only allow hot molecules out runs into a number of problems, and an experimentally constructed Maxwell's demon works by converting coherent light (low entropy) into incoherent (high entropy).

Replies from: spxtr
comment by spxtr · 2014-12-18T04:28:13.349Z · LW(p) · GW(p)

Maxwell's demon, as criticized in your first link, isn't omniscient. It has to observe incoming particles, and the claim is that this process generates the entropy.

comment by DanielFilan · 2014-12-17T10:46:22.367Z · LW(p) · GW(p)

[Spoiler alert: I can't find any 'spoiler' mode for comments, so I'm just going to give the answers here, after a break, so collapse the comment if you don't want to see that]

.

.

.

.

.

.

.

.

.

.

For the entropy (in natural units), I get

S(E) = N ln N - (E/ε) ln(E/ε) - (N - E/ε) ln(N - E/ε)

and for the energy, I get

E(T) = εN / (e^(ε/T) - 1)

Is this right? (upon reflection and upon consulting graphs, it seems right to me, but I don't trust my intuition for statistical mechanics)

Replies from: spxtr, spxtr, Falacer
comment by spxtr · 2014-12-17T19:16:31.124Z · LW(p) · GW(p)

Not quite, but close. It should be a + instead of a - in the denominator. Nice work, though.

You have the right formula for the entropy. Notice that it is nearly identical to the Bernoulli distribution entropy. That should make sense: there is only one state with energy 0 or Nε, so the entropy should go to 0 at those limits. Its maximum is at Nε/2. Past that point, adding energy to the system actually decreases entropy. This leads to a negative temperature!

But we can't actually reach that by raising its temperature. As we raise temperature to infinity, energy caps at Nε/2 (specific heat goes to 0). To put more energy in, we have to actually find some particles that are switched off and switch them on. We can't just put it in equilibrium with a hotter thing.
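
For reference, a quick sketch (not in the original comment) of how the corrected expression and its high-temperature limit come out, in natural units with kB = 1, starting from the entropy above:

```latex
\frac{1}{T} = \frac{\partial S}{\partial E}
            = \frac{1}{\varepsilon}\,\ln\!\left(\frac{N\varepsilon - E}{E}\right)
\quad\Longrightarrow\quad
E(T) = \frac{N\varepsilon}{e^{\varepsilon/T} + 1}
\;\xrightarrow{\;T\to\infty\;}\; \frac{N\varepsilon}{2}.
```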

comment by spxtr · 2014-12-18T03:49:52.020Z · LW(p) · GW(p)

I made a plot of the entropy and the (correct) energy. Every feature of these plots should make sense.

Note that the exponential turn-on in E(T) is a common feature to any gapped material. Semiconductors do this too :)

Replies from: Luke_A_Somers, DanielFilan
comment by Luke_A_Somers · 2014-12-18T17:39:56.320Z · LW(p) · GW(p)

Why did you only show the E(T) function for positive temperatures?

Replies from: calef, spxtr
comment by calef · 2014-12-18T20:31:04.267Z · LW(p) · GW(p)

This is a good point. The negative side gives good intuition for the "negative temperatures are hotter than any positive temperature" argument.

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2014-12-22T03:41:59.054Z · LW(p) · GW(p)

What gives a better intuition is thinking in inverse temperature.

Regular temperature is, 'how weakly is this thing trying to grab more energy so as to increase its entropy'.

Inverse temperature is 'how strongly...' and when that gets down to 0, it's natural to see it continue on into negatives, where it's trying to shed energy to increase its entropy.

comment by spxtr · 2014-12-18T23:12:28.249Z · LW(p) · GW(p)

No reason. Fixed.

comment by DanielFilan · 2014-12-18T11:05:17.233Z · LW(p) · GW(p)

The energy/entropy plot makes total sense, the energy/temperature doesn't really because I don't have a good feel for what temperature actually is, even after reading the "Temperature" section of your argument (it previously made sense because Mathematica was only showing me the linear-like part of the graph). Can you recommend a good text to improve my intuition? Bonus points if this recommendation arrives in the next 9.5 hours, because then I can get the book from my university library.

Replies from: spxtr
comment by spxtr · 2014-12-18T19:46:20.904Z · LW(p) · GW(p)

Depends on your background in physics. Landau & Lifshitz Statistical Mechanics is probably the best, but you won't get much out of it if you haven't taken some physics courses.

comment by Falacer · 2014-12-17T20:00:42.250Z · LW(p) · GW(p)

I gave this a shot as well, since your value for E(T) → ∞ as T → ∞, whereas I would think the system should cap out at εN.

I get a different value for S(E), reasoning:

If E/ε is 1, there are N microstates, since 1 of N positions is at energy ε. If E/ε is 2, there are N(N-1) microstates, etc., giving for E/ε = x that there are N!/(N-x)! microstates.

so S = ln [N!/(N-x)!] = ln(N!) - ln((N-x)!) = NlnN - (N-x)ln(N-x)

S(E) = N ln N - (N - E/ε) ln (N - E/ε)

Can you explain how you got your equation for the entropy?

Going on I get E(T) = ε(N - e^(ε/T - 1) )

This also looks wrong, as although E → ∞ as T → ∞, it also doesn't cap at exactly εN, and E → -∞ for T→ 0...

I'm expecting the answer to look something like: E(T) = εN(1 - e^(-ε/T))/2 which ranges from 0 to εN/2, which seems sensible.

EDIT: Nevermind, the answer was posted while I was writing this. I'd still like to know how you got your S(E) though.

Replies from: spxtr
comment by spxtr · 2014-12-17T20:36:00.016Z · LW(p) · GW(p)

S(E) is the log of the number of states in phase space that are consistent with energy E. Having energy E means that E/ε particles are excited, so we get (N choose E/ε) states. Now take the log :)
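
A quick sketch of that log using Stirling's approximation (ln n! ≈ n ln n − n), writing M for E/ε; it reproduces the entropy formula DanielFilan posted above:

```latex
S(E) = \ln\binom{N}{M}
     \approx N\ln N - M\ln M - (N-M)\ln(N-M),
\qquad M \equiv \frac{E}{\varepsilon}.
```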

comment by jacob_cannell · 2014-12-22T05:22:58.204Z · LW(p) · GW(p)

This is related to the physics of computation: ultimate physical computers coincide with temperatures approaching 0 K (reversible computing, the Landauer principle). Heat/entropy is computational stupidity.

Incidentally, this also explains the Fermi paradox: post-singularity civilizations migrate out away from their hot stars into the cold interstellar spaces, becoming dark matter (which, however, does not imply that all cold dark matter is intelligent).

comment by Richard Korzekwa (Grothor) · 2014-12-19T00:16:50.509Z · LW(p) · GW(p)

Temperature is then defined as the thermodynamic quantity that is shared by systems in equilibrium.

I think I've figured out what's bothering me about this. If we think of temperature in terms of our uncertainty about where the system is in phase space, rather than how large a region of phase space fits the macroscopic state, then we gain a little in using the second law, but give up a lot everywhere else. Unless I am mistaken, we lose the following:

  • Heat flows from hot to cold
  • Momentum distribution can be predicted from temperature
  • Phase changes can be predicted from temperature
  • The reading on a thermometer can be predicted from temperature

I'm sure there are others. I realize that if we know the full microscopic state of a system, then we don't need to use temperature for these things, but then we wouldn't need to use temperature at all.

if you know the states of all the molecules in a glass of hot water, it is cold in a genuinely thermodynamic sense: you can take electricity out of it and leave behind an ice cube.

If you're able to do this, I don't see why you'd be using temperature at all, unless you want to talk about how hot the water is to begin with (as you did), in which case you're referring to the temperature that the water would be if we had no microscopic information.

Replies from: spxtr
comment by spxtr · 2014-12-19T07:09:37.621Z · LW(p) · GW(p)

We don't lose those things. Remember, this isn't my definition. This is the actual definition of temperature used by statistical physicists. Anything statistical physics predicts (all of the things you listed) is predicted by this definition.

You're right though. If you know the state of the molecules in the water then you don't need to think about temperature. That's a feature, not a bug.

Replies from: Grothor
comment by Richard Korzekwa (Grothor) · 2014-12-19T19:29:57.213Z · LW(p) · GW(p)

We don't lose those things.

Suppose that you boil some water in a pot. You take the pot off the stove, and then take a can of beer out of the cooler (which is filled with ice) and put it in the water. The place where you're confusing your friends by putting cans of beer in pots of hot water is by the ocean, so when you read the thermometer that's in the water, it reads 373 K. The can of beer, which was in equilibrium with the ice at a measured 273 K, had some bits of ice stuck to it when you put it in. They melt. Next, you pull out your fancy laser-Doppler-shift-based water molecule momentum spread measurer. The result jibes with 373 K liquid water. After a short time, you read the thermometer as 360 K (the control pot with no beer reads 371 K). There is no ice left in the pot. You take out the beer, open it, and measure its temperature to be 293 K and its momentum width to be smaller than that of the boiling water.

What we observed was:

  • Heat flowed from 373 K water to 273 K beer
  • The momentum distribution is wider for water at 373 K than at 293 K
  • Ice placed in 373 K water melts
  • Our thermometer reads 373 K for boiling water and 273 K for water-ice equilibrium

Now, suppose we do exactly the same thing, but just after putting the beer in the water, Omega tells us the state of every water molecule in the pot, but not the beer. Now we know the temperature of the water is exactly 0 K. We still anticipate the same outcome (perhaps more precisely), and observe the same outcome for all of our measurements, but we describe it differently:

  • Heat flowed from 0 K water to 273 K beer
  • The momentum distribution is wider for water at 0 K (or recently at 0 K) than at 293 K
  • Ice placed in 0 K water melts
  • Our thermometer reads 373 K for water boiling at 0 K, and 273 K for water-ice equilibrium

So the only difference is in the map, not the territory, and it seems to be only in how we're labeling the map, since we anticipate the same outcome using the same model (assuming you didn't use the specific molecular states in your prediction).

Remember, this isn't my definition. This is the actual definition of temperature used by statistical physicists.

I agree that temperature should be defined so that 1/T = dS/dE . This is the definition that, as far as I can tell, all physicists use. But nearly every result that uses temperature is derived using the assumption that all microstates are equally probable (your second law example being the only exception that I am aware of). In fact, this is often given as a fundamental assumption of statistical mechanics, and I think this is what makes the "glass of water at absolute zero" comment confusing. (Moreover, many physicists, such as plasma physicists, will often say that the temperature is not well-defined unless certain statistical conditions are met, like the energy and momentum distributions having the correct form, or the system being locally in thermal equilibrium with itself.)

I'm having trouble with brevity here, but what I'm getting at is that if you want to show that we can drop the fundamental postulate of statistical mechanics, and still recover the second law of thermodynamics, then I'm happy to call it a feature rather than a bug. But it seems like bringing in temperature confuses the issue rather than clarifying it.

Replies from: spxtr, spxtr
comment by spxtr · 2014-12-19T22:35:49.428Z · LW(p) · GW(p)

Omega tells us the state of the water at time T=0, when we put the beer into it. There are two ways of looking at what happens immediately after.

The first way is that the water doesn't flow heat into the beer, rather it does some work on it. If we know the state of the beer/water interface as well then we can calculate exactly what will happen. It will look like quick water molecules thumping into slow boundary molecules and doing work on them. This is why the concept of temperature is no longer necessary: if we know everything then we can just do mechanics. Unfortunately, we don't know everything about the full system, so this won't quite work.

Think about your uncertainty about the state of the water as you run time forward. It's initially zero, but the water is in contact with something that could be in any number of states (the beer), and so the entropy of the water is going to rise extremely quickly.

The water will initially be doing work on the beer, but after an extremely short time it will be flowing heat into it. One observer's work is another's heat, essentially.

Replies from: Grothor
comment by Richard Korzekwa (Grothor) · 2014-12-20T22:52:13.951Z · LW(p) · GW(p)

The first way is that the water doesn't flow heat into the beer, rather it does some work on it.

This actually clears things up quite a lot. I think my discomfort with this description is mainly aesthetic. Thank you for being patient.

comment by spxtr · 2014-12-19T22:37:04.460Z · LW(p) · GW(p)

The rule that all microstates that are consistent with a given macrostate are equally probable is a consequence of the maximum entropy principle. See this Jaynes paper.

comment by shminux · 2014-12-18T22:39:01.286Z · LW(p) · GW(p)

Probability is in the mind.

I am not sure what this means. In what sense is probability in the mind, but energy isn't? Or if energy is in the mind, as well, what physical characteristic is not and why?

Replies from: spxtr, TheAncientGeek
comment by spxtr · 2014-12-19T00:29:13.323Z · LW(p) · GW(p)

In the standard LW map/territory distinction, probability, entropy, and temperature are all features of the map. Positions, momenta, and thus energy are features of the territory.

I understand that this doesn't fit your metaphysics, but I think it should still be a useful concept. Probably.

Replies from: shminux
comment by shminux · 2014-12-19T01:01:03.244Z · LW(p) · GW(p)

Sorry, I wasn't clear. I didn't use "my metaphysics" here, just the standard physical realism, with maps and territories. Suppose energy is the feature of the territory... because it's the "capacity to do work", using the freshman definition. Why would you not define temperature as the capacity to transfer heat, or something? And probability is already defined as the rate of decay in many cases... or is that one in the mind, too?

Replies from: spxtr
comment by spxtr · 2014-12-19T04:12:54.325Z · LW(p) · GW(p)

Energy is a feature of the territory because it's a function of position and momenta and other territory-things.

"Capacity to transfer heat" can mean a few things, and I'm not sure which you want. There's already heat capacity, which is how much actual energy is stored per degree of temperature. To find the total internal heat energy you just have to integrate this up to the current temperature. The usual example here is an iceberg which stores much more heat energy than a cup of coffee, and yet heat flows from the coffee to the iceberg. If you mean something more like "quantity that determines which direction heat will flow between two systems," then that's just the definition I presented :p

I actually have trouble defending "probability is in the mind" in some physics contexts without invoking many-worlds. If it turns out that many-worlds is wrong and Copenhagen, say, is right, then it will be useful to believe that for physical processes, probability is in the territory. I think. Not too sure about this.

Replies from: shminux, TheAncientGeek
comment by shminux · 2014-12-19T06:06:01.907Z · LW(p) · GW(p)

"quantity that determines which direction heat will flow between two systems,"

Yeah, that's a better definition :)

I actually have trouble defending "probability is in the mind" in some physics contexts without invoking many-worlds.

Feel free to elaborate. I'd think that probability is either in the map or in the territory, regardless of the context or your QM ontology, not sometimes here and sometimes there. And if it is in the territory, then so is entropy and temperature, right?

Replies from: spxtr, spxtr, TheAncientGeek
comment by spxtr · 2014-12-19T07:23:29.152Z · LW(p) · GW(p)

Say we have some electron in equal superposition of spin up and down, and we measure it.

In Copenhagen, the universe decides that it's up or down right then and there, with 50% probability. This isn't in the mind, since it's not a product of our ignorance. It can't be, because of Bell stuff.

In many-worlds, the universe does a deterministic thing and one branch measures spin up, one measures spin down. The probability is in my mind, because it's a product of my ignorance - I don't know what branch I'm in.

Replies from: shminux
comment by shminux · 2014-12-19T07:34:52.043Z · LW(p) · GW(p)

Hmm, so, assuming there is no experimental distinction between the two interpretations, there is no way to tell the difference between map and territory, not even in principle? That's disconcerting. I guess I see what you mean by " trouble defending "probability is in the mind"".

Replies from: spxtr, spxtr
comment by spxtr · 2014-12-23T07:42:57.709Z · LW(p) · GW(p)

If we inject some air into a box then close our eyes for a few seconds and shove in a partition, there is a finite chance of finding the nitrogen on one side and the oxygen on the other. Entropy can decrease, and that's allowed by the laws of physics. The internal energy had better not change, though. That's disallowed.

If energy changes, our underlying physical laws need to be reexamined. In the official dogma, these are firmly in the territory. If entropy goes down, our map was likely wrong.

comment by spxtr · 2014-12-19T08:46:51.362Z · LW(p) · GW(p)

That doesn't follow.

comment by spxtr · 2014-12-23T07:45:45.737Z · LW(p) · GW(p)

I'd think that probability is either in the map or in the territory

Even if Copenhagen is right, I as a rational agent still ought to use mind-probabilities. It may be the case that the quantum world is truly probabilistic-in-the-territory, but that doesn't affect the fact that I don't know the state of any physical system precisely.

comment by TheAncientGeek · 2015-01-19T19:56:45.791Z · LW(p) · GW(p)

Can't there be forms of probability in the territory and the map?

comment by TheAncientGeek · 2015-01-19T19:55:02.339Z · LW(p) · GW(p)

You're most of the way towards why you shouldn't believe the Jaynes-Yudkowsky argument.

If you really can infer the absence of probability in the territory by reflecting on human reasoning alone, then the truth of CI versus MWI shouldn't matter. If it matters, as you seem to think, then armchair reasoning can't do what Jaynes and Yudkowsky think it can (in this case).

comment by TheAncientGeek · 2015-01-19T19:31:44.864Z · LW(p) · GW(p)

It's a reference to a bad, but locally popular, argument from Jaynes. It holds that, since some forms of probability are subjective, they all are, and... ta-daa... the territory is therefore deterministic.

Replies from: dxu
comment by dxu · 2015-01-25T23:32:48.417Z · LW(p) · GW(p)

Name a probability that is not subjective. (And before you bring up quantum-mechanical collapse, I'd just like to say one thing: MWI. And before you complain about unfalsifiability, let me link you here.)

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-01-26T16:36:31.314Z · LW(p) · GW(p)

I don't need definite proof of in-the-territory probability to support my actual point, which is that you can't determine the existence or non existence of features of the territory by armchair reflection.

Replies from: dxu
comment by dxu · 2015-01-26T20:15:08.862Z · LW(p) · GW(p)

Of course you can't determine whether something exists or not. There might yet be other probabilities out there that actually are objective. The fact that we have not discovered any such thing, however, is telling. Absence of evidence is evidence of absence. Therefore, it is likely--not certain, but likely--that no such probabilities exist. If your claim is that we cannot be certain of this, then of course you are correct. Such a claim, however, is trivial.

comment by torekp · 2014-12-18T01:59:20.195Z · LW(p) · GW(p)

Thanks for this. I am definitely going to use the Gibbs paradox (page 3 of the Jaynes paper) to nerd-snipe my physics-literate friends.

comment by Richard Korzekwa (Grothor) · 2014-12-18T03:54:51.199Z · LW(p) · GW(p)

I'll follow suit with the previous spoiler warning.

SPOILER ALERT .

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

I took a bit different approach from the others that have solved this, or maybe you'd just say I quit early once I thought I'd shown the thing I thought you were trying to show:

If we write entropy in terms of the number of particles, N and the fraction of them that are excited: α ≡ E/(Nε) , and take the derivative with respect to α, we get:

dS/dα = N log [(1-α)/α]

Or if that N is bothering you (since temperature is usually an intensive property), we can just write:

T = 1/(dS/dE) = E / log[(1-α)/α]

This will give us zero temperature for all excited or no excited particles (which makes sense, because you know exactly where you are in phase space), and it blows up when half the particles are excited. This means that there is no reservoir hot enough to get from α < 0.5 to α = 0.5.

Replies from: spxtr
comment by spxtr · 2014-12-18T04:37:11.901Z · LW(p) · GW(p)

I posted some plots in the comment tree rooted by DanielFilan. I don't know what you used as the equation for entropy, but your final answer isn't right. You're right that temperature should be intensive, but the second equation you wrote for it is still extensive, because E is extensive :p

Replies from: Grothor
comment by Richard Korzekwa (Grothor) · 2014-12-18T05:38:47.751Z · LW(p) · GW(p)

your final answer isn't right

You're right. That should be ε, not E. I did the extra few steps to substitute α = E/(Nε) back in, and solve for E, to recover DanielFilan's (corrected) result:

E = Nε / (exp(ε/T) + 1)

I used S = log[N choose M], where M is the number of excited particles (so M = αN). Then I used Stirling's approximation as you suggested, and differentiated with respect to α.

Replies from: spxtr
comment by spxtr · 2014-12-18T06:15:29.225Z · LW(p) · GW(p)

Good show!

comment by Lumifer · 2014-12-17T19:21:12.295Z · LW(p) · GW(p)

so temperature is in the mind

I am not quite sure in which way this statement is useful.

"..and for an encore goes on to prove that black is white and gets himself killed on the next zebra crossing." -- Douglas Adams

Replies from: DanielFilan, DanielLC, nshepperd
comment by DanielFilan · 2014-12-18T00:50:15.019Z · LW(p) · GW(p)

I had that thought as well, but the 'Second Law Trickery' section convinced me that it was a useful statement.

Replies from: Lumifer
comment by Lumifer · 2014-12-18T02:41:01.264Z · LW(p) · GW(p)

I'll grant that it is an interesting statement, but at the moment my impression is that it's just redefining the word "temperature" in a particular way.

comment by DanielLC · 2014-12-17T20:02:39.407Z · LW(p) · GW(p)

I don't know of any way that statement in particular is useful, but understanding the model that produces it can be helpful. For example, it's possible to calculate the minimum amount of energy necessary to run a certain computation on a computer at a certain temperature. It's further useful in that it shows that if the computation is reversible, there is no minimum energy.
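
For instance, using the Landauer bound of kB T ln 2 per erased bit (an illustrative figure, not part of the original comment), at room temperature, T ≈ 300 K:

```latex
E_{\min} = k_B T \ln 2
         \approx (1.38\times 10^{-23}\,\mathrm{J/K})\,(300\,\mathrm{K})\,(0.693)
         \approx 2.9\times 10^{-21}\,\mathrm{J\ per\ erased\ bit}.
```

The bound vanishes only when no bits are erased, which is the reversible case mentioned above.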

Replies from: Lumifer
comment by Lumifer · 2014-12-17T20:12:46.509Z · LW(p) · GW(p)

understanding the model that produces it

The model is fine, what I'm having problems with is the whole "in the mind" business which goes straight to philosophy and seems completely unnecessary for the discussion of properties of classic systems in physics.

Replies from: DanielLC, buybuydandavis
comment by DanielLC · 2014-12-18T00:20:29.275Z · LW(p) · GW(p)

Entropy is statistical laws. Thus, like statistics, it's in the mind. It's also no more philosophical than statistics is, and not psychological at all.

Replies from: Lumifer
comment by Lumifer · 2014-12-18T02:39:57.720Z · LW(p) · GW(p)

Entropy is statistical laws. Thus, like statistics, it's in the mind.

I have a feeling you're confusing the map and the territory. Just because statistics (defined as a toolbox of methods for dealing with uncertainty) exists in the mind, there is no implication that uncertainty exists only in the mind as well. Half-life of a radioactive element is a statistical "thing" that exists in real life, not in the mind.

In the same way, phase changes of a material exist in the territory. You can usefully define temperature as a particular metric such that water turns into gas at 100 and turns into ice at zero. Granted, this approach has its limits but it does not seem to depend on being "in the mind".

Replies from: DanielLC
comment by DanielLC · 2014-12-18T04:23:53.836Z · LW(p) · GW(p)

The half-life of a radioactive element is something that can be found without using probability. It is the time it takes for the measure of the universes in which the atom is still whole to be exactly half of the initial measure. Similarly, phase change can be defined without using probability.

The universe may be indeterministic (though I don't think it is), but all this means is that the past is not sufficient to conclude the future. A mind that already knows the future (perhaps because it exists further in the future) would still know the future.

Replies from: Lumifer
comment by Lumifer · 2014-12-18T05:54:02.548Z · LW(p) · GW(p)

the time it takes for the measure of the universes

So, does your probability-less half-life require MWI? That's not a good start. What happens if you are unwilling to just assume MWI?

A mind that already knows the future

Why do you think such a thing is possible?

Replies from: Kindly, DanielLC
comment by Kindly · 2014-12-18T20:18:44.660Z · LW(p) · GW(p)

Even without references to MWI, I'm pretty sure you can just say the following: if at time t=0 you have an atom of carbon-14, at a later time t>0 you will have a superposition of carbon-14 and nitrogen-14 (with some extra stuff). The half-life is the value of t for which the two coefficients will be equal in absolute value.

comment by DanielLC · 2014-12-18T06:44:16.702Z · LW(p) · GW(p)

Uncertainty in the mind and uncertainty in the territory are related, but they're not the same thing, and calling them both "uncertainty" is misleading. If indeterminism is true, there is an upper limit to how certain someone can reliably be about the future, but someone further in the future can know it with perfect certainty and reliability.

If I ask if the billionth digit of pi is even or odd, most people would give even odds to those two things. But it's something that you'd give even odds to on a bet, even in a deterministic universe.

If I flip a coin and it lands on heads, you'd be a fool to bet otherwise. It doesn't matter if the universe is nondeterministic and you can prove that, given all the knowledge of the universe before the coin was flipped, it would be exactly equally likely to land on heads or tails. You know it landed on heads. It's 100% certain.

Replies from: Lumifer
comment by Lumifer · 2014-12-18T17:05:20.776Z · LW(p) · GW(p)

Yes, future is uncertain but past is already fixed and certain. So? We are not talking about probabilities of something happening in the past. The topic of the discussion is how temperature (and/or probabilities) are "in the mind" and what does that mean.

Replies from: DanielLC
comment by DanielLC · 2014-12-18T19:01:51.283Z · LW(p) · GW(p)

The past is certain but the future is not. But the only difference between the two is when you are in relation to them. It's not as if certain time periods are inherently past or future.

An example of temperature being in the mind that's theoretically possible to set up but you'd never manage in practice is Maxwell's demon. If you already know where all of the particles of gas are and how they're bouncing, you could make it so all the fast ones end up in one chamber and all the slow ones end up in the other. Or you can just get all of the molecules into the same chamber. You can do this with an arbitrarily small amount of energy.

comment by buybuydandavis · 2014-12-17T21:31:27.742Z · LW(p) · GW(p)

I think his "in the mind" is correct in his context, because in the model of entropy he is discussing, temperature_entropy is dependent on entropy, which is dependent on your knowledge of the states of the system.

I'll repeat what I said earlier in the context of the discussion of different theories of time.

Me, I think the people who identify exists_everydaymode with exists_spacetimemodel are just conceptually confused by their highfalutin ideas. Exists_everydaymode didn't cease to exist when we got our fancy new spacetime model to play with, and its relevance and functionality didn't cease to exist either. "I have cancer" is really distinguishable in important ways to us from "I had cancer."

New physics didn't make old ideas useless. Temperature_kineticenergy is probably more relevant in most situations.

because they don't know what temperature is

The OP makes his mistake by identifying temperature_entropy with temperature_kineticenergy.

Replies from: calef, Lumifer, DanielLC
comment by calef · 2014-12-17T23:46:26.734Z · LW(p) · GW(p)

I don't see the issue in saying [you don't know what temperature really is] to someone working with the definition [T = average kinetic energy]. One definition of temperature is always true. The other is only true for idealized objects.

Replies from: buybuydandavis, DanielLC
comment by buybuydandavis · 2014-12-19T02:29:46.455Z · LW(p) · GW(p)

Nobody knows what anything really is. We have more or less accurate models.

comment by DanielLC · 2014-12-18T00:32:33.697Z · LW(p) · GW(p)

What do you mean by "true"? They both can be expressed for any object. They are both equal for idealized objects.

Replies from: calef
comment by calef · 2014-12-18T03:18:03.471Z · LW(p) · GW(p)

Only one of them actually corresponds with temperature for all objects. They are both equal for one subclass of idealized objects, in which case the "average kinetic energy" definition follows from the entropic definition, not the other way around. All I'm saying is that it's worth emphasizing that one definition is strictly more general than the other.

Replies from: DanielLC
comment by DanielLC · 2014-12-18T04:18:47.517Z · LW(p) · GW(p)

Average kinetic energy always corresponds to average kinetic energy, and the amount of energy it takes to create a marginal amount of entropy always corresponds to the amount of energy it takes to create a marginal amount of entropy. Each definition corresponds perfectly to itself all of the time, and applies to the other in the case of idealized objects. How is one more general?

Replies from: nshepperd, calef
comment by nshepperd · 2014-12-18T06:15:54.183Z · LW(p) · GW(p)

Two systems with the same "average kinetic energy" are not necessarily in equilibrium. Sometimes energy flows from a system with lower average kinetic energy to a system with higher average kinetic energy (e.g. real gases with different degrees of freedom). Additionally, "average kinetic energy" is not applicable at all to some systems, e.g. an Ising magnet.

comment by calef · 2014-12-18T05:06:13.935Z · LW(p) · GW(p)

I just mean as definitions of temperature. There's temperature(from kinetic energy) and temperature(from entropy). Temperature(from entropy) is a fundamental definition of temperature. Temperature(from kinetic energy) only tells you the actual temperature in certain circumstances.

Replies from: DanielLC
comment by DanielLC · 2014-12-18T05:50:25.471Z · LW(p) · GW(p)

Why is one definition more fundamental than another? Why is only one definition "actual"?

Replies from: calef
comment by calef · 2014-12-18T08:17:19.702Z · LW(p) · GW(p)

Because one is true in all circumstances and the other isn't? What are you actually objecting to? That physical theories can be more fundamental than each other?

Replies from: DanielLC
comment by DanielLC · 2014-12-18T19:08:48.264Z · LW(p) · GW(p)

I admit that some definitions can be better than others. A whale lives underwater, but that's about the only thing it has in common with a fish, and it has everything else in common with a mammal. You could still make a word to mean "animal that lives underwater". There are cases where where it lives is so important that that alone is sufficient to make a word for it. If you met someone who used the word "fish" to mean "animal that lives underwater", and used it in contexts where it was clear what it meant (like among other people who also used it that way), you might be able to convince them to change their definition, but you'd need a better argument than "my definition is always true, whereas yours is only true in the special case that the fish is not a mammal".

Replies from: calef
comment by calef · 2014-12-18T20:28:16.915Z · LW(p) · GW(p)

The distinction here goes deeper than calling a whale a fish (I do agree with the content of the linked essay).

If a layperson asks me what temperature is, I'll say something like, "It has to do with how energetic something is" or even "something's tendency to burn you". But I would never say "It's the average kinetic energy of the translational degrees of freedom of the system" because they don't know what most of those words mean. That latter definition is almost always used in the context of, essentially, undergraduate problem sets as a convenient fiction for approximating the real temperature of monatomic ideal gases--which, again, is usually a stepping stone to the thermodynamic definition of temperature as a partial derivative of entropy.

Alternatively, we could just have temperature(lay person) and temperature(precise). I will always insist on temperature(precise) being the entropic definition. And I have no problem with people choosing whatever definition they want for temperature(lay person) if it helps someone's intuition along.

comment by Lumifer · 2014-12-18T02:33:15.615Z · LW(p) · GW(p)

So, effectively there are two different things which go by the same name? Temperature_entropy is one measure (coming from the information-theoretic side) and temperature_kineticenergy is another measure (coming from, um, pre-Hamiltonian mechanics?)..?

That makes some sense, but then I have a question. If you take an ice cube out of the freezer and put it on a kitchen counter, will it melt if there is no one to watch it? In other words, how does the "temperature is in the mind" approach deal with phase transitions?

Replies from: buybuydandavis
comment by buybuydandavis · 2014-12-19T02:28:42.914Z · LW(p) · GW(p)

So, effectively there are two different things which go by the same name?

They look like two different concepts to me.

In other words, how does the "temperature is in the mind" approach deal with phase transitions?

I don't know. I suppose that would depend on how much that mind knows about phase transitions.

comment by DanielLC · 2014-12-18T00:30:27.729Z · LW(p) · GW(p)

Temperature_kineticenergy is probably more relevant in most situations.

That's difficult to say. If you build a heat pump, you deal with entropy. If you radiate waste heat, you deal with kinetic energy. If you want to know how much waste heat you're going to have, you deal with entropy. If you significantly change the temperature of something with a heat pump, then you have to deal with both for a large variety of temperatures.

Calling them Temperature_kineticenergy and Temperature_entropy is somewhat misleading, since both involve kinetic energy. Temperature_kineticenergy is average kinetic energy, and Temperature_entropy is the change in kinetic energy necessary to cause a marginal increase in entropy.

Also, if you escape your underscores with backslashes, you won't get the italics.

comment by nshepperd · 2014-12-18T06:23:09.460Z · LW(p) · GW(p)

I am not quite sure in which way this statement is useful.

Is that because you didn't read the rest of the post?

"Temperature is in the mind" doesn't mean that you can make a cup of water boil just by wishing hard enough. It means that whether or not you should expect a cup of water to boil depends on what you know about it.

(It also doesn't mean that whether an ice cube melts depends on whether anyone's watching. The ice cube does whatever the ice cube does in accordance with its initial conditions and the laws of mechanics.)

Replies from: Lumifer
comment by Lumifer · 2014-12-18T16:56:13.401Z · LW(p) · GW(p)

So now that you've told me what it does NOT mean, perhaps you can clarify what it DOES mean? I still don't understand.

In particular, the phrase "in the mind" implies that temperature requires a mind and would not exist if there were no minds around. Given that we are talking about classical systems, this seems an unusual position to take.

Another implication of "in the mind" is that different minds would see temperature differently. In fact, if you look into the original EY post, it explicitly says

Is the water colder, because we know more about it? Ignoring quantumness for the moment, the answer is: Yes! Yes it is!

And that makes me curious about phase changes. Can I freeze water into ice by knowing more about it? Note: not by doing things like separating molecules by energy and ending up with ice and electricity, but purely by knowing?

comment by Richard_Kennaway · 2015-01-19T10:22:29.554Z · LW(p) · GW(p)

Probability is in the mind. Entropy is a function of probabilities, so entropy is in the mind. Temperature is a derivative of entropy, so temperature is in the mind.

If I plunge my hand into boiling water, I will get scalded. Will I still get scalded if I know the position and momentum of every particle involved? If so, what causes it? If not, where does this stop -- is everything in the mind?

ETA: I should have reread the discussion first, because there has been a substantial amount about this very question. However, I'm not sure it has come to a conclusion that resolves the question. Also, no-one has taken on Shalizi's conundrum that someone cited, that Bayesian reasoners should see entropy decrease with time.

ETA2: One response would be that the detailed knowledge allows one to predict the same injury, by predicting the detailed properties of all of the particles through time. But I find this unsatisfying, because the easiest way to get that prediction is to start by throwing away almost all of the information you started with. Find the temperature you would attribute to this microstate if you didn't know the microstate, and make the prediction you would have made knowing only the temperature. This will almost invariably give the right answer: it will be right as often as you would actually get scalded from boiling water. If the simplest way to make some objective prediction is in terms of a supposedly subjective quantity that is not being experienced by any actual subject, just how subjective is that quantity?

ETA 3: Consider the configuration space of the whole system, a manifold in some gigantic number of dimensions, somewhat more than Avogadro's number. I am guessing that with respect to any sensible measure on that manifold, almost all of it is in states whose temporal evolution almost exactly satisfies equipartition. Equipartition gives you an objective definition of temperature.

Equipartition is what you will see virtually all of the time, when you do not know the microstate. But it is also what you will see when you do know the microstate, for virtually every microstate. The only way to come upon a non-equipartitioned pan of hot water is by specifically preparing it in such a state.

But what is a sensible measure, if we do not deliberately define it in such a way as to make the above true? You could invent a measure that gave most of its mass to the non-equipartitioned cases. But to do that you would have to already know that you wanted to do that, in order to devote the mass to them. I think there's a connection here to the matter of (alleged) "Bayesian brittleness", and to the game of "follow the improbability". But I do not quite see a resolution of this point yet. Entsophy/Joseph Wilson seemed to be working towards this on his blog last year, until he had some sort of meltdown and vanished from the net. I had intended to ask him how he would define measures on continuous manifolds (he had up to then only considered the discrete case and promised he would get to continuous ones), but I never did.

Replies from: dxu
comment by dxu · 2015-01-21T06:33:41.142Z · LW(p) · GW(p)

If I plunge my hand into boiling water, I will get scalded. Will I still get scalded if I know the position and momentum of every particle involved? If so, what causes it? If not, where does this stop -- is everything in the mind?

Assuming that you plunge your hand into the water at a random point in time, yes you will get scalded with probability ~1. This means that the water is "hot" in the same sense that the lottery is "fair" even if you know what the winning numbers will be--if you don't use that winning knowledge and instead just pick a series of random numbers, as you would if you didn't know the winning numbers, then of course you will still lose. I suppose if you are willing to call such a lottery "fair", then by that same criterion, the water is hot. However, if you use this criterion, I suspect a large number of people would disagree with you on what exactly it means for a lottery to be "fair". If, on the other hand, you would call a lottery in which you know the winning numbers "unfair", you should be equally willing to call water about which you know everything "cold".

Replies from: Lumifer, Richard_Kennaway
comment by Lumifer · 2015-01-21T15:55:31.867Z · LW(p) · GW(p)

This means that the water is "hot" in the same sense that the lottery is "fair"

Well, if I know the winning numbers but Alice doesn't, the lottery is "fair" for Alice. If I know everything about that cup of water, but Alice doesn't, is the water at zero Kelvin for me but still hot for Alice?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2015-01-21T16:09:13.411Z · LW(p) · GW(p)

If I know everything about that cup of water, but Alice doesn't, is the water at zero Kelvin for me but still hot for Alice?

And will we both predict the same result when someone puts their hand in it?

Replies from: Lumifer
comment by Lumifer · 2015-01-21T16:37:59.287Z · LW(p) · GW(p)

Probably yes, but then I will have to say things like "Be careful about dipping your finger into that zero-Kelvin block of ice, it will scald you" X-)

Replies from: spxtr
comment by spxtr · 2015-01-22T04:07:26.470Z · LW(p) · GW(p)

It won't be ice. Ice has a regular crystal structure, and if you know the microstate you know that the water molecules aren't in that structure.

Replies from: Lumifer
comment by Lumifer · 2015-01-22T16:03:39.736Z · LW(p) · GW(p)

So then temperature has nothing to do with phase changes?

Replies from: gjm
comment by gjm · 2015-01-22T16:43:37.394Z · LW(p) · GW(p)

Temperature in the thermodynamic sense (which is the same as the information-theoretic sense if you have only ordinary macroscopic information) is the same as average energy per molecule, which has a lot to do with phase changes for the obvious reason.

In exotic cases where the information-theoretic and thermodynamic temperatures diverge, thermodynamic temperature still tells you about phase changes but information-theoretic temperature doesn't. (The thermodynamic temperature is still useful in these cases; I hope no one is claiming otherwise.)

Replies from: spxtr, Lumifer
comment by spxtr · 2015-01-22T20:00:24.680Z · LW(p) · GW(p)

You probably know this, but average energy per molecule is not temperature at low temperatures. Quantum kicks in and that definition fails. dS/dE never lets you down.

Replies from: gjm
comment by gjm · 2015-01-22T22:05:33.360Z · LW(p) · GW(p)

Whoops! Thanks for the correction.

comment by Lumifer · 2015-01-22T17:05:03.454Z · LW(p) · GW(p)

Aha, thanks. Is information-theoretic temperature observer-specific?

Replies from: gjm
comment by gjm · 2015-01-22T17:18:38.763Z · LW(p) · GW(p)

In the sense I have in mind, yes.

Replies from: dxu
comment by dxu · 2015-01-22T20:26:00.271Z · LW(p) · GW(p)

I am somewhat amused that you linked to the same post on which we are currently commenting. Was that intentional?

Replies from: gjm
comment by gjm · 2015-01-22T22:02:00.965Z · LW(p) · GW(p)

Actually, no! There have been kinda-parallel discussions of entropy, information, probability, etc., here and in the Open Thread, and I hadn't been paying much attention to which one this was.

Anyway, same post or no, it's as good a place as any to point someone to for a clarification of what notion of temperature I had in mind.

comment by Richard_Kennaway · 2015-01-21T11:05:32.675Z · LW(p) · GW(p)

If, on the other hand, you would call a lottery in which you know the winning numbers "unfair", you should be equally willing to call water about which you know everything "cold".

In the lottery, there is something I can do with foreknowledge of the numbers: bet on them. And with perfect knowledge of the microstate I can play Maxwell's demon to separate hot from cold. But still, I can predict from the microstate all of the phenomena of thermodynamics, and assign temperatures to all microstates that are close to equipartition (which I am guessing to be almost all of them). These temperatures will be the same as the temperatures assigned by someone ignorant of the microstate. This assignation of temperature is independent of the observer's knowledge of the microstate.

comment by asr · 2014-12-19T06:06:32.832Z · LW(p) · GW(p)

There is a peculiar consequence of this, pointed out by Cosma Shalizi. Suppose we have a deterministic physical system S, and we observe this system carefully over time. We are steadily gaining information about its microstates, and therefore by this definition, its entropy should be decreasing.

You might say, "the system isn't closed, because it is being observed." But consider the system "S plus the observer." Saying that entropy is nondecreasing over time seems to require that the observer is in doubt about its own microstates. What does that mean?