Agreed. The optimal amount of leverage is of course going to be very dependent on one's model and assumptions, but the fact that a young investor with 100% equities does better *on the margin* by adding a bit of leverage is very robust.
I endorse ESRogs' replies. I'll just add some minor points.
1. Nothing in this book or the lifecycle strategy rests on anything specific to the US stock market. As I said in my review:
The fact that, when young, you are buying stocks on margin makes it tempting to interpret this strategy as only being good when one is not very risk-averse or when the stock market has a good century. But for any time-homogeneous view you have on what stocks will do in the future, there is a version of this strategy that is better than a conventional strategy. (A large fraction of casual critics seem to miss this point.)
If you are bearish on stocks as a whole, this is incorporated by you choosing a lower equity premium and hence lower overall stock allocation. This choice is independent of the central theoretical idea of the book.
2. Yours is a criticism of all modeling and is not specific to the lifecycle strategy.
3. As ESRogs mentioned, neither this book nor my review has the timing you suggest, so the psychoanalysis of proponents of this strategy appears inconsistent.
4. I acknowledged this sort of argument in my review, and indeed argued that the best approach hinges on such correlations. But consider: even in the extreme case where I believe my future income is highly correlated with the stock market and is just as volatile, the lifecycle strategy recommends that my equity exposure should start low when I'm young and then increase with age, in opposition to conventional strategies! So even if you take a different set of starting assumptions from the authors, you still get a deep insight from their basic framework.
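To make the lifecycle idea concrete, here is a minimal sketch (not taken from the book; all parameter values and the leverage cap are illustrative assumptions) of how a constant share of *total* wealth, financial savings plus the present value of future income, translates into high equity exposure relative to current savings when young:

```python
# A minimal sketch of the lifecycle allocation idea (illustrative numbers).
# The investor targets a fixed Merton-style share of total wealth, where
# total wealth includes the present value of future income. When young,
# savings are small relative to future income, so the implied exposure
# exceeds 100% of savings (i.e., leverage, up to some cap).

def merton_share(equity_premium, volatility, risk_aversion):
    """Fraction of total wealth to hold in equities."""
    return equity_premium / (risk_aversion * volatility ** 2)

def equity_exposure(savings, pv_future_income, share, cap=2.0):
    """Target equity as a multiple of current savings, capped at max leverage."""
    total_wealth = savings + pv_future_income
    return min(share * total_wealth / savings, cap)

share = merton_share(0.04, 0.18, 2.0)               # ~0.62 of total wealth
young = equity_exposure(50_000, 500_000, share)     # hits the 2x leverage cap
old = equity_exposure(900_000, 50_000, share)       # well under 100% equities
```

Lowering the assumed equity premium shrinks the Merton share, and hence the overall stock allocation, without changing the shape of the strategy, which is the point made above.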
The problem is that there are other RNA viruses besides SARS-CoV-2, such as influenza, and depending on when in the disease course the samples were taken, the amount of irrelevant RNA might exceed the amount of SARS-CoV-2 RNA by orders of magnitude.
There is going to be tons of RNA in saliva from sources besides SARS-CoV-2 always. Bits of RNA are floating around everywhere. Yes, there is some minimum threshold of SARS-CoV-2 density at which the test will fail to detect it, but this should just scale up by a factor of N when pooling over N people. I don't see why other RNA those people have will be a problem any more than the other sources of RNA in a single person are a problem for a non-pooled test.
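The scaling argument can be sketched numerically. This is a toy model with made-up numbers, not a model of any real assay: assume the test fires when the target concentration in the tube exceeds a fixed threshold, so pooling N equal-volume samples dilutes one positive by a factor of N, raising the effective per-person detection threshold by exactly N:

```python
# Toy sketch of the pooling argument (illustrative numbers, not a real
# assay model). A test detects the target if its concentration exceeds
# a threshold; pooling N samples dilutes one positive by a factor of N.
# Background RNA from other sources doesn't enter the calculation at all.

DETECTION_THRESHOLD = 100.0  # copies/mL at which the test turns positive (assumed)

def pooled_test(viral_loads):
    """Return True if a pool of equal-volume samples tests positive."""
    pooled_concentration = sum(viral_loads) / len(viral_loads)
    return pooled_concentration >= DETECTION_THRESHOLD

strong_positive = [5000.0] + [0.0] * 9  # 10-person pool: diluted to 500, detected
weak_positive = [500.0] + [0.0] * 9     # 10-person pool: diluted to 50, missed
```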
"The government" in the US certainly doesn't have the authority to do most of these things.
Both the federal and state governments have vast powers during public health emergencies. For instance, the Supreme Court has made clear that the government can hold you down and vaccinate you against your will. Likewise, the Army (not just National Guard) can be deployed to enforce laws, including curfew and other quarantine laws.
Yes, it's unclear whether government officials would be willing to use these options, and how much the public would resist them, but the formal authority is definitely there.
Hi Rohin, are older versions of the newsletter available?
Also:
This sounds mostly like a claim that it is more computationally expensive to deal with hidden information and long-term planning.
One consideration: when you are exploring a tree of possibilities, every bit of missing information means you need to double the size of the tree. So it could be that hidden information leads to an exponential explosion in search cost in the absence of hidden-information-specific search strategies. Although strictly speaking this is just a case of something being "more computationally expensive", exponential penalties generically push things from being feasible to infeasible.
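A back-of-the-envelope sketch of that explosion (all numbers illustrative): a naive search over a depth-D tree with branching factor B that must also consider every assignment of H hidden bits multiplies the node count by 2^H.

```python
# Rough sketch of the search-cost claim above (illustrative numbers).
# A naive search with H unresolved hidden bits must consider every
# assignment of the hidden information, multiplying the tree by 2**H.

def naive_tree_size(branching, depth, hidden_bits=0):
    """Node count for exhaustive search over moves and hidden-bit assignments."""
    return (branching ** depth) * (2 ** hidden_bits)

full_info = naive_tree_size(branching=3, depth=10)             # 59,049 nodes
hidden = naive_tree_size(branching=3, depth=10, hidden_bits=20)
# The hidden-information tree is 2**20 (~a million) times larger:
# roughly the jump from "trivially feasible" to "infeasible".
```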
What is the core problem of your autonomous driving group?!
Marshall, I would keep in mind that good intentions are not sufficient for getting your comments upvoted. They need to contribute to the discussion. Since your account was deleted, we can't judge one way or the other.
I think there is some truth to Marshall's critique and that the situation could be easily improved by making it clear (either on the "about" page or in some other high-visibility note) what the guidelines for voting are. That means guidelines would have to be agreed upon. Until that happens, I suspect people will continue to just vote up comments they agree with, stifling debate.
I've previously suggested a change to the voting system, but this might require more manpower to implement than is available.
It seems like the only criterion for rating a comment/post should be the degree to which it contributes to healthy discussion (well-explained, on-topic, not completely stupid). However, there is a strong tendency for people to vote on comments based on whether they agree with them, which is very bad for healthy discussion. It discourages new ideas and drives away visitors with differing opinions when they see a page full of highly rated comments for a particular viewpoint (cf. reddit).
The feature I would recommend most for this website is a dual voting feature: one vote up/down for the quality of the post/comment, and one for whether you agree or disagree with it. This would allow quality, disagreeable comments to float to the top while allowing everyone to satisfy their urge to express their opinion. It also would force people to make a cognitive distinction between the two categories.
Even people like me who try to base their ratings independently of their agreement with the comment are biased in their assessment of quality. It would be very healthy to read a comment you agree with and would normally upvote (because your quality standards have been biased downward), only to see that a large fraction of the community finds the argument poor.
Incidentally, you might allow voting for humor or on-topicness so that people can (say) still be funny every once in a while without directly contributing to the current discussion per se.
(Sorry that was so long. It was something I had been thinking about for awhile.)
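The dual-voting proposal above amounts to a small data-model change. A minimal sketch (field names and the ranking rule are made up for illustration): each comment carries two independent tallies, and the page sorts only by the quality score, so a well-argued comment most readers disagree with still floats to the top.

```python
# Minimal sketch of the dual-voting proposal (hypothetical field names).
# Quality and agreement are tallied separately; ranking uses quality only.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    quality: int = 0    # net up/down votes on quality of the argument
    agreement: int = 0  # separate net votes on agreement with the conclusion

def ranked(comments):
    """Sort by quality alone; agreement is displayed but never affects rank."""
    return sorted(comments, key=lambda c: c.quality, reverse=True)

thread = [
    Comment("popular but weakly argued", quality=1, agreement=9),
    Comment("unpopular but well-argued", quality=7, agreement=-5),
]
# ranked(thread) puts the well-argued minority view first.
```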
I'm confused. What is the relationship between Alcor and the Cryonics Institute? Is it either-or? What is the purpose of yearly fees to them if you can just take out insurance which will cover all the costs in the event of your death?
Eliezer, I believe that your belittling tone is conducive to neither a healthy debate nor a readable blog post. I suspect that your attitude is borne out of just frustration, not contempt, but I would still strongly encourage you to write more civilly. It's not just a matter of being nice; rudeness prevents both the speaker and the listener from thinking clearly and objectively, and it doesn't contribute to anything.
Günther: Of course my comments about Barbour were (partially) ad hominem. The point was not to criticize his work, but to criticize this post. Very few people are qualified to assess the merit of Barbour's work. This includes, with respect, Eliezer. In the absence of expertise, the rational thinker must defer to the experts. The experts have found nothing of note in Barbour's work.
Albert Einstein was not performing philosophy when he developed GR. He was motivated by a philosophical insight and then did physics.
You've drawn many vague conclusions (read: words, not equations or experimental predictions) about the nature of reality from a vague idea promoted by a non-academic. It smacks strongly of pseudoscience.
Julian Barbour's work is unconventional. Many of his papers border on philosophy, and most are not published in prominent journals. His first idea, that time is simply another coordinate parameterizing a mathematical object (like a manifold in GR) and that its specialness is an illusion, is ancient. His second idea, that any theory more fundamental than QM or GR will necessarily feature time only in a relational sense (in contrast to the commonly accepted, and beautiful, gauge freedom of all time and space coordinates), is interesting and possibly true, but it is most likely not profound. I can't read all of his papers, so perhaps he has some worthwhile work.
This post, however, appears to be completely without substance. What is the point?

That the universe as we understand it is best described by a timeless mathematical object (e.g. a manifold equipped with some quantum fields)? This viewpoint, taken by most physicists, has been around since at least the 1920s. While profound, it has little to do with Barbour's work, which seems to be the focus of this post.
That the next theory of physics should be expressed with a "relational approach"? This is a nice idea, but one which has (to my knowledge) produced no objective progress in formulating a successor to GR or QM. There are a thousand approaches out there with similar promise...and similar results. I can't help but feel that you are wading into waters which are above your expertise.
I definitely agree that there is truth to Max Planck's assertion. And indeed, the Copenhagen interpretation was untenable as soon as it was put forth. However, Everett's initial theory was also very unsatisfying. It only became (somewhat) attractive with the much later development of decoherence theory, which first made plausible the claim that no-collapse QM evolution could explain our experiences. (For most physicists who examine it seriously, the claim is still very questionable.)
Hence, the gradual increase in acceptance of the MW interpretation is a product both of the old guard dying off and the development of better theoretical support for MW.
PsyKosh: Oh, I almost forgot to answer your questions. Experimental results are still several years distant. The basic idea is to fabricate a tiny cantilever with an even tinier mirror attached to its end. Then, you position that mirror at one end of a photon cavity (the other end being a regular fixed mirror). If you then send a photon into the cavity through a half-silvered third mirror (so that it will be in a superposition of being in and not in the cavity), then the cantilever will be put into a correlated superposition: it will be vibrating if the photon is in the cavity, and it will be still if the photon is not. Of course, the really, really super-hard part is getting all this to happen without the state decohering before you see anything interesting.
Robin Z: The motivation for suspecting that something funny happens as you try to scale up decoherence to full-blown many-worlds comes from the serious problems that many-worlds has. Beyond the issue with predicting the Born postulate, there are serious conceptual problems with defining individual worlds, even emergently.
The motivation for doing this experiment is even clearer: (1) The many-worlds interpretation is a fantastically profound statement about our universe and therefore demands that fantastic experimental work be done to confirm it as best as is possible. (For instance, despite the fact that I very confidently expect Bell's inequality to continue to hold after each tenuous experimental loophole is closed, I still consider it an excellent use of my tax dollars that these experiments continue to be improved.) (2) Fundamental new regimes in physics should always be probed, especially at this daunting time in the history of physics, where we seem to be able to predict nearly everything we see around us but are unable to extend our theories to in-principle testable but currently inaccessible regimes. (3) It's just plain cool.
PsyKosh: It is an awesome experiment. Here are links to Bouwmeester's home page, the original proposal, and the latest update on cooling the cantilever. (Bouwmeester has perhaps the most annoying web interface of any serious scientist. Click in the upper left on "research" and then the lower right on "macroscopic quantum superposition". Also, the last article appeared in Nature and may not be accessible without a subscription.)
Obviously, this is a very hard experiment and success is not assured.
Also, you might be interested to know that at least one other group, Jack Harris's at Yale, is doing similar work.
Excellent post, Eliezer. I have just a small quibble: it should be made clear that decoherence and the many-worlds interpretation are logically distinct. Many physicists, especially condensed matter physicists working on quantum computation/information, use models of microscopic decoherence on a daily basis while remaining agnostic about collapse. These models of decoherence (used for so-called "partial measurement") are directly experimentally testable.
Maybe a better term for what you are talking about is macroscopic decoherence. As of right now, no one has ever created serious macroscopic superpositions. Macroscopic decoherence, and hence the many-worlds interpretation, relies on extrapolating microscopically observed phenomena.
If there's one lesson we can take from the history of physics, it's that every time new experimental "regimes" are probed (e.g. large velocities, small sizes, large mass densities, large energies), phenomena are observed which lead to new theories (special relativity, quantum mechanics, general relativity, and the standard model, respectively). This is part of the reason I find it likely that the peculiar implications of uncollapsed Hermitian evolution are simply artifacts of using quantum mechanics outside its regime of applicability.
Here at UC Santa Barbara, Dirk Bouwmeester is trying to probe this macroscopic regime by superposing a cantilever that is ~50 microns across (big enough to see with an optical microscope!).
"And both spatial infinity and inflation are standard in the current model of physics."
As mentioned by a commenter above, spatial infinity is by no means required or implied by physical observation. Non-compact spacetimes are allowed by general relativity, but so are compact tori (which are a very real possibility) and a plethora of bizarre geometries which have been ruled out by experimental evidence.
Inflation is an interesting theory which agrees well with the small (relative to other areas of physics) amount of cosmological data which has been collected. However, the data by no means implies inflation. In fact, the term "inflation" refers to a huge zoo of models with many unexplained parameters that can be tuned to fit the data. Physicists are far from absolutely confident in the inflationary picture.
Furthermore, there are serious, serious problems with Many Worlds Interpretation (and likewise for Mangled Worlds), which you neglect to mention here.
I enjoy your take on quantum mechanics, Eliezer, and I recommend this blog to everyone I know. I agree with you that Copenhagen is untenable and that MWI is the current best idea. But you talk about some of your ideas as if they're obvious and accepted by anyone who isn't an idiot. This does your readers a disservice.
I realize that this is a blog and not a refereed journal, so I can't expect you to follow all the rules. But I can appeal to your commitment to honesty in asking you to express the uncertainty of your ideas and to defer when necessary to the academic establishment.
Eliezer: "I wouldn't be surprised to learn that there is some known better way of looking at quantum mechanics than the position basis, some view whose mathematical components are relativistically invariant and locally causal." There is. Quantum field theory takes place on the full spacetime of special relativity, and it is completely Lorentz covariant. Quantum mechanics is a low-speed approximation of QFT and necessarily chooses a reference frame, destroying covariance.
Hal Finney: The Schrödinger equation (and its relativistic generalization) dictates local evolution of the wavefunction. Nonlocality comes about during the measurement process, which is not well understood.
CPT symmetry is required by Quantum Field Theory, not General Relativity.
The Feynman path integral (PI) and Schrödinger's equation (SE) are completely equivalent formulations of QM in the sense that they give the same time evolution of an initial state. They have exactly the same information content. It's true that you can derive SE from the PI, while the reverse derivation isn't very natural. On the other hand, the PI is mathematically completely nonrigorous (roughly, the space of paths is too large) while SE evolution can be made precise.
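For concreteness, the equivalence can be sketched as follows (standard textbook formulas, not specific to any one source). The path integral defines the propagator

```latex
K(x_b, t_b; x_a, t_a) = \int \mathcal{D}[x(t)]\, e^{i S[x]/\hbar},
\qquad
S[x] = \int_{t_a}^{t_b} L(x, \dot{x})\, dt,
```

which evolves any state via

```latex
\psi(x_b, t_b) = \int K(x_b, t_b; x_a, t_a)\, \psi(x_a, t_a)\, dx_a,
```

and expanding this relation over an infinitesimal time step $t_b = t_a + \epsilon$ recovers the Schrödinger equation

```latex
i\hbar\, \frac{\partial \psi}{\partial t}
= -\frac{\hbar^2}{2m} \frac{\partial^2 \psi}{\partial x^2} + V(x)\, \psi.
```

Going the other way (defining $\mathcal{D}[x(t)]$ rigorously from SE evolution) is where the mathematical trouble lies.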
Practically, the PI cannot be used to solve almost anything except the harmonic oscillator. This is a serious handicap in QM, since SE can be used to solve many problems exactly. But in quantum field theory, all the calculations are perturbations around harmonic oscillators, so the PI can be very useful.
Many physicists would agree that the PI is more "fundamental" because it gives insight into QFT and theoretical physics. But the distinction is largely a matter of taste.
PsyKosh: Position space is special because it has a notion of locality. Two particles can interact if they collide with each other traveling at different speeds, but they cannot interact if they are far from each other traveling at the same speed.
The field, defined everywhere on the 4D spacetime manifold, is "reality" (up until the magical measurement happens, at least). You can construct different initial value problems (e.g. if the universe is such-and-such at a particular time, how will it evolve?) by taking different slices of the spacetime. Just because there are many ways to pose an initial value problem for the same spacetime history doesn't mean there isn't one field which is reality.
Eliezer is obviously unable to address all these issues here, as they are well outside his intended scope.
Chris, in case you didn't see me ask you last time...
http://www.overcomingbias.com/2008/04/philosophymeet.html#comment110472438
do you know of a good survey of decoherence?
PsyKosh: In quantum field theory, the fields (the analog of wavefunctions in nonrelativistic quantum mechanics) evolve locally on the spacetime. This is given a precise, observer-independent (i.e. covariant) meaning. This property reduces to the spatially local evolution of the wavefunction in QM which Eliezer is describing. Further, this indeed identifies position space as "special", compared to momentum space or any other decomposition of the Hilbert space.
Eliezer: The wavefunctions in QM (and the fields in QFT) evolve locally under normal (Hermitian) evolution. However, Bell-type experiments show that wavefunction collapse is a nonlocal process (be it the preposterous Copenhagen-style collapse or some flavor of decoherence). As far as I have read, the source of this nonlocality is not understood.
Chris, could you recommend an introduction to decoherence for a grad student in physics? I am dumbstruck by how difficult it is to learn about and by the seeming lack of an authoritative consensus. Is there a proper review article? Is full-on decoherence taught in any physics grad classes, anywhere?
PsyKosh: I have never heard of anyone successfully formulating quantum (or classical) mechanics without the full spectrum of real numbers. You can't even have simple things, like right triangles with non-integer side lengths, without irrational numbers to "fill in the gaps". Any finite-set formulation of QM would look very different from what we understand now.
PsyKosh, when QM is formulated rigorously (something that is rarely done, and only by mathematical physicists), the amplitudes must be able to take on any number in the complex plane, not just the rationals.
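The right-triangle point above can be made concrete with a tiny (purely illustrative) check: even the hypotenuse of a right triangle with unit legs, sqrt(2), has no exact rational value, so a state space restricted to rational amplitudes cannot be closed under ordinary geometry, let alone unitary evolution. Here we verify exhaustively over small denominators that no fraction squares to 2:

```python
# Tiny illustration of the irrationality point above: no rational p/q
# satisfies (p/q)**2 == 2, checked exhaustively for small denominators.
# (The classical proof shows this for all denominators; this is just a demo.)
from fractions import Fraction

def rational_sqrt2_exists(max_denominator):
    """Search for a fraction whose square is exactly 2."""
    for q in range(1, max_denominator + 1):
        for p in range(1, 2 * q + 1):  # sqrt(2) < 2, so p/q <= 2 suffices
            if Fraction(p, q) ** 2 == 2:
                return True
    return False
```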
Sebastian Hagen, I believe Eliezer is explaining to us the best model physicists have for the way the world works on the (sorta) lowest level we understand, not his personal beliefs on the nature of reality. This model must include the irrationals to be self-consistent. This does not prevent the universe from being discretized (no uncountable sets) on a level more fundamental than QM.
I guess, Eliezer, that I would be concerned about convincing everyone that the universe runs along like a computer, computing amplitudes locally (which seems to be the gist of your discussion). To do so would certainly make people feel like QM isn't confusing; it would just be wave mechanics. But this would give people a false confidence, I think, and is not how the universe appears to operate.
But this is the first post, so I'll try to confine my criticism until you've wrapped up your discussion.
Eliezer, in case you plan to discuss Bell-inequality-type experiments in future posts, I suggest that you use the GHZ state (not the EPR pair) to show how local realism is ruled out in QM. The GHZ state is a much cleaner result and is not obscured by the statistics inherent in Bell's inequality.