New Year's Prediction Thread (2014)

post by Thomas · 2014-01-01T09:38:03.372Z · LW · GW · Legacy · 62 comments

It's time to look back at what was predicted a year ago and how successful those predictions were.

But even more, it's time for fresh predictions for the coming year, 2014.

62 comments

Comments sorted by top scores.

comment by James_Miller · 2014-01-01T22:02:37.898Z · LW(p) · GW(p)

Nicholas Wade will get fired from the New York Times because of his book A Troublesome Inheritance: Genes, Race, and Human History. (Probability 30%)

Replies from: MichaelAnissimov, gwern, MichaelAnissimov, MichaelAnissimov
comment by MichaelAnissimov · 2014-05-11T15:24:03.874Z · LW(p) · GW(p)

This has happened: 30-year New York Times Science Writer Out After Writing Book About Genetics, Race.

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2014-05-12T04:33:45.023Z · LW(p) · GW(p)

From that page:

Correction: Wade tells blogger Luke Ford that he retired from the Times 2 years ago, but still contributes articles to the paper. Neither Wade nor the Times returned earlier requests for comment on the matter.

Replies from: gwern, MichaelAnissimov
comment by gwern · 2014-05-14T15:37:31.637Z · LW(p) · GW(p)

Luke Ford source: http://www.lukeford.net/blog/?p=54601

[Wade:] “I retired from the Times about two years ago. There’s a blogosphere story today that the Times didn’t like the book and fired me, but the writer invented the whole thing based on his having seen the words ‘former Science editor’ in the piece I did in Time.”

...Luke: “You will be writing future articles for the New York Times?”

Nicholas: “I assume so. I write for them quite regularly on a contract basis but I am not on their staff any longer.”

comment by MichaelAnissimov · 2014-05-12T09:00:44.364Z · LW(p) · GW(p)

I propose the prediction be amended to, "The New York Times will never accept articles from Nicholas Wade again."

comment by gwern · 2014-05-11T16:20:27.539Z · LW(p) · GW(p)

I gave it 40% after reading the advance reviews. That was even faster than I had expected.

comment by MichaelAnissimov · 2014-05-11T17:44:26.266Z · LW(p) · GW(p)

Apparently the Daily Caller article is mistaken; Wade took a retirement package two years ago and is now a science writer rather than science editor, according to Charles Murray here. So the jury is still out on this one; we will wait and see if he is fired from being a science writer.

Replies from: Douglas_Knight
comment by Douglas_Knight · 2014-05-11T18:45:41.058Z · LW(p) · GW(p)

Wade stopped being science editor in 1997 (so that he could write articles), so it's pretty weird that the Time byline was "former science editor," regardless of how Wade's position changed in 2012 or last week.

comment by MichaelAnissimov · 2014-01-05T02:57:44.255Z · LW(p) · GW(p)

I agree with this prediction.

comment by satt · 2014-01-04T13:27:01.847Z · LW(p) · GW(p)

The mean IQ reported in the latest LW survey will be 137 or less (probability 75%).

Thought I'd squeeze that in before Yvain posts the results. Some background: in the 2012 survey LWers reported having a mean IQ of 138.7. I found (and find) that unlikely, and reckon that average was inflated through a mixture of selective reporting, mis-remembering, perhaps the occasional lie, and people taking old or otherwise iffy tests. However, the 2013 survey should suffer much less from that last problem because (if I remember rightly) it asks only about formally assessed IQs. So I expect the 2013 IQ average to be lower (because less inflated) than the 2012 average.

I'm not super confident about this, because there's some countervailing evidence (using self-reported SAT & ACT scores from the 2012 survey as IQ proxies suggests higher IQ estimates, not lower estimates), and because a selection effect could actually worsen the inflation in the 2013 survey (people with higher IQs might be more likely to have their IQs formally tested; if so, asking only for properly measured IQs will selectively pick out high-IQ people). But my hunch is that these issues won't matter so much.
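
To make that selection worry concrete, here's a minimal simulation sketch in Python with made-up numbers; the true mean, the test-taking probability curve, and the sample size are all assumptions for illustration, not survey data:

    import random

    random.seed(0)

    TRUE_MEAN, SD, N = 130, 15, 10_000  # assumed respondent population

    population = [random.gauss(TRUE_MEAN, SD) for _ in range(N)]

    # Assumption: the chance of ever having taken a formal IQ test
    # rises with IQ (e.g. via gifted-program screening).
    def was_formally_tested(iq):
        return random.random() < min(1.0, max(0.05, (iq - 100) / 60))

    tested = [iq for iq in population if was_formally_tested(iq)]

    print(f"true mean:             {sum(population) / len(population):.1f}")
    print(f"mean among the tested: {sum(tested) / len(tested):.1f}")
    # The second number comes out a few points higher: asking only for
    # formally assessed IQs can itself inflate the reported average.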

Replies from: satt, Ander, gothgirl420666
comment by satt · 2014-01-20T01:42:11.659Z · LW(p) · GW(p)

Prediction failed.

(As well as the headline result that the average turned out to be 138.2, see also section V.B, and Vaniver's comment. Also, it looks like I may have been wrong to take Yvain's SAT-and-ACT-to-IQ conversions from last year at face value in the parent comment.)

comment by Ander · 2014-01-08T00:36:24.569Z · LW(p) · GW(p)

When will we get to see the results of the LW survey?

Replies from: satt
comment by satt · 2014-01-08T01:21:12.504Z · LW(p) · GW(p)

No idea (Yvain hasn't said yet), but presumably some time this year!

But I can try to guess. In 2012, Yvain apparently closed the survey on November 26 and reported the results on December 7. In 2011, the survey closed on December 3 and Yvain reported results on December 5. And in 2009, Yvain announced the survey on May 3 and posted its results on May 12. So for those surveys, it's been 11 days, 2 days, and ≈9 days between when the survey closed and when Yvain posted results, and I'd guess that 2-11 days after Yvain closes last year's survey (which he hasn't yet), he'll post its results.

(Maybe I should make that a formal prediction? "2-11 days after Yvain closes the current LW survey, he'll post the results (80%).")
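
For what it's worth, those gaps check out with ordinary date arithmetic; a quick Python sketch using only the dates quoted above:

    from datetime import date

    # (close date, results date) for the 2012, 2011, and 2009 surveys;
    # the 2009 start is the announcement date, hence the "≈" above.
    surveys = {
        2012: (date(2012, 11, 26), date(2012, 12, 7)),
        2011: (date(2011, 12, 3), date(2011, 12, 5)),
        2009: (date(2009, 5, 3), date(2009, 5, 12)),
    }

    for year, (closed, posted) in surveys.items():
        print(year, (posted - closed).days, "days")
    # -> 11, 2, and 9 days, which is where the 2-11 day window comes from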

comment by gothgirl420666 · 2014-01-28T23:24:53.855Z · LW(p) · GW(p)

A 138 average doesn't seem far-fetched at all to me. A little bit of self-serving bias is inevitable, but I highly doubt the real average is e.g. in the 120s. This random website I found says that the average IQ of an Ivy League student is 142. I go to a school that isn't as good as most Ivies but is better than some of them. I would guess the average IQ of a student here is 135-ish. The average LW poster seems much, much smarter to me than the average person at my school.

Replies from: satt
comment by satt · 2014-01-30T01:13:34.070Z · LW(p) · GW(p)

Yeah, I've moved a bit towards your sort of position because of the 2013/14 results. That said, I don't have an impression of LWers being way, way smarter than other students I encounter in real life.

(I'm also still leery of the poor correlation between education level and IQ, which cropped up again in the latest survey. To go into tedious detail: among those aged ≥29, 25 people with a high school education or less gave a mean IQ of 139.5, and 155 people with more education gave a mean IQ of 140.2. And Nornagest's suggestion to look at the high end now gives a less statistically significant result than last year. The 24 oldsters with PhDs gave a mean IQ of 142.4, and the other 156 non-PhDs gave a mean IQ of 139.8.)
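
To put a rough number on "less statistically significant", here's a back-of-the-envelope two-sample z-test in Python on the PhD comparison; the means and group sizes are the ones quoted above, but the within-group SD of 15 is an assumption, since the survey SDs aren't quoted here:

    import math

    mean_phd, n_phd = 142.4, 24      # PhD oldsters, as quoted above
    mean_rest, n_rest = 139.8, 156   # non-PhD oldsters, as quoted above
    sd = 15.0                        # assumed within-group SD

    se = math.sqrt(sd**2 / n_phd + sd**2 / n_rest)  # SE of the difference
    z = (mean_phd - mean_rest) / se
    print(f"z = {z:.2f}")  # ~0.79, well short of the usual 1.96 threshold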

comment by [deleted] · 2014-01-01T10:01:01.878Z · LW(p) · GW(p)

Bitcoin will surpass $5,000 (for example, due to the opening of one or more easily accessible exchange-traded funds).

Release of the consumer version of the Oculus Rift (if it happens in 2014) will bring back the '90s optimism for augmented and virtual reality. Someone will release a "metaverse" client for a peer-to-peer, decentralized, distributed virtual world.

ETA: Clarified "due to" construction of prediction.

Replies from: somervta, bramflakes, passive_fist, JTHM
comment by somervta · 2014-01-01T10:52:47.948Z · LW(p) · GW(p)

Probability estimates?

Replies from: gjm, None, None
comment by gjm · 2014-01-01T14:15:54.667Z · LW(p) · GW(p)

I predict that there will be disappointingly few of these.

Replies from: Benito
comment by Ben Pace (Benito) · 2014-01-03T14:13:58.696Z · LW(p) · GW(p)

And those that there are will be pretty inaccurate (100%).

comment by [deleted] · 2014-01-02T16:18:06.322Z · LW(p) · GW(p)

I predict there's about 50% chance the price of bitcoin is going to rise and about 50% chance it's going to fall. Anyone wanna bet me on this?

comment by [deleted] · 2014-01-01T18:47:20.032Z · LW(p) · GW(p)

Better than even odds or I wouldn't have said it. Any more precision than that would be intellectual dishonesty.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2014-01-01T21:56:59.818Z · LW(p) · GW(p)

How much Bitcoin do you own?

Replies from: None
comment by [deleted] · 2014-01-01T22:36:00.348Z · LW(p) · GW(p)

A few dozen, although I'm burning through it quickly. I make my living paid in bitcoin (I'm a bitcoin-core developer living off donations), so it isn't so much an investment as my current runway. Since I don't think any ETF will get online until summer at the earliest, and my runway isn't that long, I doubt that I'll make much money off the rise. Assuming I keep getting donations, it'll be just a few months' float at most.

comment by bramflakes · 2014-01-01T15:38:01.063Z · LW(p) · GW(p)

What about Bitcoin surpassing 5k for any reason?

Replies from: gjm
comment by gjm · 2014-01-01T18:37:35.169Z · LW(p) · GW(p)

Much less likely, obviously...

Replies from: None
comment by [deleted] · 2014-01-01T18:45:59.193Z · LW(p) · GW(p)

I think you have that reversed. How can the statement "Bitcoin will surpass $5,000" be less probable than the statement "Bitcoin will surpass $5,000, due to X"?

Replies from: Vaniver
comment by Vaniver · 2014-01-01T19:09:04.178Z · LW(p) · GW(p)

I think you have that reversed.

I strongly suspect it's a joke about the conjunction fallacy.

The predictions that Bitcoin passes $5k, that there's a bitcoin ETF, and that both happen are all interesting probabilities; but the claim that the conjunction will happen (and specifically that the bitcoin ETF will be the primary cause of the price increase) reads like a conjunction fallacy prompt: the additional detail is there to make it seem more credible rather than less. (Otherwise, you should use a conditional: if a bitcoin ETF exists, then the price will likely top $5k.)
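
The underlying point is just that a conjunction can never be more probable than either of its conjuncts; a toy Python sketch with invented probabilities (not anyone's actual estimates):

    # Invented, illustrative numbers only.
    p_etf = 0.30            # P(a bitcoin ETF opens in 2014)
    p_5k_given_etf = 0.50   # P(price tops $5k | an ETF opens)

    p_conjunction = p_etf * p_5k_given_etf  # P(ETF opens AND price tops $5k)

    # "Price tops $5k for any reason" includes the ETF path plus every
    # other path, so it must be at least as probable as the conjunction.
    assert p_conjunction <= p_etf
    print(p_conjunction)  # 0.15: the more detailed prediction is less likely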

Replies from: gjm, Eugine_Nier
comment by gjm · 2014-01-01T20:20:53.685Z · LW(p) · GW(p)

a joke about the conjunction fallacy.

Yep, exactly. (More precisely, bramflakes was making a point about the conjunction fallacy and I was reinforcing it with a bit of irony. Trying to, anyway.)

Replies from: None
comment by [deleted] · 2014-01-03T19:42:23.310Z · LW(p) · GW(p)

I'm not sure the conjunction fallacy should matter operationally when giving predictions, unless there is betting involved. What should I do instead? Simply state "Bitcoin will rise to $5,000"? That's completely uninteresting, and I fail to see how anyone could judge that prediction on anything other than my credentials and their prejudices. Saying "Bitcoin will rise to $5,000 due to X" gives people a window into my thinking and lets them judge for themselves whether my assessment is likely.

Replies from: army1987, gjm
comment by A1987dM (army1987) · 2014-01-03T20:38:42.198Z · LW(p) · GW(p)

You could say “Bitcoin will rise to $5,000 (for example, due to X).”

comment by gjm · 2014-01-03T20:04:01.320Z · LW(p) · GW(p)

There's nothing at all wrong with making specific predictions. But it should be done with care, in view of our brains' tendency to infer higher rather than lower probability when they see something more specific.

comment by Eugine_Nier · 2014-01-01T21:10:06.791Z · LW(p) · GW(p)

Also, it's nearly impossible to figure out what caused a price movement, even after the fact.

Replies from: None
comment by [deleted] · 2014-01-01T22:38:00.984Z · LW(p) · GW(p)

That's true of index funds, but not of individual assets, and demonstrably false in the case of a highly illiquid asset like bitcoin.

comment by passive_fist · 2014-01-01T11:45:27.881Z · LW(p) · GW(p)

The problem with all this 3d headwear seems, to me at least, to be that it doesn't really offer any substantial improvement over a monitor and mouse. Our brains don't need stereoscopic displays to perceive a 3d world; they are very good at building up a 3d representation from just a 2d image (something that comes in very handy in the real world when one of your eyes is closed or non-functioning). And moving around a 3d world with your hand seems to be about the same level of difficulty, if not easier, than moving around with your neck. Meanwhile, the main disadvantage of headwear is discomfort.

I'd give the Oculus Rift a 50% chance of success.

Augmented reality headwear, at least (such as Google Glass or the amazing Meta Spaceglasses: https://www.spaceglasses.com/ ), has something to offer that can't be had with a traditional monitor. However, it remains to be seen how much people will actually desire those things. I can definitely imagine the Spaceglasses being widely used in creative professions.

EDIT: Changed 'fatigue' to 'discomfort'.

Replies from: Risto_Saarelma, Hyphen-ated
comment by Risto_Saarelma · 2014-01-01T15:19:28.725Z · LW(p) · GW(p)

Have you tried an Oculus Rift? I did, and I had the same "this is awesome!" reaction most people seem to report. Having more 3d space show up when you turn your head around is a big deal, as is having the 3d world take over your entire field of view.

There might be fatigue problems that show up with long-term use that we haven't seen yet, and strange cultural reactions to the way the headset user becomes totally isolated from the surrounding real world, but the initial reaction where almost everyone who tries it on thinks it's awesome predicts at least a fad success.

Replies from: passive_fist
comment by passive_fist · 2014-01-01T19:21:31.483Z · LW(p) · GW(p)

Initial reactions do not seem to be a good predictor of success. After the initial novelty wears off, users do in fact report problems such as low resolution and discomfort (dizziness, headaches, vertigo, and nausea). See for instance this review and the many others that can be found with a simple Google search.

3d television users also initially reported "this is awesome!" reactions (for similar reasons), but it does not seem to have caught on (also for similar reasons: poor resolution and discomfort).

As mentioned in that review, it also depends on how the technology is used. Game developers have to take steps to reduce discomfort and to use the technology in novel ways. If that is done, then I agree that the chance of success becomes much larger.

Replies from: None
comment by [deleted] · 2014-01-01T19:48:57.148Z · LW(p) · GW(p)

After the initial novelty wears off, users do in fact report problems such as low resolution and discomfort (dizziness, headaches, vertigo, and nausea).

Most of which is due to limitations in the devkit model (lack of degrees of freedom in head tracking and low resolution), all of which are being fixed in the consumer model. Reviews of the consumer model prototypes tested at conventions / press events have reported these symptoms are gone.

When I made my prediction I called out a Snow Crash-like metaverse as the killer app. More generally, I think we will be seeing applications of head-mounted VR that are surprising, novel, and ultimately far more interesting than gaming. The Oculus Rift will be, I think, a transformative technology in general, even if it ends up controversial or marginalized in gaming.

Replies from: passive_fist
comment by passive_fist · 2014-01-01T19:57:36.673Z · LW(p) · GW(p)

Reviews of the consumer model prototypes tested at conventions / press events have reported these symptoms are gone.

In that case, the chances of success look much better.

More generally, I think we will be seeing applications of head-mounted VR that are surprising, novel, and ultimately far more interesting than gaming.

Can you give some examples?

Replies from: None, wwa
comment by [deleted] · 2014-01-01T20:48:04.753Z · LW(p) · GW(p)

Besides the metaverse I've already mentioned, here's another one:

Through my work I've been fortunate enough to be able to use CAVE environments developed at UC Davis and UC San Diego in the analysis of planetary data. Search for "3d CAVE" on YouTube and you should find plenty of videos showing what this experience is like.

The effect of being able to immersively interact with this data is incredible. The classic example I gave visitors was some of the first published data to come out of the UC Davis computer science / geology visualization partnership: a buckling of subduction zones that was previously unknown despite sufficient data having been available for at least the last century. They loaded earthquake data overlaid on a globe basically as a test of the system, and almost immediately discovered the subduction buckling from straight visual inspection.

Analyzing geometric data directly in an immersive 3D environment is so much more productive than traditional techniques because it exploits the natural machinery built into us for aggregating and extracting details from sensory data. Already it sees use in many areas: I sat next to someone on a plane once whose job was to install these things on oil exploration ships, where the energy companies use them to quickly analyze the terabytes of data coming in from the seabed.

I expect that in nearly all fields of engineering, physical science, and biology there are great efficiencies to be gained by utilizing the immersive CAVE experience. But a traditional CAVE will cost you half a million dollars, putting it way outside the reach of most organizations. An Oculus Rift + Kinect + decent graphics card, on the other hand, sets you back less than a thousand dollars.

(BTW, experience with immersive CAVE environments is that, with suitable precision and capability in the technology, motion-sickness-like symptoms disappear for all but a few percent of the population.)

Replies from: passive_fist
comment by passive_fist · 2014-01-01T20:50:56.800Z · LW(p) · GW(p)

I actually agree with you here. As I mentioned in my first reply, I can easily imagine virtual/augmented reality headsets being used for creative professions, and I can also easily imagine them being used for science/engineering and so on. It's just hard for me to imagine them being widely used in gaming, at least in their current form. Maybe future, more advanced iterations of the technology would have better chances.

comment by wwa · 2014-01-01T20:11:15.604Z · LW(p) · GW(p)

How about an architect walking his clients through their soon-to-be house?

Replies from: passive_fist
comment by passive_fist · 2014-01-01T20:47:18.104Z · LW(p) · GW(p)

What makes the Oculus Rift special in that regard? There have been numerous head-mounted VR solutions that have been able to do that for many years. Yet they have not seen any serious use for such purposes.

Replies from: jacob_cannell
comment by jacob_cannell · 2014-01-05T02:25:30.646Z · LW(p) · GW(p)

Have you tried it?

The Rift is different in that it provides a full-hemisphere viewing angle. There is no 'tunnel vision', and you get full peripheral vision. Peripheral vision is important to the human visual system for motion sensation and situational awareness.

It's immediately different as soon as you turn your head; there is a definite wow factor over a monitor.

The tradeoff, of course, is the terrible resolution, but it's interesting in showing the potential of at least solving most of the other immersion problems.

Replies from: None
comment by [deleted] · 2014-01-05T09:52:14.284Z · LW(p) · GW(p)

The tradeoff, of course, is the terrible resolution, ...

Solved in the consumer version, which is still being worked on (at least 1080p per eye).

Replies from: jacob_cannell
comment by jacob_cannell · 2014-06-03T07:13:40.511Z · LW(p) · GW(p)

1080p per eye is hardly enough to 'solve' the resolution problem. There is a fundamental tradeoff between FOV and effective resolution, which is one reason other manufacturers haven't attempted full human FOV. For a linear display, it's something like 8k x 4k per eye for a full-FOV HMD to have HDTV-equivalent resolution.
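
The arithmetic behind that figure is pixels per degree; a back-of-the-envelope Python sketch, where both field-of-view angles are assumptions for illustration:

    hdtv_pixels = 1920   # horizontal pixels of a 1080p display
    hdtv_fov_deg = 32    # assumed angle an HDTV subtends at typical distance
    eye_fov_deg = 140    # assumed per-eye horizontal field of view

    ppd = hdtv_pixels / hdtv_fov_deg  # HDTV density in pixels per degree
    needed = ppd * eye_fov_deg        # pixels to match that across the FOV
    print(f"{ppd:.0f} px/deg -> {needed:,.0f} horizontal pixels per eye")
    # -> 60 px/deg and 8,400 px: the same ballpark as the 8k-per-eye figure,
    # and an underestimate for a flat (linear) panel, whose angular pixel
    # density is lowest at the center, which sets the requirement.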

comment by Hyphen-ated · 2014-01-01T15:02:35.456Z · LW(p) · GW(p)

The big advantage over a monitor is immersion. When I tried out an Oculus Rift I felt like I was inside the virtual space in a way that I've never felt while playing FPSes on a monitor. That's not a small thing.

Another advantage is that it increases how many input axes you have. Think of games where you're flying a spaceship or driving a car and you can freely look in all directions while controlling your vehicle with both hands. That's impossible on a standard monitor.

Replies from: passive_fist
comment by passive_fist · 2014-01-01T19:24:07.435Z · LW(p) · GW(p)

It's not impossible. Games frequently allow you to use the arrow keys to move around while using the mouse to change the view direction (or vice versa).

Replies from: Hyphen-ated
comment by Hyphen-ated · 2014-01-02T03:04:51.610Z · LW(p) · GW(p)

I know that; I've played FPSes with that control layout for thousands of hours. I said "while controlling your vehicle with both hands" which means, for example, with a steering wheel, a throttle+joystick, or a keyboard+mouse with the mouse controlling something besides camera angle.

comment by JTHM · 2014-01-05T05:24:35.600Z · LW(p) · GW(p)

The present value of a commodity reflects the market's best estimate of the future value of that commodity. You are not smarter than the market; practically nobody is. If the market value of Bitcoin is X, then something not far from X is the best estimate of Bitcoin's near-future value. (The very best guess isn't exactly X because of liquidity costs and time preferences.)
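
A toy Python sketch of that last parenthetical, with invented numbers; the expected price and discount rate are assumptions, not market data:

    expected_future_price = 1000.0  # assumed E[price one year from now]
    annual_discount_rate = 0.05     # assumed time preference / cost of capital

    fair_price_now = expected_future_price / (1 + annual_discount_rate)
    print(f"{fair_price_now:.2f}")  # 952.38: near, but not exactly at, E[price]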

Replies from: None
comment by [deleted] · 2014-01-06T16:48:31.763Z · LW(p) · GW(p)

People have access to different sets of information (particularly in unregulated markets), come to the table with different priors, and have very different time preferences. Individual investor estimates are therefore all over the map and often time-dependent, which is why there is any trade at all (if everyone felt the same way and never changed their minds, the market would quickly stabilize and volume would drop to zero). For these reasons it's fallacious to try to aggregate and extrapolate estimates of future value from current market prices.

comment by [deleted] · 2014-01-03T15:01:44.302Z · LW(p) · GW(p)

.

Replies from: NancyLebovitz, Ander
comment by NancyLebovitz · 2014-01-07T10:57:24.108Z · LW(p) · GW(p)

What do you mean by human-level?

Replies from: None
comment by [deleted] · 2014-01-07T15:09:40.727Z · LW(p) · GW(p)

.

comment by Ander · 2014-01-08T00:37:56.207Z · LW(p) · GW(p)

A 5% chance of human-level AI this year seems extremely high to me. What are you basing that on?

Replies from: None
comment by [deleted] · 2014-01-08T00:47:27.398Z · LW(p) · GW(p)

.

comment by JQuinton · 2014-01-02T14:53:29.141Z · LW(p) · GW(p)

Marijuana is legalized/decriminalized in a New England state or another west coast state in the USA (35%).

Replies from: Douglas_Knight, James_Miller
comment by Douglas_Knight · 2014-01-04T20:23:34.104Z · LW(p) · GW(p)

If you are distinguishing between legalization and decriminalization, hasn't it already been decriminalized in all the states you mention?

comment by James_Miller · 2014-01-02T18:11:46.476Z · LW(p) · GW(p)

Will it still be illegal under federal law? States don't have the power to remove all criminal penalties for marijuana use.

comment by advancedatheist · 2014-01-01T15:07:17.880Z · LW(p) · GW(p)

1. Florida's new law on anatomical donations will prevent a cryonicist who dies in that state from going into cryo.

Reference:

http://groups.yahoo.com/neo/groups/New_Cryonet/conversations/messages/5925

2. An authority figure in the cryonics movement will finally acknowledge that Drexler's "nanotechnology" simply can never exist because it gets the physics wrong, and will recommend that cryonics organizations stop invoking the idea as the revival mechanism. Continuing to rely on "nanotechnology" propaganda leaves cryonics organizations open to accusations of knowingly practicing fraud based on pseudo- and cargo-cult science.

3. We'll see a divergence in public discussions about "the future," where one group continues to promote accelerationist claims, while another asks why "the future" the former keeps promising us hasn't arrived yet.

And I keep asking that myself. My father took me to see Stanley Kubrick's 2001: A Space Odyssey at a theater in Tulsa back in 1968 (quite an adventure for a geeky 8-year-old Okie boy!), and I can remember thinking that the year 2001 seemed like a wondrous, far-off future time. Now, living thirteen years past that date, I feel a bit cheated.

Replies from: None, None
comment by [deleted] · 2014-01-01T18:51:01.357Z · LW(p) · GW(p)

An authority figure in the cryonics movement will finally acknowledge that Drexler's "nanotechnology" simply can never exist because it gets the physics wrong.

Please provide an explanation or citation. In the meantime, here is a growing list of peer-reviewed publications of ab initio quantum simulations of Drexler-esque diamondoid mechanosynthesis:

http://www.molecularassembler.com/Nanofactory/Publications.htm

comment by [deleted] · 2014-01-02T02:05:38.031Z · LW(p) · GW(p)

An authority figure in the cryonics movement will finally acknowledge that Drexler's "nanotechnology" simply can never exist because it gets the physics wrong, and will recommend that cryonics organizations stop invoking the idea as the revival mechanism.

Hasn't Mike Darwin been doing this for years? (And for good reason?)

We'll see a divergence in public discussions about "the future," where one group continues to promote accelerationist claims, while another asks why "the future" the former keeps promising us hasn't arrived yet.

This happened in my circles years ago, with the vast majority of my circle (and me) falling into the latter camp on most of the stereotypical issues/themes.