Interview with Nassim Taleb: 'Trump makes sense to a grocery store owner'
post by Gunnar_Zarncke · 2017-02-08T21:52:21.606Z · LW · GW · Legacy · 22 comments
This is a link post for http://www.thehindu.com/books/%E2%80%98Trump-makes-sense-to-a-grocery-store-owner%E2%80%99/article17109351.ece
Comments sorted by top scores.
comment by MrMind · 2017-02-09T09:17:01.829Z · LW(p) · GW(p)
I've always regarded Taleb as a half-crackpot, so the article's claim, "After predicting the 2008 economic crisis, the Brexit vote, the U.S. presidential election and other events correctly," surprises me greatly.
Any idea of a source which would confirm this?
↑ comment by Daniel_Burfoot · 2017-02-09T23:20:52.844Z · LW(p) · GW(p)
always regarded Taleb as a half-crackpot
My guess is Taleb wouldn't be offended by this, and would in fact argue that any serious intellectual should be viewed as a half-crackpot.
Serious intellectuals get some things right and get some things wrong, but they do their thinking independently, and therefore their mistakes are uncorrelated with others'. That means their input is a valuable contribution to an ensemble. You can make a very strong aggregate prediction by calling up your half-crackpot friends, asking their opinion, and forming a weighted average.
Pseudo-intellectuals, whom Taleb calls IYIs, are just regurgitating what other people say. That means their opinions are all highly correlated. The ensemble prediction obtained by asking a lot of pseudo-intellectuals isn't much stronger than the single opinion of just one such person.
There is an ethical component to this dichotomy. A serious intellectual is risking his reputation (being perceived as a crackpot) to add aggregate strength to the collective wisdom. In other words, the serious intellectual is accepting individual fragility to make the collective anti-fragile. In contrast, the pseudo-intellectual seeks to protect himself from risk while making the collective fragile, since the collective opinion of a group of IYIs is very likely to be wrong even if (especially if!) there are many IYIs and they all agree.
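To make the correlation point concrete, here is a minimal toy sketch (my own illustration, not anything from Taleb or the comment; the forecaster model and numbers are invented): averaging many forecasters whose errors are independent beats averaging forecasters who all echo one shared source.

```python
# Toy model (assumed, illustrative only): independent forecasters with
# uncorrelated errors average out; forecasters who copy a shared source do not.
import numpy as np

rng = np.random.default_rng(0)
truth = 0.0            # the quantity everyone tries to estimate
n_forecasters = 50
n_trials = 10_000

# "Half-crackpots": each has their own independent error.
independent = truth + rng.normal(0, 1, size=(n_trials, n_forecasters))

# "IYIs": everyone mostly repeats one shared (noisy) source.
shared = rng.normal(0, 1, size=(n_trials, 1))
correlated = truth + 0.9 * shared + 0.1 * rng.normal(0, 1, size=(n_trials, n_forecasters))

for name, group in [("independent", independent), ("correlated", correlated)]:
    ensemble_error = np.abs(group.mean(axis=1) - truth).mean()
    print(f"{name:12s} mean absolute error of the ensemble: {ensemble_error:.3f}")

# Typical result: the independent ensemble's error shrinks roughly as 1/sqrt(n),
# while the correlated ensemble stays about as wrong as a single forecaster.
```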
↑ comment by BiasedBayes · 2017-02-10T20:05:10.172Z · LW(p) · GW(p)
That's a way too simplistic way to think about this. One has to stand on the shoulders of giants to be an intellectual in the first place. Also, there is this thing called scientific consensus, and there are reasons why it's usually rational to align one's opinions with the scientific consensus: not because other people think so too, but because it's usually the most balanced view of the current evidence.
Taleb's argument about the IYI is pretty ridiculous and includes stuff like not deadlifting, not cursing on Twitter, and not drinking white wine with steak, while deriving some of the attributes of the IYI from people he does not like. I get that it's partly satire, but he fails to make any sharp arguments; it's mostly sweeping generalisation, generating heuristics around the concept of the IYI that are grossly simplistic.
Come on: "The IYI has been wrong, historically, on Stalinism, Maoism, GMOs, Iraq, Libya, Syria, lobotomies, urban planning, low carbohydrate diets, gym machines, behaviorism, transfats, freudianism, portfolio theory, linear regression, Gaussianism, Salafism, dynamic stochastic equilibrium modeling, housing projects, selfish gene, election forecasting models, Bernie Madoff (pre-blowup) and p-values. But he is convinced that his current position is right."
OK.
↑ comment by hairyfigment · 2017-02-09T11:33:55.827Z · LW(p) · GW(p)
Not to put too fine a point on it, but I bet all of my money he didn't predict Trump would win the Electoral College while losing the popular vote by millions. That article gives no hint he even knows it happened. Meanwhile, FiveThirtyEight supposedly based most of the probability mass they assigned to Trump winning on a technical EC victory, and they said he had one chance in four (one in three earlier).
The fact that a 25% chance can happen with potentially devastating consequences is roughly why Trump the outsider might blow up America, though I'd give that no more than a 13% chance.
↑ comment by Douglas_Knight · 2017-02-12T20:29:01.989Z · LW(p) · GW(p)
538 put Trump winning the popular vote at 20%. They put Trump winning the EC while losing the popular vote at 10%.
↑ comment by hairyfigment · 2017-02-13T08:41:27.154Z · LW(p) · GW(p)
OK, they gave him a greater chance than I thought of winning the popular vote. I can't tell if that applies to the polls-plus model which they actually seemed to believe, but that's not the point. The point is, they had a model with a lot of uncertainty based on recognizing the world is complicated, they explicitly assigned a disturbing probability to the actual outcome, and they praised Trump's state/Electoral College strategy for that reason.
↑ comment by Douglas_Knight · 2017-02-13T17:23:09.550Z · LW(p) · GW(p)
538's models are predictions of how the polls will change with time, so the closer to the election, the less difference there is between them, negligible in the final prediction. But if you want to see, look here. It shows predictions on the right. Near the bottom is "Trump wins popular vote." On the left, you can choose between the three models and see that it doesn't make a difference.
↑ comment by Viliam · 2017-02-09T09:40:34.501Z · LW(p) · GW(p)
Preferably also confirming that he wasn't making a hundred other predictions that didn't happen.
↑ comment by maxjmartin · 2017-02-09T11:45:41.163Z · LW(p) · GW(p)
Taleb's strategy is explicitly to make many bets with high payoff and low downside, expecting most of them not to pay out.
For example, in http://www.newsmax.com/finance/StreetTalk/Nassim-Taleb-Shorting-Treasuries/2010/02/04/id/348993/ he was wrong, but as he puts it:
“You have a very small probability of making money,” he said. “But if you’re right, you’ll never see a public plane again.”
Another quote:
"I did 700,000 trades in career, was "wrong" on between 650,000 and 695,000."
We can assume that he is either lucky or well calibrated, since he has made quite a bit of money over the years betting on events with very long odds.
↑ comment by Lumifer · 2017-02-09T15:42:41.457Z · LW(p) · GW(p)
since he has made quite a bit of money over the years betting on events with very long odds
Citation needed.
I believe he tried to run a hedge fund for a while, basically buying volatility. He failed and, as far as I know, closed the fund down.
↑ comment by maxjmartin · 2017-02-09T16:08:33.712Z · LW(p) · GW(p)
He might have been lucky rather than having a good strategy, yes. Hard to tell.
↑ comment by maxjmartin · 2017-02-09T11:37:31.715Z · LW(p) · GW(p)
Source for the 2008 crisis prediction would be his book The Black Swan (I remember checking the publication date when reading The Black Swan, since it did seem very prescient):
Globalization creates interlocking fragility, while reducing volatility and giving the appearance of stability. In other words it creates devastating Black Swans. We have never lived before under the threat of a global collapse. Financial Institutions have been merging into a smaller number of very large banks. Almost all banks are interrelated. So the financial ecology is swelling into gigantic, incestuous, bureaucratic banks – when one fails, they all fall. The increased concentration among banks seems to have the effect of making financial crises less likely, but when they happen they are more global in scale and hit us very hard. We have moved from a diversified ecology of small banks, with varied lending policies, to a more homogeneous framework of firms that all resemble one another. True, we now have fewer failures, but when they occur ... I shiver at the thought.
Banks hire dull people and train them to be even more dull. If they look conservative, it's only because their loans go bust on rare, very rare occasions. But (...) bankers are not conservative at all. They are just phenomenally skilled at self-deception by burying the possibility of a large, devastating loss under the rug.
The government-sponsored institution Fannie Mae, when I look at its risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup. But not to worry: their large staff of scientists deemed these events "unlikely".
There is no way to gauge the effectiveness of their lending activity by observing it over a day, a week, a month, or . . . even a century! (...) the real-estate collapse of the early 1990s in which the now defunct savings and loan industry required a taxpayer-funded bailout of more than half a trillion dollars. The Federal Reserve bank protected them at our expense: when "conservative" bankers make profits, they get the benefits; when they are hurt, we pay the costs.
Once again, recall the story of banks hiding explosive risks in their portfolios. It is not a good idea to trust corporations with matters such as rare events because the performance of these executives is not observable on a short-term basis, and they will game the system by showing good performance so they can get their yearly bonus. The Achilles’ heel of capitalism is that if you make corporations compete, it is sometimes the one that is most exposed to the negative Black Swan that will appear to be the most fit for survival.
source: http://www.fooledbyrandomness.com/imbeciles.htm
↑ comment by gjm · 2017-02-09T11:53:56.093Z · LW(p) · GW(p)
The saying goes that economists have successfully predicted nine of the last five recessions.
In that passage, Taleb correctly identifies some systemic problems that led to the 2008 crash. That's not quite the same thing as predicting the crash. (If I say "San Francisco is built on a big fault, and it hasn't had a really big earthquake for some time", do I get to claim I predicted the Next Big One, if and when it happens?)
So (1) the impressiveness of that passage depends on how unusual it was to recognize those problems. I don't know the answer to that; perhaps scarcely anyone else did, or perhaps most of the people with any information did but most of them found it expedient not to say so, or perhaps actually lots of people said so and we didn't listen.
And (2) whether or not we take this as a prediction, and taking him at his word that he predicted Brexit and Trump and whatever else he said he predicted: what else has he predicted, and how did the other predictions turn out?
For instance: perhaps all his recent predictions have been of the form "populism is in the ascendant". If so, then that would lead to correct Brexit and Trump predictions, and maybe some others, since populism is indeed doing well. If he noticed that before everyone else, that's impressive. But it doesn't necessarily indicate that he'll predict well outside that domain.
Or perhaps he's predicted hundreds of things and most of them were wrong -- but he doesn't mention those. If so, saying "I predicted Brexit and Trump" would be entirely misleading and he deserves no credit for anything.
Or perhaps he's predicted hundreds of surprising things and got lots of them right, in which case we should indeed take seriously the idea that there's some fundamental thing he's doing right and others are doing wrong.
Anyone got more information about his actual predictions and how they've turned out?
↑ comment by ChristianKl · 2017-02-10T08:51:59.201Z · LW(p) · GW(p)
And (2) whether or not we take this as a prediction, and taking him at his word that he predicted Brexit and Trump and whatever else he said he predicted: what else has he predicted, and how did the other predictions turn out?
In the linked interview, it's not Taleb who says he predicted those things but the journalist.
↑ comment by gjm · 2017-02-10T14:05:11.524Z · LW(p) · GW(p)
Oh, you're right. My mistake; sorry about that.
(Though ... is it possible that the journalist wrote that because Taleb said "Make sure you mention that I predicted these things"?)
↑ comment by ChristianKl · 2017-02-10T14:23:50.547Z · LW(p) · GW(p)
(Though ... is it possible that the journalist wrote that because Taleb said "Make sure you mention that I predicted these things"?)
It's possible, but I don't think it's likely, as it would be a low-status move in the interaction. You don't tell a journalist what he's supposed to write; rather, you provide him with the material he needs to write a story.
It's in the journalist's interest to present Taleb as a genius because that makes the interview more important. It makes it more likely that the article gets shared and gets pageviews. The fact that he secured an interview with a genius who often rejects interview requests from the media is also important for the relationship the journalist has with his editor.
↑ comment by gjm · 2017-02-10T15:43:22.755Z · LW(p) · GW(p)
Yeah, he probably wouldn't actually have said "Make sure you mention ...". But he might well have taken care to ensure that the journalist knows that he predicted those things.
(Where "knows" might be the wrong word, to whatever extent he didn't actually predict them.)
↑ comment by ChristianKl · 2017-02-10T20:20:07.056Z · LW(p) · GW(p)
Let's imagine the journalist asked him: "Is it true that you predicted the financial crash of 2008?"
Taleb might have answered: "In my book, I wrote that the financial system is likely to blow up. Various journalists described this as me predicting the financial crisis. I don't think that it's possible to predict the exact year when a system crashes."
Journalists are in the business of simplifying reality for their readers. It's quite likely that what Taleb told the journalist, in this case, is completely true, but the journalist then simplified the complexity of the statement into something that's optimized to get the highest number of page views.
↑ comment by maxjmartin · 2017-02-09T15:24:41.344Z · LW(p) · GW(p)
Oh, I agree. He himself claims that many analysts were saying the same things, and that you would need to be an imbecile not to notice what was going on at the time. My feeling is that he himself does not feel that he made an impressive prediction, and was actually just pointing out that it would inevitably go wrong at some point. In fact, he had been talking about the issue since long before 2008; it just happens that the publication of his book came at just the right time to make it look like he predicted it just before it happened. (I may be wrong, and am not very familiar with his recent predictions around Trump and the arguments with Nate Silver on that.)
Mainstream media does not wish to explain concepts like calibration, and instead runs with articles along the lines of "Taleb predicts crash! Is God/Superintelligence!", despite his books repeatedly hammering home the point that he aims to make bets with a tiny chance of success and low downside, but a huge upside.
He aims to bet £1 on horses the bookies have at 10,000:1 that he thinks actually have 2,000:1 odds.
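For what it's worth, here is a quick sketch of the arithmetic behind such a bet (the 10,000:1 and 2,000:1 figures are the illustrative ones from this comment, not real trade data): each individual bet almost always loses, yet the expected value is positive.

```python
# Toy expected-value calculation for a long-odds bet (illustrative numbers only).
bookie_odds = 10_000   # payout of 10,000:1 on the stake
true_odds = 2_000      # bettor's belief: the horse really wins 1 time in 2,000
stake = 1.0            # £1 per bet

p_win = 1 / true_odds
expected_value = p_win * (bookie_odds * stake) - (1 - p_win) * stake

print(f"Win probability: {p_win:.4%}")                        # 0.0500%
print(f"Expected value per £1 bet: £{expected_value:.2f}")    # about £4.00
```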
I do not know if he has previously made predictions with attached probabilities that we could track; he would probably argue that his investments play this role.
( more details in my other comment http://lesswrong.com/lw/olj/interview_with_nassim_taleb_trump_makes_sense_to/dmr0 )
↑ comment by gjm · 2017-02-09T20:37:15.781Z · LW(p) · GW(p)
I don't have the impression that Taleb goes to any lengths to discourage people from thinking that he made more impressive predictions than he actually did :-).
↑ comment by ChristianKl · 2017-02-10T09:07:27.411Z · LW(p) · GW(p)
He doesn't do so explicitly but he does so implicitly by saying that certain predictions can't be made reliably.