Twitter account: http://twitter.com/plntrationalist
This is "inattentional blindness". Choice blindness is roughly the opposite: in inattentional blindness you fail to notice something you aren't paying attention to; in choice blindness you fail to notice a change in something you are paying attention to.
The video shows the mechanics of how it works pretty well.
What's the name of the principle that variance increases further from 50%?
75% choose program A
I, for one, really like it.
Glad you like it. There are zillions more where that came from.
What if I want to write first, then sell it? Something achievable might be like what Skeptic's Dictionary or You Are Not So Smart did: they started out as websites that slowly filled out and were ultimately published as books.
(Why isn't there a Singularity Institute Press?)
Vastly, vastly more likely.
Every once in a while someone sends me a link to a Wikipedia article saying I would find it interesting... and as a matter of fact, I do find it especially interesting: I wrote it!
Or: I added a quote to Daniel Kahneman's page that has since appeared in almost every bio of Kahneman I've seen. For example, David Brooks wrote a column on Kahneman a few months ago and used the exact same quote I added, so that's millions of readers reached indirectly.
Boggles the mind, really.
Criticism is totally fair. I was getting frustrated with it, so I decided to get something done quickly that I could replace later. So, there are flaws.
It's supposed to stop cycling if you mouseover it.
yup, that's mine too
I wrote all of them
Heuristics in Heuristics and Biases are only descriptive. [...] Heuristics in Heuristics and biases are defined as having negative side effects.
If your claim is that heuristics are defined by H&B theorists as being explicitly not prescriptive, in the sense of never being "good" or "useful," this is simply not the case.
No, no, that's not what I'm saying. The claim that heuristics have negative side effects does not entail a claim that negative side effects are the only characteristics they have. The 'side effect' terminology might be taken to imply that there is a main effect which is not necessarily negative.
They have always claimed that heuristics are right most of the time, but they wouldn't recommend you purposefully try to "use" them. They only propose heuristics that could theoretically explain empirically observed biases. F&F heuristics do not need to explain biases; an F&F heuristic might only explain when you get something right that you otherwise shouldn't. I'm not even sure an F&F heuristic needs to explain anything empirically observed; it could instead be a decision strategy that they modelled as effective and that everyone should learn (which is what I clumsily meant by 'prescriptive'). And they have published ways to teach the use of some of their heuristics.
Representativeness, one of the earliest examples of a heuristic given by the H&B program, is certainly used in a conscious and deliberate way. When asked, subjects routinely report relying on representativeness to make frequency or probability judgments, and they generally see nothing wrong or even really remarkable about this fact.
I don't recall introspective interviews with subjects taking place in H&B research, though apparently I'm wrong about that. What I had in mind when I wrote that was that I seem to recall K&T and Gigerenzer sparring over the validity of doing so.
Except... now that I think of it, I seem to recall something like that in the really early K&T papers. As I understood it (and my understanding may be obsolete), introspection could help generate empirical theories but could not be used to validate them, whereas I seem to recall Gigerenzer arguing that it could provide validity. Maybe the camps have converged on that, or my memory continues to be faulty.
[irrelevant digression: representativeness was the absolute earliest, and by a large margin if you count "the law of small numbers" as the germ of representativeness. But if you count the law of small numbers as a separate heuristic, then it was the first.]
Nick Epley's work also strongly suggests that people very deliberately rely on anchoring-and-adjustment strategies when making some common judgments (e.g., "When was George Washington elected president?" "Hmm, well it was obviously some time shortly after the Declaration of Independence, which was in 1776... so maybe 1786?").
It implies that anchoring-and-adjustment is consciously available as a strategy at least some of the time.
When it theoretically appears in the anchoring bias ("Are there more or fewer than 60 nations in the UN from Africa?") it's virtually impossible to debias, suggesting it's outside of conscious control in that case.
So it does force the concession that it's not always true, though.
Fast and Frugal heuristics, however, you can learn and use intentionally.
One can certainly learn to use any heuristic strategy, but for some heuristics proposed by the F&F camp, such as the so-called fluency heuristic (Hertwig et al., 2008), it is not at all obvious that in practice they are utilized in any intentional way, or even that subjects are aware of using them. ...
Wasn't aware of that one; I haven't kept up with the literature since 2005 or so. If there are some F&F heuristics that are outside of conscious awareness and some H&B heuristics that are within awareness, then conscious awareness is eliminated as a possible distinction.
There are some F&F heuristics that they argue we should use more than we already would. I'm not sure if there are any H&B heuristics for which that would be true.
Descriptive F&F heuristics aren't evolutionary quirks.
I'm not sure what you mean here. If an "evolutionary quirk" is a locally optimal solution that falls short of a global maximum...
I mean like a dead-end local maximum that we could be "stuck" in but that doesn't hurt us much. We would have better vision if we didn't all have a little blind spot. There's no reason for it to be there; invertebrates with highly developed eyes don't have it. But we're stuck with it, since it goes back to the way the first vertebrate eyes developed. I don't think an H&B theorist would object to the idea of evolutionary "mistakes" as an explanation, whereas I think an F&F theorist very well might. Maybe that's not a very good distinction.
I do not think that F&F theorists regard their heuristics as globally optimal; something globally optimal would no longer be a heuristic of any stripe.
[edit: I think I see where I was going wrong here. H&B theorists study biases that are not necessarily theoretically caused by heuristics. For instance, prospect theory isn't a heuristic. Or, framing isn't caused by any heuristic that I can think of. But it's orthogonal to their definition of what a heuristic is.]
...besides the obvious that Fast and Frugal heuristics are "good" while heuristics as in Heuristics and biases are "bad".
This impression is entirely due to differences in the framing and emphasis employed by the two camps. It does not represent anything like a fundamental distinction between how they each view the nature or role of heuristics in judgment and decision making.
I meant those as scare quotes, meaning I don't necessarily endorse them. I agree that framing and emphasis are a very large part of the difference between the camps. I'm not 100% convinced they are the entire difference.
I think there may still be the issue that a heuristic in F&F can be something they modelled which is not empirically used, or at least not used as much as would be optimal, but which it would be good if we could be taught to use. I don't think an H&B heuristic would ever have that set of characteristics, though perhaps you could convince me otherwise.
Fast and Frugal heuristics can be descriptive (meaning human beings naturally use them at some level) or prescriptive (here are some good heuristics you can learn to use). Heuristics in Heuristics and Biases are only descriptive.
The Heuristics and Biases theorists would never suggest someone should try to "use" one of their heuristics, nor probably could you even if you tried. You could not intentionally reproduce the pattern of cognitive biases that their heuristics allegedly cause; many appear to be irretrievably outside of conscious awareness or control. For that matter, they often appear to be nearly impossible to stop using even if you wanted to.
Fast and Frugal heuristics, however, you can learn and use intentionally. The Fast and Frugal theorists generally don't suggest that it would be difficult to stop using their heuristics should you be aware of them and have the desire to. Descriptive heuristics may even be discoverable via introspection.
Heuristics in Heuristics and Biases are defined as having negative side effects. There are no heuristics in H&B that aren't revealed via errors. Heuristics in H&B are presumed to be either required by some necessary efficiency or an evolutionary quirk like the blind spot in your eye. Fast and Frugal heuristics do not require negative side effects and are usually not described with any. Descriptive F&F heuristics aren't evolutionary quirks. Heuristics in F&F are defined as a helpful efficiency gain.
So they are mutually exclusive in some properties, besides the obvious that Fast and Frugal heuristics are "good" while heuristics in Heuristics and Biases are "bad".
That's a valid opinion. There is only a subtle difference, really, so maybe it's not the best example.
Check out Paul Ekman's books
I've re-instated Twitter so far. The issues are: general visual clutter (I found a way to mitigate this with a trick that lowers the visual contrast of the buttons), and that these social buttons often really slow down page loading, especially if you want the dynamic share/like/retweet counters for every item. I might leave the counter on Twitter but omit it for the others and see what the page load is like.
I'm not sure what email-sharing service to use... Facebook has one in its "share" button, and there are probably others.
I've thought of ways of working around this. There are ways of actually defeating the truncation. One issue is that there isn't necessarily an obvious programmatic way of telling which feeds are truncated and which aren't.
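There may be no reliable programmatic test, but a few heuristics can flag likely-truncated feed entries. Here's a minimal sketch (not the aggregator's actual code): the marker strings, the 50% length threshold, and the function name are all illustrative assumptions.

```python
# Heuristic guess at whether a feed entry's summary has been cut off.
# Marker list and length threshold are illustrative, not definitive.
TRUNCATION_MARKERS = ("...", "\u2026", "[...]", "Read more", "Continue reading")

def looks_truncated(summary, full_content=None):
    """Return True if the summary looks like a truncated excerpt.

    Checks for common trailing truncation markers, and, when the full
    article text is available for comparison, whether the summary is
    drastically shorter than the article.
    """
    text = summary.strip()
    # Trailing ellipses and "read more" tails are the usual giveaways.
    if any(text.endswith(marker) for marker in TRUNCATION_MARKERS):
        return True
    # If we fetched the article itself, compare lengths.
    if full_content is not None and len(text) < 0.5 * len(full_content.strip()):
        return True
    return False
```

A feed whose entries consistently trip this check could then be routed through a full-text proxy like the one linked below; entries that pass could be left alone.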
For now, try out this feed proxy: http://andrewtrusty.appspot.com/readability/ , e.g. http://andrewtrusty.appspot.com/readability/feed?url=http%3A//feeds.feedburner.com/planetrationalist
Hmm, I was wondering how much people used those things. Do you want just twitter + email? Facebook?
To be completely honest, I wasn't going on a strict definition of the term rationalist; frankly I consider the term kind of flawed anyway. But I don't have a better replacement in mind. For me it means being interested in being rational, being interested in how the mind works, being interested in cognitive biases, Bayes' rule, probability, statistics, logical fallacies, and scientific self-improvement.
- I selected the sources starting with Less Wrong and Overcoming Bias, then took suggestions from people, did some rudimentary graph analysis, manually added blogs of authors in related fields, and watched what the selected sources themselves linked to.
- I tried to include sources that were readable but not gimmicky (e.g. "top 7 secret tips to supercharge your goals!!!"). Sometimes sources stray outside this range, and I don't have any filtering sophisticated enough to handle that.
- I selected against sources that posted too frequently, anything political, and anything that seemed angry or upbraiding or read like a manifesto. I included some sources containing such posts when I was able to filter out the political etc. posts easily. The rudimentary methods I used to filter topics don't work perfectly, though.
- I tried to include a few sources from less closely related subjects that were high quality and don't seem to post that frequently. For instance, I included only a couple skeptic blogs, but there are tons and tons of them out there and I feel that it's a different niche that's already addressed pretty well elsewhere. Some fields I avoided almost entirely like entrepreneurship or economics.
- I tried to not let any one subject dominate the set of sources. I feel like I included too many psychology blogs, for example.
Options for now:
- create a greasemonkey script to hide posts from the sources you don't want. Every source has a unique CSS class so it should be trivial.
- create a yahoo pipe to filter the sources you don't want through the rss feed and read it through a feed reader
- clone the set of sources using the OPML feed in your feed reader of choice and add/remove whatever you want from the source list. However, this will not be kept in sync in the likely event that the official set of sources changes.
Out of curiosity, what don't you want to see and why?
Meaning you want to turn some sources off?
Yeah I couldn't think of one.
Favicon contest?
May not have been just you, I suspect my ISP was having problems earlier.
I agree that some kind of filtering (human or machine) could provide additional value, but at this stage I want to see how well the most rudimentary version of the idea works for people before investing further.
Thanks a lot!
I will limit aggregation to the Critical Thinking category as you suggest.
Thanks, added.
- http://www.acceleratingfuture.com/michael/blog
- http://www.aleph.se/andart/
- http://www.badscience.net
- http://www.bayesianinvestor.com/blog
- http://bayesianstats.com
- http://becomeunrestricted.com
- http://brainethics.org
- http://chronopause.com
- http://cogsciblog.wordpress.com
- http://commonsenseatheism.com
- http://danariely.com
- http://www.randomhouse.com/kvpa/gilbert/blog/
- http://www.georgesaines.com
- http://www.decisionsciencenews.com
- http://realdoctorstu.com
- http://dresdencodak.com
- http://evolvingthoughts.net
- http://experimentalphilosophy.typepad.com/experimental_philosophy/
- http://www.fhi.ox.ac.uk/home
- http://www.fallacyfiles.org/
- http://www.gnxp.com/wp
- http://www.greatplay.net
- http://hplusmagazine.com
- http://judgmentmisguided.blogspot.com/
- http://www.kurzweilai.net
- http://lesswrong.com/
- http://litemind.com
- http://measureofdoubt.com
- http://metamodern.com
- http://meteuphoric.blogspot.com/
- http://mindhacks.com
- http://blog.okcupid.com
- http://www.oliverburkeman.com
- http://www.overcomingbias.com
- http://www.pointofinquiry.org
- http://www.spring.org.uk
- http://psysociety.wordpress.com
- http://psych-your-mind.blogspot.com/
- http://www.rationaloptimist.com/blog
- http://rationallyspeaking.blogspot.com/
- http://reducing-suffering.blogspot.com/
- http://regardingwork.com/
- http://sciencethatmatters.com
- http://www.sentientdevelopments.com/
- http://www.setsights.co.uk
- http://www.scottaaronson.com/blog
- http://www.simoleonsense.com
- http://www.skepticblog.org
- http://www.spencergreenberg.com
- http://andrewgelman.com
- http://suegardner.org
- http://www.bulletproofexec.com
- http://blog.givewell.org
- http://singinst.org/blog
- http://www.singularitysummit.com/blog
- http://thesituationist.wordpress.com
- http://thethinkerblog.com
- http://www.michaelshermer.com
- http://tar.weatherson.org
- http://whywereason.wordpress.com
- http://youarenotsosmart.com
- http://dirtsimple.org/index.html
- http://www.eharmony.com/labs
- http://www.ribbonfarm.com
- http://jsomers.net/blog
Thanks there are some good ones there.
If it's not strictly related but likely of interest to the same people who are interested in rationality (e.g. credible self-improvement), then it's better if posts aren't too frequent. For instance, there's a lot of good stuff on Hacker News, but there are 100+ front-page posts per day and it would drown everything else out.
If the aggregator proves popular I could introduce a text-classification filter to try to only include relevant posts from sources with varied content, but I'm only willing to invest time in that if it turns out that people are responsive to the aggregator in its current simpler form.
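A filter like that could start very simply, e.g. a multinomial naive Bayes classifier trained on hand-labeled relevant/irrelevant posts. The sketch below is only an illustration of that general approach, not anything the aggregator actually runs; the class name, labels, and training examples are all made up.

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    # Crude word tokenizer: lowercase alphabetic runs (plus apostrophes).
    return re.findall(r"[a-z']+", text.lower())

class NaiveBayes:
    """Minimal multinomial naive Bayes for relevant/irrelevant post filtering."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word -> count
        self.class_counts = Counter()            # label -> number of docs
        self.vocab = set()

    def train(self, text, label):
        self.class_counts[label] += 1
        for word in tokenize(text):
            self.word_counts[label][word] += 1
            self.vocab.add(word)

    def classify(self, text):
        total_docs = sum(self.class_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.class_counts:
            # Log prior plus log likelihood with Laplace (add-one) smoothing.
            score = math.log(self.class_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for word in tokenize(text):
                count = self.word_counts[label][word]
                score += math.log((count + 1) / (total_words + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label
```

Usage would be something like training on a few dozen labeled posts per source and then dropping anything classified as irrelevant; sources with uniformly on-topic content could bypass the filter entirely.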
Yes, I'll do that. I've been looking for places to announce it/request feedback.
I'm working on a rationality blog aggregator, and should be ready to make it public in the next few days. Would you like to know when it is released?
I can sort of see how a woman might find such a thing just a tad creepy.
In many cases perhaps the appropriate action would be to raise this woman's consciousness: men's sexuality isn't necessarily scary or threatening.
I don't necessarily agree with Nussbaum, I just thought it was interesting and related.
There is ample stuff that's perhaps more empirical.
Is it out of bounds to consider plain and simple prejudice as the trigger?
Disgust reactions are frequently based on prejudices that should be challenged and rebutted. People frequently describe male sexuality in strikingly similar ways to how prejudiced people describe (typically male) homosexuality: it's disgusting, it's ridiculous, it's wrong in some indescribable way, it's threatening and dangerous in some abstract, unfalsifiable sense. Except it's not taboo to talk about male heterosexuality that way. Men are pigs, after all, and that they want to have sex is ridiculous and wrong ipso facto. We should question and challenge these impulses rather than try to rationalize them. Maybe the validity of this kind of reaction shouldn't be automatically assumed. Maybe the icky wrongness is hard to articulate because you're trying to implausibly rationalize a slippery gut reaction, not because you're describing an elusive actual moral principle.
Here's an interesting interview with Martha Nussbaum on related topics: http://www.reason.com/news/show/33316.html