Human factors research seems very relevant to rationality
post by casebash · 2015-06-21T12:55:35.479Z · LW · GW · Legacy · 5 comments
"The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress. So when Martin Bromiley read the Harmer report, an incomprehensible event suddenly made sense to him. “I thought, this is classic human factors stuff. Fixation error, time perception, hierarchy.”
experienced professionals are prone..."
Comments sorted by top scores.
comment by Shmi (shminux) · 2015-06-21T17:13:21.803Z · LW(p) · GW(p)
The links missing from the OP, for the lazy: Flight 173, the article being quoted.
comment by jacob_cannell · 2015-06-21T17:15:53.058Z · LW(p) · GW(p)
When a plane reports a problem, I assume that there is a bunch of data being sent out to FAA controllers, and that basically the plane suddenly shows up as a big problem in the control room.
So basically I am wondering why it is the crew's responsibility to fix any problem that shows up. It would seem that these kinds of fast-diagnosis problems are best solved by a control room with a bunch of engineers, AI software, etc. The crew's role should be minimal (I'm of course assuming good network connectivity, but that seems reasonable at this point).
comment by SilentCal · 2015-06-22T22:03:32.558Z · LW(p) · GW(p)
I don't recall fixation error in the sequences, but something like it shows up in HPMOR, occurring in http://hpmor.com/chapter/67 and discussed in http://hpmor.com/chapter/68:
"But you, Miss Granger, had the misfortune to remember how to cast the Stunning Hex, and so you did not search your excellent memory for a dozen easier spells that might have proved efficacious."
comment by XFrequentist · 2015-06-26T19:20:53.678Z · LW(p) · GW(p)
I've delved into this literature a bit while researching a (currently shelved) paper on automation-associated error, and I agree with the title of this post!
comment by buybuydandavis · 2015-06-22T04:19:45.469Z · LW(p) · GW(p)
This is called “fixation error”. In a crisis, the brain’s perceptual field narrows and shortens. We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else.
Is there a less time critical version of that?
I've always felt rather obsessed with whatever I considered the top problem on the list.