The intellect, as a means for the preservation of the individual, unfolds its chief powers in simulation; for this is the means by which the weaker, less robust individuals preserve themselves, since they are denied the chance of waging the struggle for existence with horns or the fangs of beasts of prey. In man this art of simulation reaches its peak: here deception, flattering, lying and cheating, talking behind the back, posing, living in borrowed splendor, being masked, the disguise of convention, acting a role before others and before oneself—in short, the constant fluttering around the single flame of vanity is so much the rule and the law that almost nothing is more incomprehensible than how an honest and pure urge for truth could make its appearance among men. They are deeply immersed in illusions and dream images; their eye glides only over the surface of things and sees "forms"; their feeling nowhere leads into truth, but contents itself with the reception of stimuli, playing, as it were, a game of blindman's buff on the backs of things.
Nietzsche, On Truth and Lie in an Extra-Moral Sense
MixedNuts's comment reminded me of a good resource for such techniques, and, indeed, for generally improving one's effectiveness at reading: How To Read A Book
It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.
-- Mark Twain
Clearly Dennett has his sources all mixed up.
- Solaris by Stanislaw Lem is probably one of my all time favourites.
- Anathem by Neal Stephenson is very good.
Voted up mainly for the Greg Egan recommendations.
But the problem is worse than that because "Sometimes, crows caw" actually does allow you to make predictions in the way "electricity!" does not.
The problem is even worse than that, because "Sometimes, crows caw" predicts both the hearing of a caw and the non-hearing of a caw. So it does not explain either (at least, based on the default model of scientific explanation).
If we go with "Crows always caw and only crows caw" (along with your extra premises regarding lungs, sound and ears etc), then we might end up with a different model of explanation, one which takes explanation to be showing that what happened had to happen.
The overall problem you seem to have is that neither of these kinds of explanation gives a causal story for the event (which is a third model for scientific explanations).
(I wrote an essay on these models of scientific explanation earlier in the year for a philosophy of science course which I could potentially edit and post if there's interest.)
Some good, early papers on explanation (i.e., ones which set the future debate going) are:
The Value of Laws: Explanation and Prediction (by Rudolf Carnap), Two Basic Types of Scientific Explanation, The Thesis of Structural Identity and Inductive-Statistical Explanation (all by Carl Hempel).
Huh, I thought there was a fair bit of evidence around showing that people perform basically just as badly on tests which exploit cognitive biases after being told about them as they do in a state of ignorance.
I found Drive Yourself Sane useful for similar reasons.
I've been meaning to take a stab at Korzybski's Science and Sanity (available on the interwebs, I believe) for a while, but I've heard it's fairly impenetrable.
It's a wonderful thing to be clever, and you should never think otherwise, and you should never stop being that way. But what you learn, as you get older, is that there are a few million other people in the world all trying to be clever at the same time, and whatever you do with your life will certainly be lost - swallowed up in the ocean - unless you are doing it with like-minded people who will remember your contributions and carry them forward. That is why the world is divided into tribes.
-- Neal Stephenson, The Diamond Age
I neglected to record from which character the quote came.
Rationality is highly correlated with intelligence
According to research by K.E. Stanovich, this is not the case:
Intelligence tests measure important things, but they do not assess the extent of rational thought. This might not be such a grave omission if intelligence were a strong predictor of rational thinking. But my research group found just the opposite: it is a mild predictor at best, and some rational thinking skills are totally dissociated from intelligence.
The classic example of riding a bicycle comes to mind. No amount of propositional knowledge will allow you to use a bike successfully on the first go. Theory about gyroscopic effects of wheels and so forth all comes to nothing until you hop on and try (and fail, repeatedly) to ride the damn thing.
Conversely, most people never explicitly realise the propositional knowledge that in order to steer the bike left, you must turn the handlebars right (at least initially and at high speeds). But they do it unconsciously nonetheless.
But once procedural knowledge is acquired, it also incorporates things like body memory and pure automatic habit, which, when observed in oneself, are just as likely to be rationalized after the fact as they are to be antecedently planned for sound reasons. It's also easy to forget the initial propositions about a mastered procedure.
I've also noticed this kind of thing in my martial arts training.
For instance, often times high level black belts will be incredibly successful at a particular technique but unable to explain the procedure they use (or at least, they'll be able to explain the basic procedure but not the specific detail that makes the difference). These details are often things the practitioner has learned unconsciously, and so are not propositional knowledge for them at all. Or they may be propositions taught long ago but forgotten (except in muscle memory).
The difference between a great practitioner and a great teacher is usually the ability to spot the difference that makes a difference.
This tendency can be used for good, though. As long as you're aware of the weakness, why not take advantage of it? Intentional self-priming, anchoring, and rituals of all kinds can be repurposed.
Most of these bad Philosophers were encountered during the few classes I took to get a Philosophy minor.
Initially I thought you were talking about professional Philosophers, not students. This clears that up, but it would be better to refer to them as Philosophy students. Most people wouldn't call Science undergrads "Scientists".
My experience with Philosophy has been the opposite. Almost all the original writing we've read has been focused on how and why the original authors were wrong, and how modern theories address their errors. Admittedly, I've tailored my study to contain more History and Philosophy of Science than is usual, but I've found the same to be true of the standard Philosophy classes I've taken.
In summary, it probably varies from school to school and I don't think it's entirely fair to tar the whole field of Philosophy with the same brush.
I would guess that it's because comments are shorter and tend to express a single idea. Posts tend to have a series of ideas, which means a voter is less likely to think all of them are good/worthy of an upvote.
Thirded. I completed half of my degree in CS before switching to Philosophy. I'm finding it significantly more stimulating. I don't think I learned anything in my CS classes that I couldn't easily have taught myself (and had more fun doing so).
According to this post, doing so would be "against blog guidelines". The suggested approach is to do top-level book review posts. I haven't seen any of these yet, though.
That sorted it, thanks.
Having recently received a couple of Amazon gift certificates, I'm looking for recommendations of 'rationalist' books to buy. (It's a little difficult to separate the wheat from the chaff.)
I'm looking mainly for non-fiction that would be helpful on the road to rationality. Anything from general introductory type texts to more technical or math oriented stuff. I found this OB thread which has some recommendations, but I thought that:
- this could be a useful thread for beginners (and others) here
- the ability to vote on suggestions would provide extra information
So, if you have a book to recommend, please leave a comment. If you have more than one to recommend, make them separate comments so that each can be voted up/down individually.
Nothing terrible will happen to Wednesday if she deconverts
The terrible thing has already happened at this stage. Telling your children that lies are true (i.e., that Mormonism is true), when they have no better way of discerning the truth than simply believing what you say, is abusive and anti-moralistic. It is fundamentally destructive of a person's ability to cope with reality.
I have never heard a story of deconversion that was painless. Everyone I know who has deconverted from a religious upbringing has undergone large amounts of internal (and often external) anguish. Even after deconverting most have not been capable of severing ties to the destructive people who doomed them to this pain in the first place.
What do you do with the answer, though? I have a fair idea of why most of my procrastination occurs (if I leave something until the last minute and make a hash of it, I have a convenient excuse to protect my ego), but that has never seemed to help me actually overcome it.
I've always enjoyed Lewis Carroll's talk of maps:
"That's another thing we've learned from your Nation," said Mein Herr, "map-making. But we've carried it much further than you. What do you consider the largest map that would be really useful?"
"About six inches to the mile."
"Only six inches!" exclaimed Mein Herr. "We very soon got to six yards to the mile. Then we tried a hundred yards to the mile. And then came the grandest idea of all! We actually made a map of the country, on the scale of a mile to the mile!"
"Have you used it much?" I enquired.
"It has never been spread out, yet," said Mein Herr: "the farmers objected: they said it would cover the whole country, and shut out the sunlight! So we now use the country itself, as its own map, and I assure you it does nearly as well.
From Sylvie and Bruno Concluded by Lewis Carroll, first published in 1893.
I'm not confident I could do a good job of it. He proposes that most problems in relationships come from our mythologies about ourselves and others. In order to have good relationships, we have to be able to be honest about what's actually going on underneath those mythologies. Obviously this involves work on ourselves, and we should help our partner to do the same (not by trying to change them, but by assisting them in discovering what is actually going on for them). He calls his approach to this kind of communication the "Real-Time Relationship."
To quote from the book: "The Real-Time Relationship (RTR) is based on two core principles, designed to liberate both you and others in your communication with each other:
- Thoughts precede emotions.
- Honesty requires that we communicate our thoughts and feelings, not our conclusions."
For a shorter read on relationships, you might like to try his "On Truth: The Tyranny of Illusion". Be forewarned that, even if you disagree, you may find either book an uncomfortable read.
I've found the work of Stefan Molyneux to be very insightful with regards to this (his other work has also been pretty influential for me).
You can find his books for free here. I haven't actually read his book on this specific topic ("Real-Time Relationships: The Logic of Love") since I was following his podcasting and forums pretty closely while he was working up to writing it.
If you don't know about relative motion and inertia, then it does seem like the sun moves around the earth (even when you know, it still looks that way). Prior to the "Copernican" revolution, it was generally thought that our sense experience of everyday life was sufficient to expose the truth to us. Those two things combined make a major roadblock in establishing that the earth rotates.
Now we can fully appreciate that it doesn't even make sense to make an absolute statement either way. If the earth is taken to be stationary, then the sun does move around it (interestingly, this was Tycho Brahe's solution to the problem of shifting to a heliocentric view).
I find it hard to believe that you haven't thought about the following, but you haven't mentioned it so I will. Conventional wisdom says:
1) Being at a healthy weight/having a 'healthy lifestyle' will (accidents and terminal genetic disorders aside) result in you living a longer life. This means more time to work on FAI stuff.
2) Exercise and a good diet tend to increase feelings of well-being and energy levels. This means better/more effective work on FAI stuff.
Discounting physical health and concentrating on intellectual life seems to me to be a status symbol for many intellectuals. But I would think that spending time and mental energy on physical well being would give larger benefits, in the long term, to one's intellectual endeavours.
Blood chokes still take several minutes to effect brain damage/death. I find the idea of accidentally throttling someone to death fairly suspicious. Besides, if it was truly an accident then where does Browne's guilt come from? I don't think the story suggested it was an accident.
I've recently started reading a book on the changes which Zen meditation seems to cause on neurology and consciousness, authored by a neurologist. The premise seems to fit with what you're saying.
I've heard that some meditative states (as measured by brain wave patterns) can be induced through the use of devices employing flashing lights and audio interference at certain frequencies ("binaural beats"). I've never really spent the time to investigate it seriously and there seems to be a fair amount of new-agey crap surrounding the idea, but it may have some merit.
I was indeed thinking of the Mentats and Bene Gesserit. As you both point out, there was a significant mystical aspect to it. I suppose I was thinking more of the approach taken to mental training (within the world's internally consistent, but mystical, framework) rather than any specific techniques or events.
Mentats on the other hand have "minds developed to staggering heights of cognitive and analytical ability" (thanks Wikipedia) which would seem to fit the bill.
On the other hand, I suppose that neither of these instances are quite what Eliezer was after, as "you can't go out and do it at home".
The Dune series and Neal Stephenson's latest novel Anathem both come to mind. The Dune series includes a number of plot devices involving mental discipline (although it's all semi-mystical). The world of Anathem, on the other hand, is split into two factions, one of which is specifically rationalist. It gets pretty philosophical and weird toward the end, but it mostly involves rationalist characters using math/science/etc. to overcome the hurdles in their way. The world it describes sounds pretty similar to what I've read of Eliezer's Bayesian Conspiracy.