...there are potential problems when you arrange the plot so that...
True conflict strengthens narrative. But then, you're not really complaining about creating problems for your characters.
Perhaps I phrased my template too formally. Though as I search for examples, I notice that different uses of the word "guy" would require various replacements ("person," "someone," or "the one") in order to sound natural.
Really, I begin to think it would be simpler to alter our culture so that nobody expects "guy" to imply "male".
A form like "the one who acts" sounds perfectly natural to me.
Count me among those who would help establish an Indianapolis meetup.
I don't suppose there's a highly accessible, curated database of hypotheses that have tested very differently in mice (or other model subjects) than in humans. It suddenly strikes me that this would be a highly valuable resource.
Now I'm wondering if there's a way to make that the start of a viable business, though of course my pondering is limited by my lack of domain knowledge.
Doesn't this plan seem rather risky if the primary benefits are so limited?
On the other hand, now Quirrell has a way to convince Harry to help him get the Philosopher's Stone, or to consider leaving Hogwarts in spite of the danger to help him with a "life-saving ritual".
On the other other hand, telling Harry about these life-saving methods could just make him angry that no one mentioned them with respect to Hermione.
Who says he's blind? Harry won't so much as drink from his own containers in Quirrell's presence, because Quirrell might teleport something nasty inside. And even if he decided that Quirrell was totally irredeemable, Harry should still be upset about losing the enjoyable aspects of Quirrell's personality.
I suggest that you research the difference between instrumental values and terminal values.
In the situation you describe, the settlement is weak evidence for the product not working. Weak evidence is still evidence. The flaw in "absence of evidence is evidence of absence" is that the saying omits any description of how to correctly weight the evidence, but this omission does not make the simple statement untrue.
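To make "weak evidence is still evidence" concrete, here is a minimal sketch using the odds form of Bayes' rule. All the numbers (even prior odds, a 1.2:1 likelihood ratio for the settlement) are assumptions chosen purely for illustration, not anything from the case at hand.

def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

# Suppose we start at even odds that the product works (1:1), and we
# estimate a settlement is only 1.2 times as likely if the product is
# fake than if it's genuine -- i.e., weak evidence against "works".
prior = 1.0                  # odds(works : doesn't work)
lr_settlement = 1 / 1.2      # the settlement shifts odds against "works"

print(posterior_odds(prior, lr_settlement))  # ~0.83: a small but real update

The update is small, as befits weak evidence, but it is not zero; enough independent settlements, each with a modest likelihood ratio, would eventually move the odds substantially.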
We simply don't have the time and computing power to apply full rigor to our individual decisions, so we need an alternative strategy. As I understand it, the human brain operates largely on caching. X-rationality lets us clean and maintain our caches more thoroughly than traditional rationality does. At first glance, it seems reasonable to expect this to yield higher success rates.
However, our intuition vastly underestimates the size of our personal caches. Furthermore, traditional rationality is simply faster at cleaning, even if it leaves a lot of junk behind. So it would appear that we should do most of the work with traditional rationality, then apply the slower x-rationality process for subtle refinement. But because x-rationality is so much slower and harder to run, getting through any significant portion of the cache takes enormous time and effort, and along the way many of the potential corrections will already have been made by the traditional-rationality first pass.
But if we leave out the more rigorous methods entirely, deeming them too expensive, we're doomed to hit a pitfall where traditional rationality will not save us from thirty years of pursuing a bad idea. If we can notice these pitfalls quickly, we can apply the slow x-rationality process to that part of the cache right away, and we might only pursue the bad idea for thirty minutes instead.
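A loose sketch of the two-pass strategy I'm describing, in code. To be clear, the belief cache, the pitfall heuristic, and both cleaning passes below are placeholders invented for illustration; nothing here models actual cognition.

# Hypothetical two-pass belief-cache cleanup; every function is a
# placeholder standing in for a much messier mental process.

def traditional_pass(belief: str) -> str:
    """Fast, coarse cleanup: cheap enough to run over the whole cache."""
    return belief.strip().rstrip("!")  # catches only surface-level junk

def looks_like_pitfall(belief: str) -> bool:
    """Cheap heuristic flagging beliefs that deserve the expensive pass."""
    return "obviously" in belief  # placeholder trigger

def xrational_pass(belief: str) -> str:
    """Slow, rigorous re-examination, applied only where flagged."""
    return belief.replace("obviously", "probably")

def clean_cache(cache: list[str]) -> list[str]:
    coarse = [traditional_pass(b) for b in cache]       # first pass: everything
    return [xrational_pass(b) if looks_like_pitfall(b)  # second pass: targeted
            else b for b in coarse]

print(clean_cache(["this plan is obviously safe!  "]))
# ['this plan is probably safe']

The design point is simply that the expensive pass runs only on the slice of the cache the cheap heuristic flags, which is why catching a pitfall in thirty minutes rather than thirty years depends entirely on how quickly we notice it.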
We need to be able to reason clearly, to identify opportunities for clearer reasoning, and to identify our own terminal goals. A flaw in any of these pieces can limit our effectiveness, in addition to the limits of just being human. What other limiting factors might there be? What methods can we use to improve them? I keep coming back to Less Wrong because I imagine this is the most likely site to provide me with discourse on the matter.
Human brains do experience in-group reinforcement, so we ought to aim that reinforcement at something like truth-seeking, which tends to encourage meta-level discussions like this one and thus helps immunize us against death spirals. Note that this requires additional techniques of rationality to be effective. Consider that some truths (like knowing about biases) will hurt most people.
It is possible to avoid peer pressure for an entire childhood, even to overdo the avoidance to irrational levels, which I managed to accomplish mainly by sheer egomania. An addiction to defiance can be helpful; reining it in is more so.
We hardly have enough evidence after just one attempt. Besides, subtlety is a form of deceit, and deceit is not generally encouraged in rational discussion.
Anyway, my complaint is with the out-of-hand dismissal of yaro's post; better would be a substantive disagreement, or at least a link addressing the perceived flaws in yaro's argument. That's proper rationalist encouragement. No subtlety required.
Excellent point. I suppose for some, the many shortcomings of their religion are enough to overthrow any intellectual authority that religion may have held over them. This does grant such individuals more freedom to evaluate the remainder of their beliefs. I do hold such freedom in high regard. "Your religion is demonstrably not a scientific authority. If some of it is wrong, it cannot all be the untarnished word of a supreme being. How then can it justify authority in other areas?"
There is, however, a certain temptation among those first realizing their own intellectual freedom from religion. It is a temptation to ardently maintain the language and customs and non-falsifiable beliefs from the religion they have otherwise abandoned. A simple stroll along the path of minimal required change. While there are many sub-optimal paths to optimizing one's own reasoning capacity, I have personal associations which make this path particularly worrisome.
I wonder if there are methods to help others avoid this baggage-claim stage entirely, or if the religious baggage really does provide some utility for social interaction. I fear any utility it provides the holder will be at the cost of increased perceived support toward those who use that same religion as a justification for various kinds of oppression. I guess the whole problem comes back to in-group solidarity, pros and cons alike. Pro-baggage: I get to stay in my group. Con-baggage: Some members of that group are against various forms of freedom and reason.
There is a peculiarity of religions that causes them to attract this sort of scrutiny. Religions are meant to be treated as package deals, as if claims about the efficacy of eating shrimp have some special correspondence to favoritism toward heterosexuality and premarital abstinence. As if the latter two things have any special correspondence! There's no reason subscribing to some "core" values of a religion should require someone to accept the whole subscription. Seldom are a religion's "core" values enough to reconstruct the rest of the religious system, or even anything vaguely similar.
It's such a glaring fallacy, yet oddly it even sucks in religion's detractors. As if we could demolish the entirety of a poorly-connected religion just by overturning a few of its claims.
Yet another result of this aspect of religion is the tendency for a shift in beliefs to require the creation of entire new sects, such as the Unitarians.
No, the folks who give me the most pause are the Indie-Christians. They take whatever beliefs they like from wherever they like (but usually with a focus on scientific anticipation-constraining beliefs and Christian non-constraining beliefs) and run with them. As far as I can tell, they're doing it right, but winding up with far more intellectual baggage than I'd be willing to carry. Of course, I can't talk them out of anything, because their only falsifiable beliefs are the reasonable non-spiritual ones, and their ability to interact smoothly with less reasonable Christians gives them more utility than would my Occam approach.