Comments

Comment by Aetherial on Welcome to Less Wrong! (July 2012) · 2013-02-23T23:14:00.880Z · LW · GW

Peter here,

I stumbled onto LW from a link on TVTropes about the AI Box experiment. I followed it to an explanation of Bayes' Theorem on Yudkowsky.net because I love statistics (the rage I felt on realizing that not one of my three statistics teachers had ever mentioned Bayes was an unusual experience).

I worked my way through the sequences and was finally inspired to comment on Epistemic Viciousness and some of the insanity in the martial arts world. If your goal is to protect yourself from violence, martial arts training is more likely to get you hurt or thrown in jail than to keep you safe.

It seemed inappropriate that I went by Truth_Seeker before discovering this site, so I chose a handle in opposition to that. And I like the word aether.

Comment by Aetherial on Epistemic Viciousness · 2013-02-23T19:31:03.659Z · LW · GW

One of the questions this article asks is "How can Rationality, and the people who want to learn about it, avoid Epistemic Viciousness?" I feel as though many people take up martial arts out of a desire to learn to defend themselves and feel prepared for violence, and dojos are all too happy to sell that feeling without imparting any real knowledge of self-defense. Even in this page's comments, the idea that certain schools have more utility because they were tested in a "realistic" environment gets bandied about as if that will help you not get mugged.

I feel that one of the best uses of Rationality is self-optimization, but that isn't what drew me to LW (a TVTropes link about the AI box experiment did). Rationalists can avoid epistemic viciousness by not being afraid to explore both how rationality can improve our lives and where rationality simply doesn't have enough utility to justify the expense of learning it. We can do better than the martial arts world by not selling rationality to people who want it for some low-utility purpose (like winning arguments against theists). Why would a layperson explore rationality? Or, if we want to focus on the LW demographic, what do LWers expect to get out of reading Eliezer Yudkowsky's posts? Though self-optimization is one of the higher-utility benefits of rationality, I've stuck around because I'm fascinated by this "save the world" idea, not because I plan to dedicate myself to undergoing a Bayesian Enlightenment.

Comment by Aetherial on Epistemic Viciousness · 2013-02-23T09:44:55.499Z · LW · GW

I was disappointed in my dojos because I went to them to learn self-defense and psychological survival, but I only learned about punches and kicks and heard promises of eventually knowing enough to "win" a fight.

One of the ways they measure how much "better" you are is by having you punch or kick easily breakable wooden boards. Breaking three or more is impressive, but broken boards neither prepare you to "win" a fight nor teach you the self-defense techniques involved in preventing or de-escalating one. Still, it feels pretty darn good to break those boards.

Martial arts addresses one extremely unlikely and limited self-defense scenario. Against most of your peers, some MA knowledge will help you not get your ass kicked. Against people who employ violence for a living, MA skills are more likely to leave you either a corpse or tangled in a legal dispute, likely sporting a wonderful case of PTSD (whether you win or lose the fight, you still lose).

Self-defense isn't about finding the "best" martial art. Likewise, if rationality is going to have real value, teaching it can't mainly serve some low-utility purpose such as arguing with theists.