Comments

Comment by Autolykos on On Expressing Your Concerns · 2017-02-09T13:10:50.150Z · LW · GW

It's probably one of the many useful functions of the court jester :)

Comment by Autolykos on Asch's Conformity Experiment · 2017-02-09T13:01:52.229Z · LW · GW

Even a more sane and more continuously distributed measure could yield that result, depending on how you fit the scale. If you measure the likelihood of making a mistake (so zero would be a perfect driver, and one a rabid lemur), I expect the distribution to be hella skewed. Most people drive in a sane way most of the time. But it's the few reckless idiots you remember - and so does every single one of the thousand other drivers who had the misfortune to encounter them. It would not surprise me if driving mistakes followed more-or-less a Pareto distribution.
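
A quick toy simulation (with a made-up Pareto shape parameter, not real traffic data) shows how a skewed measure produces exactly that effect: most drivers really are better than the mean, because a few outliers drag the mean up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mistake rates per driver, drawn from a heavy-tailed classical
# Pareto distribution. Shape and scale are illustration values only.
shape, scale = 2.0, 1.0
mistake_rates = (rng.pareto(shape, size=100_000) + 1) * scale

mean_rate = mistake_rates.mean()
median_rate = np.median(mistake_rates)
share_better = (mistake_rates < mean_rate).mean()

print(f"mean mistake rate:   {mean_rate:.2f}")
print(f"median mistake rate: {median_rate:.2f}")
print(f"drivers better than the mean: {share_better:.0%}")
# With shape 2.0, roughly three quarters of all drivers score better than
# average - the reckless few pull the mean far above the median.
```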

Comment by Autolykos on Ethical Injunctions · 2015-11-02T14:42:25.465Z · LW · GW

There probably was a time when killing Hitler had a significant chance of ending the war by enabling peace talks (allowing some high-ranking German generals/politicians to seize power while plausibly denying having wanted this outcome). The window might have been short, though, and probably didn't open until a bit after '42. I'd guess any time between the Battle of Stalingrad (where Germany stopped winning) and the Battle of Kursk (which made Soviet victory inevitable) should've worked - everyone involved should rationally prefer white peace to the very real possibility of a bloody stalemate. Before, Germany would not accept. Afterwards, the Soviets wouldn't.

Comment by Autolykos on How to escape from your sandbox and from your hardware host · 2015-08-05T09:36:05.228Z · LW · GW

Yup. Layer 8 issues are a lot harder to prevent than even Layer 1 issues :)

Comment by Autolykos on How to escape from your sandbox and from your hardware host · 2015-08-05T09:31:37.792Z · LW · GW

While air gaps are probably the closest thing to actual computer security I can imagine, even that didn't work out so well for the guys at Natanz... And once you have systems on both sides of the air gap infected, you can even use esoteric techniques like ultrasound from the internal speaker to open up a low-bandwidth connection to the outside.
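
To illustrate the general principle (and only that - this is not a reconstruction of any specific real-world attack), here's a toy sketch of turning a byte into near-ultrasonic tone bursts. The carrier frequency, bit rate and file name are arbitrary choices; a real covert channel would need proper modulation, synchronization, error correction and a listening receiver on the other machine.

```python
import numpy as np
import wave

SAMPLE_RATE = 48_000   # Hz; ordinary sound hardware can usually reproduce this
CARRIER_HZ = 19_000    # near-ultrasonic: hard for adults to hear, still playable
BIT_SECONDS = 0.1      # 10 bits per second - covert channels are slow

def encode_bits(bits):
    """On-off keying: a tone burst for a 1 bit, silence for a 0 bit."""
    t = np.arange(int(SAMPLE_RATE * BIT_SECONDS)) / SAMPLE_RATE
    tone = 0.5 * np.sin(2 * np.pi * CARRIER_HZ * t)
    silence = np.zeros_like(tone)
    return np.concatenate([tone if b else silence for b in bits])

# Example payload: the single byte 0xA5 as a bit string.
payload = [int(b) for b in format(0xA5, "08b")]
signal = encode_bits(payload)

# Write the signal to a WAV file; played through the internal speaker, it could
# be picked up by any nearby microphone filtering around 19 kHz.
with wave.open("covert.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)          # 16-bit samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes((signal * 32767).astype(np.int16).tobytes())
```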

Comment by Autolykos on The Unfriendly Superintelligence next door · 2015-07-28T12:27:20.996Z · LW · GW

And some people would like to make it sit down and write "I will not conjure up what I can't control" a thousand times for this. But I, for one, welcome our efficient market overlords!

Comment by Autolykos on Absolute denial for atheists · 2015-07-23T14:37:18.891Z · LW · GW

Where did you get the impression that European countries do this on a large enough scale to matter*? There are separate bike paths in some cities, but they tend to end abruptly and lead straight into traffic at places where nobody expects cyclists to appear, or show similar acts of genius in their design. If you photograph just the right sections, they definitely look neat. But integrating car and bike traffic in a crowded city is a non-trivial problem, especially in Europe, where roads tend to follow winding goat paths from the Dark Ages and are already way too narrow for today's traffic levels.

While the plural of anecdote is not data, two of my friends suffered serious head trauma in a bicycle accident they never fully recovered from (without a helmet, they'd likely be dead), while nobody I know personally was ever in a severe car accident. And a quick search also seems to indicate that cycling is about as dangerous as driving (with both of them paling by comparison to motorcycles...).

*with the possible exception of the Netherlands, but even for them I'm not sure.

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-06-01T10:01:27.437Z · LW · GW

I know you intended your comment to be a little tongue-in-cheek, but it is actual energy, measured in Joules, we're talking about. Exerting willpower drains blood glucose levels.

I don't know of studies that indicate that introverts drain glucose faster than extraverts when socializing, but that seems to be a pretty straightforward thing to measure, and I'd look forward to the results. At least, I can tell from personal experience that I need to exert willpower to stay in social situations (especially when there are lots of people close by or when it's loud), and I'm a hardcore introvert. Also, from the observation that lots of people actually like going to these places, while very few people enjoy activities that force them to exert willpower, I can conclude that not everyone feels about it the way I do.

Comment by Autolykos on Revisiting torture vs. dust specks · 2015-06-01T09:36:03.600Z · LW · GW

There's another argument I think you might have missed:

Utilitarianism is about being optimal. Instinctive morality is about being fail-safe.

Implicit in all decisions is a nonzero possibility that you are wrong. Once you take that into account, having some "hard" rules like not agreeing to torture here (or in other dilemmas), not pushing the fat guy on the tracks in the trolley problem, etc, can save you from making horrible mistakes at the cost of slightly suboptimal decisions. Which is, incidentally, how I would want a friendly AI to decide as well - losing a bit in the average case to prevent a really horrible worst case.

That rule alone would, of course, make you vulnerable to Pascal's Mugging. I think the way to go here is to have some threshold at which you round very low (or very high) probabilities off to zero (or one) - namely, when their distance from those extremes is small compared to the probability of you being wrong about the situation in the first place. Not only will this protect you against getting your decisions hacked, it will also stop you from wasting computing power on improbable outcomes. This seems to be the reason why Pascal's Mugging usually fails on humans.

Both of these are necessary patches because we operate on opaque, faulty and potentially hostile hardware. One without the other is vulnerable to hacks and catastrophic failure modes, but both taken together are a pretty strong base for decisions that, so far, have served us humans pretty well. In two rules:

1) Ignore outcomes to which you assign a lower probability than to you being wrong/mistaken about the situation.
2) Ignore decisions with horrible worst-case scenarios if there are options with a less horrible worst case and a still acceptable average case.

When both of these apply to the same thing, or this process eliminates all options, you have a dilemma. Try to reduce your uncertainty about 1) and start looking for other options in 2). If that is impossible, shut up and do it anyway.
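
Very roughly, those two rules might look like the sketch below. The option format, the thresholds, and the absolute worst-case floor (a simplification of rule 2, which is really comparative) are all placeholder assumptions, not a worked-out decision theory.

```python
# Rough sketch of the two rules, assuming each option is a list of
# (probability, utility) outcomes. All thresholds are placeholders.

P_MISTAKEN = 1e-6        # estimated probability of misjudging the whole situation
WORST_CASE_FLOOR = -1e6  # utilities below this count as "horrible"

def filtered_expected_utility(outcomes):
    # Rule 1: ignore outcomes less likely than being mistaken about the setup.
    credible = [(p, u) for p, u in outcomes if p >= P_MISTAKEN]
    total_p = sum(p for p, _ in credible)
    if total_p == 0:
        return None
    # Renormalize over the remaining outcomes and take the average.
    return sum(p * u for p, u in credible) / total_p

def choose(options):
    scored = {name: filtered_expected_utility(o) for name, o in options.items()}
    scored = {name: eu for name, eu in scored.items() if eu is not None}
    # Rule 2: drop options whose credible worst case is horrible, as long as
    # at least one option with a bearable worst case survives.
    def worst(name):
        return min(u for p, u in options[name] if p >= P_MISTAKEN)
    safe = {name: eu for name, eu in scored.items() if worst(name) > WORST_CASE_FLOOR}
    candidates = safe if safe else scored   # nothing safe left: the dilemma case
    return max(candidates, key=candidates.get)

# Toy example: a Pascal's-Mugging-style offer vs. just walking away.
options = {
    "pay the mugger": [(1e-12, 1e15), (1 - 1e-12, -10)],
    "walk away":      [(1.0, 0)],
}
print(choose(options))   # -> "walk away"; the 1e-12 outcome gets rounded off
```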

Comment by Autolykos on "Risk" means surprise · 2015-05-22T12:20:14.250Z · LW · GW

Exactly. Stocks are almost always better long-term investments than anything else (if mixed properly; single points of failure are stupid). The point of mixing in "slow" options like bonds or real estate is that it gives you something to take money out of when stocks are low (and replenish when stocks are high). That may look suboptimal, but it still beats the alternatives of borrowing money to live on or selling off stocks you expect to rise mid-term. The simulation probably does a poor job of reflecting that.
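
As a toy illustration of that drawdown logic (made-up returns, thresholds and amounts - a sketch of the mechanism, not investment advice): spend from the bond buffer in years when stocks sit well below their previous peak, and refill the buffer out of good years.

```python
import random

random.seed(1)

# Made-up numbers: a volatile stock portfolio, a flat bond buffer,
# and fixed yearly living expenses that have to come from somewhere.
stocks, bonds = 500_000.0, 100_000.0
EXPENSES = 20_000.0
BUFFER_TARGET = 3 * EXPENSES
peak = stocks

for year in range(30):
    r = random.gauss(0.07, 0.18)          # toy return: 7% mean, 18% stdev
    stocks *= (1 + r)
    peak = max(peak, stocks)

    if stocks < 0.85 * peak and bonds >= EXPENSES:
        bonds -= EXPENSES                  # bad year: live off the "slow" buffer
    else:
        stocks -= EXPENSES                 # good year: live off stocks...
        refill = min(stocks * 0.02, max(0.0, BUFFER_TARGET - bonds))
        stocks -= refill                   # ...and top the buffer back up
        bonds += refill

print(f"after 30 years: stocks {stocks:,.0f}, bonds {bonds:,.0f}")
```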

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-05-22T12:06:04.560Z · LW · GW

Intelligence is basically how quickly you learn from experience, so being smart should allow you to get to the same level with much less time put in (which seems to be what the OP is hinting at). I'd also expect diminishing returns, especially if you always socialize with the same (type of) people. At some point, each social group (or even every single person) becomes a skill of its own. Once your generic social skills are at an acceptable level, pick your specializations carefully. Life is too short to waste it on bad friends.

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-05-22T11:57:23.118Z · LW · GW

My thoughts exactly. The first commandment of multiclassing in 3rd is "Thou shalt not lose caster levels". Also, Wizards are easily the most OP base class, if played well. Multiclassing them into anything without wizard spell progression is just a waste.

OTOH, using gestalt rules to make a Wizard//Rogue isn't half bad, even if a little short on HP and proficiencies. I prefer Barbarian or even the much ridiculed Monk in place of the Rogue.

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-05-21T07:56:57.495Z · LW · GW

I suppose you already drew the obvious conclusion, but I still think it's worth spelling out:

The key to people liking you is making sure they feel good when you're around. Causality is secondary.

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-05-21T07:48:45.467Z · LW · GW

A quick google search found this:

Emma Chapman, Simon Baron-Cohen, Bonnie Auyeung, Rebecca Knickmeyer, Kevin Taylor & Gerald Hackett (2006) Fetal testosterone and empathy: Evidence from the Empathy Quotient (EQ) and the “Reading the Mind in the Eyes” Test, Social Neuroscience, 1:2, 135-148, http://dx.doi.org/10.1080/17470910600992239

I can't find a citation for the whole story right now, but as I remember it, it goes something like this: When the first wave of testosterone hits a male fetus, it kills off well over 80% of the brain cells responsible for empathy and reading emotions. Which is not as bad as it sounds; some of them do grow back. And then comes puberty...

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-05-20T15:35:13.674Z · LW · GW

Only say things that can be heard. If you can anticipate that you are too many inferential steps away, you should talk about something else. Which means in this case: Be patient and build their knowledge from the bottom, not from the top.

If you have already started and notice the problem too late, yeah, you're kinda screwed. The honest answer seems pretty rude, and not saying anything is worse. I'd probably try to salvage what I still can by saying something along the lines of "I know this is a complicated and confusing issue, and it takes a while to explain where I'm coming from*. I can point you to these resources if you're really interested in the matter." And not bring it up again unless they start it.

This allows you to drop a conversation that's going nowhere, while they can research it if they want to or ignore it if they don't while still saving face in both cases.

*Or, if it went really bad: "...and I suck at explaining." - taking the blame for the failed communication can defuse the sting of making them feel stupid.

Comment by Autolykos on How my social skills went from horrible to mediocre · 2015-05-20T13:27:29.373Z · LW · GW

There is also something else going on here, which I realized after learning about personality types, especially Jung's theories and the Myers-Briggs Type Indicator. One dimension separates people by their primary mode of seeing the world (Sensing vs iNtuitive), with the former collecting individual facts and strictly following isolated rules, and the latter always looking for the generalized principle behind the facts and questioning the origin and sense of rules.

These two types have a lot of trouble understanding each other's way of thinking and frequently get in each other's hair; e.g. S types tend to interpret N types questioning rules out of curiosity as a personal attack on their way of life (especially so if accurate), while N types tend to dismiss criticism from S types as small-minded bean counting and accuse them of missing the forest for the trees.

Now, there are roughly four to six times as many S types as N types around, and on top of that most weak cases of N types tend to hide it so as not to seem too weird. On the other hand, abstract topics (natural sciences, Less Wrong) tend to attract N types. From this baseline (and your description) I infer that you are also one of the aliens. You can't fundamentally alter your way of thinking to fit in (would you even want to?) - the best you can hope for is to find and befriend the other hidden aliens while trying to get along with the rest.

There's also a nice TED talk on the matter. Just google "Weirdos, Misfits and You". And you might like Eugene Ionesco's "Rhinoceros". It's usually taken as a metaphor for something else, but I still find that it hits the mark pretty well. It's also short and fun to read, so there's no good excuse not to.

Comment by Autolykos on Two Cult Koans · 2015-04-09T13:57:59.051Z · LW · GW

Then he asked the wrong question. Straight up asking "Ougi, why did you decide on a formal dress code when this apparently has no meaning for your teachings?" is a different question from "Does wearing robes make us a cult?", and shows a different understanding of what the robes mean. The answer would still be deliberately confusing and enigmatic, but that's kinda the whole point of a koan.

Comment by Autolykos on Debiasing as Non-Self-Destruction · 2013-07-04T21:29:27.262Z · LW · GW

Danger, wild speculation ahead: I'd assume it has something to do with the saying "Engineers can't lie." I can imagine that constantly experiencing how doing things that conflict with reality leads to failure, while at the same time hearing politicians lie pretty much every time they open their mouths and seeing them get elected again and again (or not fail in some other way), makes quite a few of them seriously fed up with the current government in particular and humanity in general. Some less stable personalities might just want to watch the world burn at that point. Which should make them recruitable as terrorists, if you use the right sales pitch.