Open Thread: August 2011

post by komponisto · 2011-08-03T02:48:24.254Z · LW · GW · Legacy · 34 comments

For miscellaneous discussions and remarks not suitable for top-level posts even in the Discussion section, let alone in Main.

(Naturally, if a discussion gets too unwieldy, celebrate by turning it into a top-level post, just like in the good old days.)

34 comments

Comments sorted by top scores.

comment by komponisto · 2011-08-03T02:51:58.956Z · LW(p) · GW(p)

Trivial Inconvenience Alert: I just realized that I seldom browse the comment feeds anymore, since it now requires an extra click.

Replies from: RobertLumley, Sniffnoy, Oscar_Cunningham
comment by RobertLumley · 2011-08-03T13:03:42.079Z · LW(p) · GW(p)

I feel that way about everything. I wish there were a feed that showed all threads and didn't distinguish between Main and Discussion.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-08-03T13:10:14.143Z · LW(p) · GW(p)

http://lesswrong.com/r/all/ does that for posts, but it looks like http://lesswrong.com/r/all/comments just shows comments from the Main section. Perhaps the latter behavior should be reported as a bug? In the meantime, if you want an RSS feed, combining two RSS feeds into one is pretty trivial with Yahoo Pipes.
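If Yahoo Pipes isn't an option, the merge itself is only a few lines of scripting. A minimal sketch in Python, assuming the third-party feedparser library; the two feed addresses below are placeholders rather than verified URLs:

    import time
    import feedparser

    # Placeholder addresses for the Main and Discussion comment feeds;
    # substitute whatever the real ones turn out to be.
    FEED_URLS = [
        "http://lesswrong.com/comments/.rss",
        "http://lesswrong.com/r/discussion/comments/.rss",
    ]

    def merged_entries(urls):
        """Fetch every feed and return all entries, newest first."""
        entries = []
        for url in urls:
            entries.extend(feedparser.parse(url).entries)
        # Entries without a parseable date sort to the end.
        entries.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0),
                     reverse=True)
        return entries

    if __name__ == "__main__":
        for entry in merged_entries(FEED_URLS)[:30]:
            print(entry.get("title", "(untitled)"), "-", entry.get("link", ""))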

Replies from: RobertLumley, Document
comment by RobertLumley · 2011-08-03T13:18:56.924Z · LW(p) · GW(p)

Thanks. That's exactly what I wanted. Well, I'd prefer they just had the title, but I can't be too picky.

comment by Sniffnoy · 2011-08-04T01:12:03.477Z · LW(p) · GW(p)

Note that you can also get to the comment feed by clicking on "Recent Comments"; unfortunately this still requires scrolling down.

comment by Oscar_Cunningham · 2011-08-03T08:26:04.808Z · LW(p) · GW(p)

Bookmark them?

Replies from: MixedNuts
comment by MixedNuts · 2011-08-03T08:41:42.462Z · LW(p) · GW(p)

That would need to be done by every user on all repeatedly-used computers (impossible on ones used once or aggressively cleaned), and generalising the approach leads to unmanageably many bookmarks.

comment by kpreid · 2011-08-06T03:34:09.198Z · LW(p) · GW(p)

Possibly interesting article on winning: How to seem good at everything: Stop doing stupid shit

Summary, as I interpreted it: In practicing a skill, focus on increasing the minimum of the quality of the individual actions comprising performing the skill.

(Is this worth a /r/discussion link-post?)

Replies from: Pavitra
comment by Pavitra · 2011-08-30T16:39:16.032Z · LW(p) · GW(p)

Yes.

comment by Pavitra · 2011-08-03T18:43:01.367Z · LW(p) · GW(p)

Silly hypothesis: all reductionists are actually p-zombies. Lacking qualia, and generalizing from one example, we assume incorrectly that "qualia" must refer to something that our own brains do, and wind up constructing theories that try to explain qualia in terms of material processes.

Replies from: Armok_GoB, Document
comment by Armok_GoB · 2011-08-05T18:03:10.969Z · LW(p) · GW(p)

That's strictly speaking nonsense given the definition of P-zombies, since it'd be a detectable difference. But the thing you actually mean is interesting. We know it's false due to all the specific things we know about the brain, but 50 years ago I might have taken it seriously.

comment by Document · 2011-08-05T04:55:16.987Z · LW(p) · GW(p)

I remember that being proposed seriously by a commenter during the Sequences.

comment by beriukay · 2011-08-17T14:15:17.882Z · LW(p) · GW(p)

I'm reading Mere Christianity, and boy howdy is it a hair-puller. It had made me so mad that when, around page 90, he started talking about donating to charity until it hurts, I was reminded of the current SIAI fundraiser. I know that donating money out of spite isn't exactly the healthiest of actions, but I've got $1000 that says fuck him.

Replies from: steven0461, Eugine_Nier, Morendil
comment by steven0461 · 2011-08-17T19:33:04.779Z · LW(p) · GW(p)

"SingInst 2011 Summer Challenge: Give until it hurts the ghost of CS Lewis!"

comment by Eugine_Nier · 2011-08-31T04:31:41.887Z · LW(p) · GW(p)

What exactly about it makes you angry?

Replies from: beriukay
comment by beriukay · 2011-08-31T09:10:13.315Z · LW(p) · GW(p)

I'm having trouble explaining myself, so maybe an example of Lewis' text with an approximation of my response at the time will suffice. This excerpt was chosen because it was the last straw that prompted me to write an email to several friends to vent about my issue with Lewis.

At first it is natural for a baby to take its mother’s milk without knowing its mother. It is equally natural for us to see the man who helps us without seeing Christ behind him. But we must not remain babies. We must go on to recognise the real Giver. It is madness not to.

I tend to get annoyed when an author throws out a couple of vague metaphors and then tells me that I ought to do something. I get even more annoyed when they tell me that I am insane if I don't follow their advice. At this point in the reading I actually shouted out loud "WHY!?"

Because, if we do not, we shall be relying on human beings. And that is going to let us down. The best of them will make mistakes; all of them will die.

Holy crap! Is Lewis psychic? Did he hear me back in time, screaming at him that his reasoning is not coherent to me? You might think so, but then you would have to explain why the followup was even less reasonable than the metaphors, and something of a non-sequitur. Granted, if you think really hard you can come up with a satisfactory response that threads all these thoughts together into a coherent chain. I've even done that myself while writing this comment. But at its core, Lewis is arguing something here: that nothing good that comes from people actually comes from people, and that we must thus treat all beneficial things as acts of god. He goes on to hedge this claim with some nice words.

We must be thankful to all the people who have helped us, we must honour them and love them. But never, never pin your whole faith on any human being: not if he is the best and wisest in the whole world. There are lots of nice things you can do with sand: but do not try building a house on it.

He's like someone in an asylum briefly realizing that he's not actually Napoleon, and then imagining himself on a horse because he likes the idea of it more than being in a padded cell. But yes, there you have it. You can't rely on anything in this world because things break down and shit happens, and therefore you must rely on fiction. Because we all know fiction will get you to where you need to be.

I could go on, if you want, but I think this is getting a bit long for a single comment.

comment by Morendil · 2011-08-29T05:07:55.128Z · LW(p) · GW(p)

Oh, interesting. What were your reasons for reading it? I've read it this past week as part of a "deal" with a friend (more on this in a Discussion thread).

Replies from: beriukay
comment by beriukay · 2011-08-29T12:37:28.956Z · LW(p) · GW(p)

A couple of years ago, a friend suggested that I read it. He thought I would be interested in the perspective of an ex-atheist who, according to the friend, came to Christianity through reason instead of by the usual means. I kinda sat on the suggestion, along with a few other recommendations that accumulated (Expelled, What The Bleep Do We Know, Theology And Sanity, Catholic For A Reason...). This summer I've read a lot more books than usual, so it wasn't hard to throw it onto the list, especially after I found a free copy of the book somewhere online.

I can't wait for the Discussion thread, to see what the "deal" was. Plus it would be nice to discuss aspects of the book with someone who is fresh on the subject.

comment by MixedNuts · 2011-08-04T15:24:10.301Z · LW(p) · GW(p)

A post in French about "You always want to be right!" presents an interesting hypothesis: People who actually want to be right like corrections a lot (because corrections make them righter). So they emit a lot of them; whenever someone makes a mistake, they offer a patch. But most people dislike corrections; when presented with one, they distort it instead of updating. So they end up with two mistakes instead of one. This leads the corrector to emit another correction, making things worse. Therefore, the interlocutor sees someone who constantly tells them they're wrong, but is never right (because their words get distorted before reaching consciousness) - someone who refuses to lose debates ("who always wants to be right").

This is interesting. In particular, it explains why I often get called this by people who seem, both to me and to others, to "always want to be right" (make obvious mistakes, refuse to admit them). If it were just Dunning-Kruger (people who think "Oh, I'm so good at changing my mind in response to evidence!" being worse at it, and getting called out), we shouldn't expect such a pattern.

Alternately, maybe they're accusing us of being clever arguers.

This situation is common - Alice cares about being right, verifiably changes her mind unusually often, including saying "You're right, I was wrong" during debates, likes to look at the evidence; Bob (according to several outsiders) often defends propositions like "The sky is green" in the face of contrary evidence, and gets angry when corrected; yet Bob accuses Alice of always wanting to be right.

It can't just be about status; if it were, Bob would just call Alice a jerk or something. The hypothesis I linked is the best I've seen so far. What's going on?

Replies from: JoshuaZ, AdeleneDawner
comment by JoshuaZ · 2011-08-05T14:29:31.562Z · LW(p) · GW(p)

I think part of what is going on is that many forms of tribal allegiance are either defined by or illustrated by shared beliefs (e.g. our religion is right, our sports team is the best, our political stance is correct, etc.). So repeatedly correcting someone carries not just a simple status hit but also an implicit attack on their loyalty and an undermining of tribal allegiance. Note that this is to some extent simply a variation of the status hypothesis. Both the simple status hypothesis and this one predict that people will respond better to corrections given in a less public setting, which seems to be true.

comment by AdeleneDawner · 2011-08-04T15:41:36.490Z · LW(p) · GW(p)

I can't present much in the way of evidence, but I think it is about status, and 'you always want to be right' is a more-specific way of calling someone a jerk.

It may be about status in a way that's not immediately obvious, though - my model suggests that it's less about who's got higher status and more about something like equanimity, and that the question is whether or not Alice is trying to make a power grab; if not, the common wisdom is that she won't consider it worthwhile to fight about something just for the sake of being right.

Actually, on further reflection, this reminds me of a model I read about a while ago that suggests that uncertainty in relative status is important for group cohesion - that only the group alpha and the group omega can have approximately known status, and that between those two extremes someone making their relative status clear will be a destabilizing influence, for reasons that either weren't presented well or I've forgotten. I'll see if I can find that; it was a rather complicated model, of which this is just a small part, but it seemed potentially useful.

ETA: Found it. That's actually the last post in the series, and it uses some specialized definitions for deliberately semi-offensive words, so it might be better to start at the beginning.

Replies from: MixedNuts
comment by MixedNuts · 2011-08-04T15:46:32.486Z · LW(p) · GW(p)

Agree it's about status, disagree it's only about status, or there'd be no reason for the phrasing to be that specific.

Agree about egalitarian pressure.

It was Distracting wolves and real estate agents. Do you mean that the correct response isn't to admit the other person is right (what my mother advises), which loses the fight, but rather to drop the topic, making it unclear who won?

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-08-04T15:51:45.883Z · LW(p) · GW(p)

Actually I was thinking of something else (I ETA'd my last post with links), but that's an interestingly similar example.

As to what's 'correct', it depends on one's goals and preferences.

Replies from: MixedNuts
comment by MixedNuts · 2011-08-04T16:05:42.383Z · LW(p) · GW(p)

What do you mean, then?

It's rather safe to assume that anyone interested in the questions has the following preferences:

  • Not being thought of as a jerk who always wants to be right;
  • Being as right as possible;
  • Helping others be as right as possible;
  • Enjoying socialization (of which the first item is a subgoal).
Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-08-04T16:15:54.907Z · LW(p) · GW(p)

The first, second, and fourth of those are well served by noting the difference between being right and being known to be right, and not worrying about the latter in situations where the other person doesn't value objective rightness. That basically describes my personal policy, anyway - I have a strong habit of going "oh, ok" and dropping the subject at the first sign of annoyance on the other person's part in such cases, unless there's something at stake beyond just their knowledge, and that seems to work well enough.

comment by Hyena · 2011-08-04T11:58:12.995Z · LW(p) · GW(p)

I really wish that someone would develop an algorithm that stitched together news, discussion and academic papers so that a debate could be tracked. I'd especially like it if, at the end, the system would spit out "RESOLVED: XYZ is true/sorta true/a total load. [Here's why.]"

I figure that you could currently train a machine to recognize a spin down in academic debates and hire someone who could then review the literature to write about the resolution.

I just feel the need to say this. I'm so tired of losing track of things and then never knowing whether I was right or wrong about some issue as a result.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-08-04T14:05:16.361Z · LW(p) · GW(p)

...train a machine to recognize a spin down in academic debates...

Off the top of my head, having a program track a Google Alerts feed on the topic (with filtering via Yahoo Pipes if Alerts gives too many false positives) and let you know when it's gone a certain amount of time without getting any input seems like it would be a reasonable first approximation of that.
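A rough sketch of that idea in Python, assuming the third-party feedparser library; the feed address, polling interval, and "quiet" threshold below are all made-up placeholders:

    import calendar
    import time
    import feedparser

    FEED_URL = "https://www.google.com/alerts/feeds/EXAMPLE"  # placeholder Alerts feed
    QUIET_DAYS = 30           # this long with no new items counts as "spun down"
    POLL_SECONDS = 6 * 3600   # check every six hours

    def days_since_newest_item(url):
        """Return how many days ago the feed's newest item appeared, or None."""
        feed = feedparser.parse(url)
        stamps = []
        for entry in feed.entries:
            parsed = entry.get("published_parsed") or entry.get("updated_parsed")
            if parsed:
                stamps.append(calendar.timegm(parsed))
        if not stamps:
            return None
        return (time.time() - max(stamps)) / 86400.0

    if __name__ == "__main__":
        while True:
            age = days_since_newest_item(FEED_URL)
            if age is not None and age > QUIET_DAYS:
                print("No new items for %.0f days; the topic may have wound down." % age)
                break
            time.sleep(POLL_SECONDS)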

comment by Sniffnoy · 2011-08-04T01:10:19.501Z · LW(p) · GW(p)

So I recently came across this paper, http://arxiv.org/abs/1107.5849, which seemed relevant to us, but which I really don't have time to read right now, not least because I don't actually know anything about quantum information theory and so would need a bit more background to actually understand it.

The reason I thought it relevant was -- well, since I began to understand that QM runs on amplitudes, not probabilities, it's bothered me that we fundamentally still use probabilities rather than amplitudes to represent uncertainty. Of course there are good reasons for doing this (Savage...), it's good enough most of the time, and it's not at all clear how amplitudes could sensibly be assigned in most cases, but it still bugs me. I was wondering if this paper did anything to elucidate how such a setup might work? Because it seems to treat how you would go about conditioning on an event, and not being able to do that seems a more fundamental obstacle than the ones I listed above.

If not, perhaps it's still relevant to us for other reasons anyway. :)
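(For concreteness, and without claiming anything about what the linked paper does: in the standard textbook formalism, probabilities come from amplitudes via the Born rule, and conditioning on an observed measurement outcome is done by projection and renormalization rather than by Bayes' rule on probabilities:

    \[
      p(i) = |\langle i \mid \psi \rangle|^{2},
      \qquad
      |\psi\rangle \;\longmapsto\; \frac{P_{i}\,|\psi\rangle}{\lVert P_{i}\,|\psi\rangle \rVert}
      \quad \text{after outcome } i \text{ is observed.}
    \]

When the events involved commute, this reproduces ordinary Bayesian conditioning on the squared magnitudes.)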

comment by EStokes · 2011-08-07T17:05:55.655Z · LW(p) · GW(p)

Does anyone know anything about cryonics in Europe? I've done some looking but I figured that there's bound to be someone that's already done the research.

comment by Barry_Cotter · 2011-08-06T14:52:42.995Z · LW(p) · GW(p)

I'm planning on recording myself reading at least one and possibly more sequences aloud, to be uploaded to Dropbox and posted in the comments, maybe to be collated into a post elsewhere later.

But I need a braindead-simple voice recorder to do it, because I'm crap with computers / easily frustrated. The MS audio recorder does not have an immediately obvious way for me to (a) make the files it records end when I want them to, as opposed to after 60 seconds, or (b) always save as MP3. Audacity requires me to do magic shit with tarballs and unzipping if I want to export its .aup files as .mp3, as opposed to doing what would seem to me to be the sensible thing and making it work out of the box. I realise it's made by techies, for techies, but I want to register my frustration that they made something that is Windows-compatible and didn't just package LAME painlessly with it. Beware Trivial Inconveniences and all that. I have VLC as well, so if there's some way of doing it there that just requires the right button clicks in the menus, that'd be good too.
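One possible workaround (only a sketch, and it assumes Audacity can export plain WAV files and that LAME's command-line encoder is installed and on the PATH) would be to batch-convert the WAV exports to MP3 with a short script:

    import pathlib
    import subprocess

    def wav_folder_to_mp3(folder="."):
        """Convert every .wav file in the folder to an .mp3 alongside it."""
        for wav in pathlib.Path(folder).glob("*.wav"):
            mp3 = wav.with_suffix(".mp3")
            # "lame input.wav output.mp3" is the encoder's basic invocation.
            subprocess.check_call(["lame", str(wav), str(mp3)])

    if __name__ == "__main__":
        wav_folder_to_mp3()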

Help would be much appreciated.

comment by Oscar_Cunningham · 2011-08-05T17:50:52.212Z · LW(p) · GW(p)

Is there a way to rss subscribe to my message inbox, or to get emailed when someone replies to my posts?

Replies from: dbaupp
comment by dbaupp · 2011-08-06T05:57:45.536Z · LW(p) · GW(p)

There is an RSS feed for the message inbox, although I can only subscribe to it using Firefox's built-in mechanism, not the external RSS reader I use.

ETA: I presume it relates to the authentication cookies LW uses to identify whose inbox to show, which Firefox has access to (since I'm logged on), but the external reader doesn't.
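For what it's worth, an external script can hand the same cookie along manually. A minimal sketch, assuming the third-party requests and feedparser libraries; both the feed address and the cookie name below are placeholders, not the site's actual values:

    import feedparser
    import requests

    INBOX_FEED_URL = "http://lesswrong.com/message/inbox/.rss"              # placeholder
    SESSION_COOKIE = {"session": "value copied from a logged-in browser"}   # placeholder

    def inbox_entries():
        """Fetch the inbox feed while presenting the login cookie."""
        response = requests.get(INBOX_FEED_URL, cookies=SESSION_COOKIE)
        response.raise_for_status()
        return feedparser.parse(response.text).entries

    if __name__ == "__main__":
        for entry in inbox_entries():
            print(entry.get("title", "(untitled)"))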

comment by Oscar_Cunningham · 2011-08-05T14:41:56.437Z · LW(p) · GW(p)

Did the site layout just change, or is my browser doing funky things?