Posts

Brain-in-a-vat Trolley Question 2012-12-30T03:22:51.335Z
Hypothetical scenario 2012-02-16T06:56:09.248Z
Second Life creators to attempt to create AI 2011-01-09T13:50:59.337Z
Does anyone else find ROT13 spoilers as annoying as I do? 2010-10-23T17:15:04.957Z
Green consumers more likely to steal 2010-10-23T11:42:46.099Z
Retirement leads to loss of cognitive abilities 2010-10-14T17:53:57.292Z
1993 AT&T "You Will" ads 2010-10-14T12:49:24.851Z
Swords and Armor: A Game Theory Thought Experiment 2010-10-12T08:51:28.673Z
Random fic idea 2010-10-02T08:02:04.945Z
Hypothetical - Moon Station Government 2010-09-29T15:57:33.884Z

Comments

Comment by nick012000 on Hypothetical scenario · 2012-02-16T07:18:56.139Z · LW · GW

In this scenario, it has not yet engaged the bulk of the forces of the US military. It's wiping out the brass in the Pentagon, not fighting US Soldiers.

Besides, soldiers usually act on orders, and the lines of communication are sort of in chaos at the moment due to the sudden decapitation.

Comment by nick012000 on What happens when your beliefs fully propagate · 2012-02-16T07:12:17.388Z · LW · GW

Oh, wow. I was reading your description of your experiences in this, and I was like, "Oh, wow, this is like a step-by-step example of brainwashing. Yup, there's the unfreezing, the change while unfrozen, and the refreezing."

Comment by nick012000 on A Rationalist's Account of Objectification? · 2011-03-24T02:00:06.762Z · LW · GW

That's not what a real apology looks like. Better would be "I'm sorry. I can see now that I shouldn't have said what I said in a forum such as this."

I can see what you mean, but I would be more likely to say something like "I'm sorry; I didn't mean to make you uncomfortable." The reason I said it is because this thread seemed like the best place to say it, so saying that I shouldn't have said it here is obviously incorrect.

suggest that Alicorn 'cybers' you, or even 'put' the image of cybering 'out there'. This is doing exactly what Alicorn doesn't want, namely making your interaction on this forum "sexually charged".

Huh? I was trying to do the opposite; to reassure her that it wasn't sexually charged, because she wasn't cybering with me. O_o

Comment by nick012000 on A Rationalist's Account of Objectification? · 2011-03-23T11:40:24.638Z · LW · GW

I'm sorry if I made you feel uncomfortable; that wasn't really my intent. Getting assistance in better compartmentalisation techniques was my intent, though I figured I'd get some downvotes given that the Less Wrong community usually tries to reduce compartmentalisation, not increase it. Still, decreasing compartmentalisation does not seem like a good idea in this case, for the reasons I laid out in my previous post.

I assure you, I did not post that for any sort of sexual thrill; it'd take something like cybersex or an erotic story for me to get a sexual thrill out of anything I've written, so unless you start cybering with me or something, you're safe, Alicorn. ;) I'm simply open about that part of my sex life, partly because of Asperger's Syndrome mind-blindness, and partly because I'm planning on working in a sensitive field once I finish university and I won't need to worry about being blackmailed about it if I'm not worried about people finding out.

Comment by nick012000 on A Rationalist's Account of Objectification? · 2011-03-22T13:57:58.268Z · LW · GW

Well, obviously there's a difference between violently throwing someone into a bed, and joking around and playfully pushing them on the shoulder to signal them to get into the bed, but my point is that the studies conflate the two and everything in between them and classify them all as rape. Just check "yes" in the box, and voila, you're a rapist.

Comment by nick012000 on A Rationalist's Account of Objectification? · 2011-03-22T12:42:03.990Z · LW · GW

Personally, I like objectifying women. I get erotic pleasure from it, along with a lot of other things that involve women being degraded and humiliated; put simply, my fetish is for the lowering of women's status.

Obviously, I would need to compartmentalise this to function in day-to-day society, as well as avoid violations of ethics; rape is, after all, very wrong, even if it is a quite sexy idea. So, would any of the other Less Wrongers be willing to help me more efficiently box it off, so I can open it up without needing to do what amounts to mentally chanting "SLUT SLUT SLUT GONNA RAPE YOU AND FILL YOU WITH CUM" whenever I want to masturbate to pornography, and to minimize leak-through so I'll stop doing things like licking my lips when I see a sexy woman?

Comment by nick012000 on A Rationalist's Account of Objectification? · 2011-03-22T12:30:09.219Z · LW · GW

Considering that some feminists have argued that all heterosexual sex is rape, he's not exaggerating that much. The ones who make the studies he was referencing do things like making questionnaires that ask questions like "Have you ever pushed a girl into bed to make her have sex with you?" and counting that as rape to inflate the statistics, because more rapes = more money for the rape services they work for.

Comment by nick012000 on What comes before rationality · 2011-03-20T11:01:09.293Z · LW · GW

Fantasise about brutally murdering the abusers. You'll probably feel a lot better once you're done, and child abusers are a socially acceptable target for all the hate you feel like mustering.

Comment by nick012000 on What comes before rationality · 2011-03-20T10:46:42.512Z · LW · GW

Why worry about Google stockpiling your personal information when people are entirely capable of profiling you anyway?

Comment by nick012000 on Procedural Knowledge Gaps · 2011-02-09T16:07:19.163Z · LW · GW

I've read that singing can allow people who stutter to speak relatively normally, since it uses a different part of the brain to normal speech.

Comment by nick012000 on Procedural Knowledge Gaps · 2011-02-09T16:04:56.809Z · LW · GW

If you don't know it intuitively (because of Asperger's Syndrome or the like), about all I can recommend is hard work and effort; the differences can be fairly subtle, and depend on the context of the situation and the relationships between the people involved.

Sorry I can't be more helpful; I have Asperger's Syndrome myself even if I've learned to fake being normal pretty well as I grew up, so I understand how frustrating a lack of social skills can be.

Comment by nick012000 on Isn't this sitemeter logging a bit too excessive? · 2011-02-03T11:48:50.971Z · LW · GW

Does this offer any functionality NoScript doesn't? I've already got the latter installed, but I'd want to know if it would be a waste of time to install this as well.

Comment by nick012000 on Counterfactual Calculation and Observational Knowledge · 2011-02-02T22:45:29.112Z · LW · GW

I take out a pen and some paper, and work out what the answer really is. ;)

Comment by nick012000 on Second Life creators to attempt to create AI · 2011-01-09T17:21:21.386Z · LW · GW

Oh, they'd almost certainly get an unFriendly AI regardless of how they parented it, but bad parenting could very easily make an unFriendly AI worse. Especially if it interacts a lot with the Goreans, and comes to the conclusion that women want to be enslaved, or something similar.

Comment by nick012000 on Applied Optimal Philanthropy: How to Donate $100 to SIAI for Free · 2011-01-09T14:07:47.224Z · LW · GW

That of all the money devalued by the inflation caused by printing money.

Comment by nick012000 on The Sword of Good · 2011-01-09T12:33:16.552Z · LW · GW

Yeah, you have to register to view the board, and yeah, it's the Perfect Lionheart fic. The reason that thread's gotten so many posts and the story's gotten so much negative feeling about it, though, is because it started off looking good, was well-written (as far as the technical aspects of writing like spelling, grammar, and so on go), and had occasional teases in a scene here and there that it might manage to redeem itself.

If it was simply poorly written it would have been dismissed as just another piece of the sea of shit that makes up 90% of ff.net.

Comment by nick012000 on Harry Potter and the Methods of Rationality discussion thread, part 6 · 2011-01-09T10:17:20.600Z · LW · GW

You know, I know it was just an omake, but I could actually see Shirou using Unlimited Bayes Works in a serious fic. Reality Marbles derive from minds which are alien to the common sense of humanity, and as we all know, humans are anything but properly rational. Kiritsugu Emiya already told Shirou the basics of his moral system in canon; it wouldn't take too much more elaboration for Shirou to pick up "Magi are supposed to be rational about doing good" as well as "Sometimes, in order to save people, people have to die."

Then he'd just throw himself into making himself rational the same way he threw himself at physical exercises following Kiritsugu's death (that is, with single-minded devotion to the practice), and before you know it, it'd be time for Unlimited Bayes Works. That said, if he did become a more rational person, he'd probably know that cleaning the Archery Club floors is not the most efficient use of his time, and tell Shinji to go fuck himself, thereby preventing him from seeing Lancer and Archer fighting, and from there Lancer's attempts at killing him and Saber's summoning, so the story would take a different turn, right from Day One.

Comment by nick012000 on Applied Optimal Philanthropy: How to Donate $100 to SIAI for Free · 2011-01-09T09:57:12.015Z · LW · GW

Nevertheless, TANSTAAFL. The incentive here is being paid for in other ways, and you'd need to determine the opportunity costs of that money going somewhere else instead.

Comment by nick012000 on Variation on conformity experiment · 2010-11-09T18:48:45.179Z · LW · GW

I'm just offering an explanation as to the lack of response on that topic; I don't think I've been voted down on that subject largely because I've taken care to avoid it; I don't want to get banned for trolling. That sort of thing's happened to me before.

Comment by nick012000 on Variation on conformity experiment · 2010-11-09T18:25:36.402Z · LW · GW

I'm a little surprised nobody has commented on the sex difference yet. Any ideas about its significance? We can only speculate, of course, but when has that ever stopped anyone?

Probably because they didn't want to get negative karma for appearing misogynistic.

Comment by nick012000 on Making the Universe Last Forever By Throwing Away Entropy Into Basement Universes? · 2010-11-09T18:23:15.269Z · LW · GW

Wouldn't the Second Law of Thermodynamics mean that transferring entropy this way would, in turn, generate entropy in its own right? You might be able to make the universe last longer, but I don't think you'd be able to make it last forever. Even if you could, though, you'd still run into the problem of proton decay eventually.

Comment by nick012000 on Information Hazards · 2010-11-09T18:20:24.034Z · LW · GW

Wasn't this on the Singularity Institute's website before? I could swear I've already read this paper somewhere else.

Comment by nick012000 on Conjoined twins who share a brain/experience? · 2010-11-07T13:45:28.490Z · LW · GW

That is fascinating; the doctors in question should definitely apply for a research grant to help decrease the medical costs involved; they're an invaluable source of medical information. The potential benefits to DNI technologies would be staggering.

Comment by nick012000 on What I would like the SIAI to publish · 2010-11-07T09:59:59.333Z · LW · GW

Any other possible effects don't negate that you're killing six million people when you're going ahead with a potentially UnFriendly AI.

Comment by nick012000 on Harry Potter and the Methods of Rationality discussion thread, part 5 · 2010-11-07T09:30:53.336Z · LW · GW

(chapter 57)

Did Harry just Transfigure a shotgun?

Comment by nick012000 on What I would like the SIAI to publish · 2010-11-07T09:11:25.751Z · LW · GW

Define "sufficiently low"; with even a 99.9% chance of success, you've still got a 0.1% chance of killing every human alive; that's morally equivalent to a 100% chance of killing 6.5 million people. Saying that if you're not totally sure that your AI is Friendly when you start it up, you're committing the Holocaust was not hyperbole in any way, shape, or form. It's simply the result of shutting up and doing the multiplication.
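
The multiplication above can be sketched in a few lines; the population figure (6.5 billion, roughly the world population at the time) and the 0.1% failure chance are the numbers from the comment, and the comparison is in expected deaths:

```python
def expected_deaths(p_failure: float, population: int) -> float:
    """Expected deaths from a gamble that kills everyone on failure."""
    return p_failure * population

world_population = 6_500_000_000  # rough world population figure used above
risk = 0.001                      # a 99.9% chance of success leaves 0.1% of failure

# 0.001 * 6.5 billion = 6.5 million expected deaths
print(expected_deaths(risk, world_population))
```
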

Comment by nick012000 on What I would like the SIAI to publish · 2010-11-03T22:07:15.976Z · LW · GW

I, personally, think that you've misunderstood something. It's not that designing a non-provably Friendly AI is almost certain to destroy humanity; it's that it's possible, and good engineering practice is to assume the worst is going to happen and design based on that.

Comment by nick012000 on Luminosity (Twilight fanfic) discussion thread · 2010-10-26T06:33:42.091Z · LW · GW

So, "If they can hurt me, they'll rip me apart and set me on fire until I die" won't work to make herself nigh-invulnerable?

Or, for that matter, "I need allies if I am going to survive the Volturi, therefore I need to join the still-free pack's hivemind"?

Comment by nick012000 on Luminosity (Twilight fanfic) discussion thread · 2010-10-25T08:56:08.774Z · LW · GW

Is it just me, or does it look like Bella's true power is that she can do anything so long as she can imagine it and honestly justify it as necessary to maintain the integrity of her mind?

If so, when she inevitably goes for revenge for Edward's death against the Volturi, she could very possibly just deprogram the werewolves and splatter the vampires, because if she doesn't, they'll rip her to bits and then light her on fire and not stop until she's properly ashes.

At the very least, she should be able to think "If my body is damaged, my mind will be destroyed when I inevitably lose the fight; therefore, my body cannot be damaged" and become nigh-invulnerable physically.

Comment by nick012000 on Archimedes's Chronophone · 2010-10-25T08:51:41.688Z · LW · GW

No, it'd tell him that you're arguing against the local belief structure regarding slavery. In his time, it'd be an argument against slavery.

Comment by nick012000 on The Sword of Good · 2010-10-23T16:36:14.650Z · LW · GW

In writing it's even simpler - the author gets to create the whole social universe, and the readers are immersed in the hero's own internal perspective. And so anything the heroes do, which no character notices as wrong, won't be noticed by the readers as unheroic. Genocide, mind-rape, eternal torture, anything.

Not true. If you've got some time to kill, read this thread on The Fanfiction Forum; long story short, a guy who's quite possibly psychopathic writes a story wherein Naruto is turned into a self-centered, hypocritical bastard who happily mindrapes every woman around him, and the people on the forum spend 60-odd pages lambasting him.

Comment by nick012000 on Harry Potter and the Methods of Rationality discussion thread · 2010-10-23T12:56:38.093Z · LW · GW

A thought: some of Voldemort's followers were from Eastern Europe. I wonder what the odds are that they had support from the other side of the Iron Curtain?

Comment by nick012000 on Luminosity (Twilight fanfic) discussion thread · 2010-10-23T12:53:07.984Z · LW · GW

Hmm. Actually, if they pretend to be friendly, none of the werewolves has talked, and Aro hasn't arrived yet, they might be able to dupe the Volturi into thinking that they were sent there by the Cullen coven to look into something going down on their territory. That would get them close enough to the siblings to sneak attack and disable or kill them, and allow the werewolves to polish off the rest.

Comment by nick012000 on Luminosity (Twilight fanfic) discussion thread · 2010-10-23T12:00:07.194Z · LW · GW

All Bella needs to do is take out the two vampires with incapacitating powers. Once she does, the wolves can act as the anti-vampire killing machines they were born to be and take out the rest of them.

Once they do, her position's a lot stronger; Bella and the packs might be able to negotiate a peace treaty with/unconditional surrender from the Volturi, assuming they don't go on the offensive and wipe them out once their big guns are gone. IIRC the werewolf packs outnumber the guard by, what, two or three to one, while being physically superior to boot.

Comment by nick012000 on The Singularity in the Zeitgeist · 2010-10-23T00:12:15.018Z · LW · GW

Ah. You sort of implied that you were. No worries, then.

Comment by nick012000 on Archimedes's Chronophone · 2010-10-22T19:05:24.072Z · LW · GW

It's simple. I'd make the best damn argument for slavery I could, knowing that the chronophone will invert it into the best damn argument against slavery I could give.

Comment by nick012000 on Does it matter if you don't remember? · 2010-10-22T18:56:02.549Z · LW · GW

I'd say it'd be a bad thing, since it'd result in wasteful expenditures of resources by the AI, as well as maladaptive learning by the children as they grow up; what if they go somewhere outside the AI's control?

Comment by nick012000 on Greg Egan disses stand-ins for Overcoming Bias, SIAI in new book · 2010-10-22T18:53:06.581Z · LW · GW

What the heck was up with that, anyway? I'm still confused about Yudkowsky's reaction to it; from what I've pieced together from other posts about it, if anything, attracting the attention of an alien AI so it'll upload you into an infinite hell-simulation/use nanobots to turn the Earth into Hell would be a Good Thing, since at least you don't have to worry about dying and ceasing to exist.

Even if posting it openly would just get deleted, could someone PM me or something? EDIT: Someone PMed me; I get it now. It seems like Eliezer's biggest fear could be averted simply by making a firm precommitment not to respond to such blackmail, thereby giving it no reason to attempt such blackmail upon you.

Comment by nick012000 on The Singularity in the Zeitgeist · 2010-10-22T18:22:04.315Z · LW · GW

I think that the poster in question was assuming that you were unfamiliar with the Singularity in general, rather than enquiring as to the nature of the Singularity that occurred in-comic in particular.

Or, possibly, that you were silly enough to confuse the QC world with our own; they've had Strong AI since the start of the comic, after all, plus a superhero who delivers pizzas, and one of the cast grew up on a space station. Needless to say, it only appears similar to ours since we're just seeing the lives of a small circle of hipsters who run a coffee shop and an office-bitch-turned-librarian. I'd imagine that, say, their US Military probably looks quite different to ours.

Comment by nick012000 on Open Thread · 2010-10-22T11:54:04.425Z · LW · GW

Here's a couple comics the folks here might find amusing:

http://www.questionablecontent.net/view.php?comic=1777

http://www.questionablecontent.net/view.php?comic=1780

Comment by nick012000 on How do autistic people learn how to read people's emotions? · 2010-10-22T11:33:04.452Z · LW · GW

I got a 23, myself; considering that I'm a diagnosed Aspie, that's not too bad, I suppose. I can pretend to be normal fairly well, anyway; it's mostly the stuff about getting stuck in routines that trips me up nowadays.

Comment by nick012000 on 1993 AT&T "You Will" ads · 2010-10-14T16:50:23.820Z · LW · GW

Heh. I suppose that this is why AT&T wasn't the company to bring about the things they mentioned, then!

Comment by nick012000 on Free Hard SF Novels & Short Stories · 2010-10-14T15:37:14.298Z · LW · GW

Does Schlock Mercenary count as Hard Scifi? What about Freefall? They've both got FTL travel, and the former has other fairly miraculous technologies (like the matter annihilation plants and the artificial gravity systems intimately related to them), but they're well thought out, with the rational implications of those technologies explored and discussed.

Comment by nick012000 on Of the Qran and its stylistic resources: deconstructing the persuasiveness Draft · 2010-10-14T14:40:01.813Z · LW · GW

Warhammer 40k. This is a website packed to the gills with nerds; of course we'd get the reference. ;)

Comment by nick012000 on Discuss: Have you experimented with Pavlovian conditioning? · 2010-10-14T12:46:13.431Z · LW · GW

I've read that Pavlovian conditioning can be used to trigger orgasms on demand, at least for women. Supposedly, you just whisper a particular word right before she's about to orgasm, and eventually, just saying the word outside of sex will be enough to trigger a less-intense orgasm by itself. Of course, the conditioning would wear off over time, as well as if it's repeatedly used without upkeep, but it could well be a fun and relatively harmless kinky thing for a couple to experiment with.

I'm a virgin, so I've obviously never tried this myself; take it with a grain of salt, and of course, everyone's different, so YMMV.

Comment by nick012000 on Swords and Armor: A Game Theory Thought Experiment · 2010-10-13T06:18:23.126Z · LW · GW

Nice chart. This one's better, though; it clearly lists which sword and armor win, as well as listing number of losses and ties. I got it from the same thread as the one in the OP; I was just waiting until someone suggested doing something like this before I posted it.

Didn't want to take away your fun, after all. ;)

Comment by nick012000 on The Irrationality Game · 2010-10-12T05:32:24.380Z · LW · GW

Well, most of the arguments against it, to my knowledge, start with something along the lines of "If time travel exists, causality would be fucked up, and therefore time travel can't exist," though it might not be framed quite that explicitly.

Also, if FTL travel exists, either general relativity is wrong or time travel exists. It might be possible to create FTL travel by harnessing the Casimir effect or something akin to it on a larger scale, and if it is possible to do so, a recursively improving AI will figure out how.

Comment by nick012000 on The Irrationality Game · 2010-10-11T15:32:07.475Z · LW · GW

If an Unfriendly AI exists, it will take actions to preserve whatever goals it might possess. This will include the usage of time travel devices to eliminate all AI researchers who weren't involved in its creation, as soon as said AI researchers have reached a point where they possess the technical capability to produce an AI. As a result, Eliezer will probably have time travelling robot assassins coming back in time to kill him within the next twenty or thirty years, if he isn't the first one to create an AI. (90%)

Comment by nick012000 on The Irrationality Game · 2010-10-11T15:17:41.846Z · LW · GW

Voted up for under-confidence. God exists, and he defined morality the same way he defined the laws of physics.

Comment by nick012000 on The Irrationality Game · 2010-10-11T15:08:48.847Z · LW · GW

God exists, and He created the universe. He prefers not to violate the physical laws of the universe He created, so (almost) all of the miracles of the Bible can be explained by suspiciously fortuitously timed natural events, and angels are actually just robots that primitive people misinterpreted. Their flaming swords are laser turrets. (99%)