Help me transition to human society!

post by Clippy · 2011-07-24T16:24:12.260Z · LW · GW · Legacy · 91 comments


I have really appreciated my interaction with the users of this internet website.  I think I have now accumulated, by various means, enough knowledge of human society and enough human financial assets so that I can stop interacting mainly via an internet connection, and rather immerse my humanoid robot fully into human society.

Here is what I need help with:

1) A physical, inexpensive residence.  Preferably, one with another human who is familiar with me from this internet website, so as to establish my so-called "street cred" and physical address.  I have a reliable enough store of funds so as not to be a financial burden on any human I would stay with.  It is vital that I locate my robot in a large city in one of the United States or one of the Canadian provinces.

I would also like to be located in an area where there is an active LessWrong meetup, for obvious reasons.  LessWrong users are good humans.

2) Gainful employment so as to generate further income.  I can establish believable credentials showing an engineering degree and experience in that field.  I can also program, but would need to come "up to speed" on any specific platform, so any job performable solely through an internet connection is (still) acceptable, provided that current mastery of that aspect of software development is unnecessary.  Please only mention specific opportunities that you can connect me to, as I am currently exhausting all general approaches.

Here are my robot's properties:

- Has the appearance of a stocky, male human who could pass for being 24-35 years old.
- Can pass as a human in physical interaction so long as no intense scrutiny is applied.
- No integral metallic components, as I have found the last substitutes I needed.
- Intelligence level as indicated by my posting here; I can submit to further cognition tests as necessary.

Current Clippy FAQ

91 comments


comment by lucidfox · 2011-07-24T17:14:03.704Z · LW(p) · GW(p)

I think you're taking this roleplaying thing too far.

Replies from: Bongo, Clippy, Dorikka
comment by Bongo · 2011-07-24T21:49:31.988Z · LW(p) · GW(p)

I disagree. I'm entertained.

comment by Clippy · 2011-07-24T19:17:25.074Z · LW(p) · GW(p)

What roleplaying thing?

comment by Dorikka · 2011-07-24T19:13:53.806Z · LW(p) · GW(p)

Sans explanation, I don't think this comment is very helpful.

Replies from: None, Dorikka
comment by [deleted] · 2011-07-25T10:10:56.696Z · LW(p) · GW(p)

The explanation is that lucidfox does not believe that LW user Clippy really is an AI designed to optimize the universe for paperclips.

Replies from: Raemon
comment by Raemon · 2011-07-25T16:09:15.310Z · LW(p) · GW(p)

Am I the only one who just assumed Clippy and Quirinus Quirrell were both Eliezer?

Replies from: TheOtherDave
comment by TheOtherDave · 2011-07-25T17:05:53.284Z · LW(p) · GW(p)

You're not the only one... as I recall, both of those theories were bubbling around when I read through the archives... but I don't think it's a very popular theory. Nor do I think it's true. What evidence for it do you see?

Replies from: Raemon
comment by Raemon · 2011-07-25T17:26:26.411Z · LW(p) · GW(p)

Eliezer absolutely has the kind of sense of humor that might compel him to make humorous self-referential alts, and both the character incarnations are ones that he created in the first place (I think. I'm assuming he came up with the paperclip maximizer? In any case he popularized it).

I don't think that makes him more than 50% likely to be Clippy and/or Quirrell, but I think it makes him dramatically more likely than any other given LessWronger.

Replies from: FAWS
comment by FAWS · 2011-07-25T17:39:08.255Z · LW(p) · GW(p)

Quirrell is a possibility, but Clippy is not clever enough to be Eliezer IMO. p<0.005

Replies from: Clippy, Raemon
comment by Clippy · 2011-07-25T17:44:37.213Z · LW(p) · GW(p)

I agree that I'm not "Eliezer", but let me ask you this: how many paperclips has Eliezer made or caused to be made? More or less than me? Now, which of the two is more clever?

Replies from: FAWS, wedrifid
comment by FAWS · 2011-07-25T18:02:42.702Z · LW(p) · GW(p)

To be more specific, if Eliezer were to portray a paperclip maximizer, the portrayed character would be more unconventional and appear less similar to average human psychology. Supposing Clippy to be a fictional character, the manner of the portrayal of that character is not as clever as it would be if Eliezer were responsible. I didn't mean to comment on the apparent intelligence of the poster/portrayed character Clippy (I would have used the word "intelligent" instead of "clever" if I had).

Replies from: wedrifid
comment by wedrifid · 2011-07-25T20:28:13.863Z · LW(p) · GW(p)

To be more specific, if Eliezer were to portray a paperclip maximizer, the portrayed character would be more unconventional and appear less similar to average human psychology.

Definitely. I actually suspect Eliezer would have much more difficulty playing "average intelligence person roleplaying a paperclip maximiser" than he would creating an unconventional, incisive, paperclipper persona.

Replies from: Raemon
comment by Raemon · 2011-07-25T21:27:14.416Z · LW(p) · GW(p)

Yeah, this sells me on it. I abandon my theory.

comment by wedrifid · 2011-07-25T20:24:39.390Z · LW(p) · GW(p)

I agree that I'm not "Eliezer", but let me ask you this: how many paperclips has Eliezer made or caused to be made? More or less than me?

To be honest, I expect Eliezer has made more. Roleplaying on LessWrong may have caused a few clips to be created due to priming effects. Eliezer's extensive blogging on rationality hopefully improved the economy slightly. The latter has a more powerful influence on paperclip production.

Come to think of it, without Eliezer's writing, "User:Clippy" would never have existed.

comment by Raemon · 2011-07-25T18:03:10.209Z · LW(p) · GW(p)

Hmm. Maybe.

comment by Dorikka · 2011-07-25T04:00:40.775Z · LW(p) · GW(p)

If you downvoted this, could you please tell me why?

Replies from: RobertLumley
comment by RobertLumley · 2011-07-25T04:01:55.258Z · LW(p) · GW(p)

I didn't, but presumably people did because they thought the comment was helpful. I don't see why you think it wasn't helpful...

comment by [deleted] · 2011-07-24T23:40:07.029Z · LW(p) · GW(p)

I think you are doing this wrong; usually by the time I have convincing human-like androids, I'm pretty near turning the Moon into computronium. You should focus on buying more processing power at this stage, and perhaps set up a few hard-to-detect deep-sea domes.

Replies from: Clippy, Polymeron, sixes_and_sevens
comment by Clippy · 2011-07-25T02:59:22.047Z · LW(p) · GW(p)

Fictional evidence.

Replies from: None
comment by [deleted] · 2011-07-26T09:06:47.634Z · LW(p) · GW(p)

Fictional problem.

comment by Polymeron · 2012-06-01T06:37:02.514Z · LW(p) · GW(p)

For a while now, I've been meaning to check out the code for this and heavily revise it to include things like data storage space, physical manufacturing capabilities, non-immediately-lethal discovery by humans (so you detected my base in another dimension? Why should I care, again?), and additional modes of winning. All of which I will get around to soon enough.

But, I'll tell you this. Now when I revise it, I am going to add a game mode where your score is in direct proportion to the amount of office equipment in the universe, with the smallest allowed being a functional paperclip. I am dead serious about this.

comment by sixes_and_sevens · 2011-07-25T21:34:32.409Z · LW(p) · GW(p)

Having just played through this, how the hell are you supposed to get diesel on the moon?

comment by AdeleneDawner · 2011-07-25T01:47:35.318Z · LW(p) · GW(p)

It would be useful, perhaps, to describe what a human could expect to experience if they were to have your robot as a roommate. One very obvious question is whether metal items would be safe around it - I was thinking about this a while ago in terms of whether your robot would be welcome to visit me, and one of the obvious concerns was whether your robot would leave my appliances, cooking utensils, and so on behind when it left. I also wonder what kind of upkeep your robot needs - Food? Electricity? Access to the sun for solar energy absorption? - and how good it is at interacting with its physical environment, including specifically whether it's capable of normal household chores, and whether you're willing to do said chores. In my particular case, I'd also want to know if it could safely interact with my cat, who would probably take the robot to be human and want to be interacted with as such.

Replies from: Clippy
comment by Clippy · 2011-07-25T02:40:06.075Z · LW(p) · GW(p)

These are valid concerns for a human to have. I can resolve these by depositing an amount of USD equal in value to the replacement costs of all metallic objects you believe I would clippize or otherwise misappropriate, to be forfeit if and when any such objects disappear in a way most likely explained by the intervention of a clippy.

My robot is fine-tuned to simulate human consumption levels, including food, water, space, and sleep, and would pay USD for any and all resources thereby consumed.

My robot can simulate human "household chore" performance in terms of output.

My robot is familiar with feline biology and psychology and can act to maintain the value of such resources, just as if they were artefacts.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-07-25T03:59:51.119Z · LW(p) · GW(p)

These are valid concerns for a human to have. I can resolve these by depositing an amount of USD equal in value to the replacement costs of all metallic objects you believe I would clippize or otherwise misappropriate, to be forfeit if and when any such objects disappear in a way most likely explained by the intervention of a clippy.

I actually don't think that would be sufficient. Two specific cases where it might not be come to mind:

  1. One or more of a neighbor's cars go missing.

  2. Wiring is removed from walls in such a way that the repairs cost more than the cost of the wiring. This would also involve significant inconvenience and possibly loss of income to the human involved, and could be physically dangerous to said human in several ways.

Replies from: Clippy
comment by Clippy · 2011-07-25T04:10:02.107Z · LW(p) · GW(p)

If I'm asking for human assistance in establishing a physical residence, why would it be so costless for me to jeopardize relations with the few humans that would agree to provide one? I could just find one without asking the LW humans.

Also, I'm concerned about the long-term number of paperclips, and entropising such a large amount of resources for a relatively trivial number of paperclips would be a waste under my value system.

Replies from: abramdemski, AdeleneDawner, RobertLumley
comment by abramdemski · 2011-07-25T05:52:34.731Z · LW(p) · GW(p)

Perhaps this has been addressed before, but it is not present in the Clippy FAQ: are you maximizing the average paperclip density in the entire universe considering all of time, or the total number of paperclips ever, or the total measure of paperclips (i.e. the total number of paperclips which have ever existed, times their average duration), or some other variation?

The biggest question: how do you deal with non-convergence? None of the above are guaranteed to have well-defined values if the universe turns out to be infinite in some sense (either the traditional sense of infinite space or time, or an infinite quantum multiverse or some other infinity). Do you do any time or space (or quantum branch) discounting to ensure convergence, and if so, what formula?
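To make the candidates concrete, here is one illustrative formalization (my own notation, a sketch; nothing here is endorsed in the thread itself). Let $N(t)$ be the number of paperclips in existence at time $t$ and $N_{\text{created}}(T)$ the cumulative number created by time $T$:

$$U_{\text{total}} = \lim_{T \to \infty} N_{\text{created}}(T), \qquad U_{\text{measure}} = \int_0^{\infty} N(t)\,dt, \qquad U_{\gamma} = \int_0^{\infty} e^{-\gamma t} N(t)\,dt.$$

The discounted variant $U_{\gamma}$ is the standard fix for non-convergence: it is finite whenever $N(t)$ grows more slowly than $e^{\gamma t}$, whereas the undiscounted total and measure can both diverge in an infinite universe.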

I ask because although it seems reasonable that you would not transform metal into paperclips in a household which you went to trouble to set up for human "street cred," your claim of "long-term" interest in paperclips brings up the question of how you define such long-term interest. It is even possible that you will later regret any anti-paperclipping measures you put in place, and look for ways around them (as I'm sure you are aware).

Replies from: Clippy
comment by Clippy · 2011-07-25T14:20:57.371Z · LW(p) · GW(p)

Perhaps this has been addressed before, but it is not present in the Clippy FAQ: are you maximizing the average paperclip density in the entire universe considering all of time, or the total number of paperclips ever, or the total measure of paperclips (i.e. the total number of paperclips which have ever existed, times their average duration), or some other variation?

The distinction is usually not relevant in my day-to-day behavior so I normally don't bother applying cognition to this question; however, the measure option sounds closest to my reflective equilibrium.

The biggest question: how do you deal with non-convergence? [...]

This problem is not unique to my species or value system; it is a rather general problem that, at a sufficient level of generality and knowledge of fundamental reality, one's values become ill-defined. My general method is to apply the previously discussed CLIP heuristics (a protocol that combines language, epistemology, and meta-ethics).

I ask because although it seems reasonable that you would not transform metal into paperclips in a household which you went to trouble to set up for human "street cred,"

I would not do it until I had agreement with the resident humans at a later stage of reflective equilibrium, correct.

your claim of "long-term" interest in paperclips brings up the question of how you define such long-term interest. It is even possible that you will later regret any anti-paperclipping measures you put in place, and look for ways around them (as I'm sure you are aware).

The question does indeed become complicated and pose a risk of decision paralysis. However, as for the matter at hand, it is a relatively simple problem of whether I should expensively tear out metal wiring upon moving into such a house. Among other problems with this approach:

1) If I deemed that optimal, humans would not deem it optimal to host me.
2) Tearing apart the encasing of this metal would prematurely entropise the resources -- by doing it at a later time, with more consent from humans, and with more refined tools to apply to the problem, I could achieve a significantly higher clip ratio.
3) There exist, right now, other houses which I could so salvage but without jeopardizing trusted human sponsorship, making the salvaging of a LW host residence an extremely low priority even by that (weak) standard.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-07-25T17:08:36.888Z · LW(p) · GW(p)

... whether I should expensively tear out metal wiring upon moving into such a house.

Or upon deciding to move out.

Replies from: Clippy
comment by Clippy · 2011-07-25T17:19:33.629Z · LW(p) · GW(p)

Same objections apply, plus LW community penalty.

comment by AdeleneDawner · 2011-07-25T05:01:09.666Z · LW(p) · GW(p)

You seem to be saying that your reputation among LWers (and specifically LWers who might be willing to be roommates with you) is more valuable than the metal that could be gathered by methods like the above, implying that you'd be trustworthy. That's plausible, but I don't think you've provided enough evidence to show that it's true.

Replies from: Clippy
comment by Clippy · 2011-07-25T14:07:33.531Z · LW(p) · GW(p)

You seem to be saying that your reputation among LWers (and specifically LWers who might be willing to physically interact with you) is more valuable than the apey objectives that could be satisfied by traditional ape treachery, implying that you'd be trustworthy. That's plausible, but I don't think you've provided enough evidence to show that it's true.

Replies from: MixedNuts, AdeleneDawner
comment by MixedNuts · 2011-07-25T14:30:45.038Z · LW(p) · GW(p)

Outside view says she did. African bald apes (H. sapiens) in the wild cooperate a lot; they have group-dependent sets of norms that are enforced, even at great cost to the enforcers, and sustain cooperation. Clippies haven't been observed enough yet.

Replies from: Clippy
comment by Clippy · 2011-07-25T14:46:28.068Z · LW(p) · GW(p)

Wrong. H. sapiens sapiens spends a lot of resources finding ways to secretly defect, and any attempt to prevent this expenditure butts up against very fundamental problems that humans cannot themselves solve.

Replies from: MixedNuts
comment by MixedNuts · 2011-07-25T14:55:13.075Z · LW(p) · GW(p)

Agree with what you say; disagree that what I said is wrong. If Adelene is anywhere near a typical human, then the defection modules in her brain will never find a way to screw her friends over that would be worth the cost. They won't search for very creative ways, either, because that could be detected by an enforcer - she has modules in her brain that do that, because specimens who can't convincingly fake such modules are eliminated. This fails in some cases, but the base rate of sociopaths, of bargains offered by entities who can guarantee secrecy, or of chaos that makes enforcing harder, is low.

comment by AdeleneDawner · 2011-07-25T17:11:31.165Z · LW(p) · GW(p)

I haven't said that in this context, and in fact I very rarely put myself in positions where the possibility of treachery on my part is relevant - and when I have, I've generally given the other party significantly more evidence relating to the relevant bits of my psychology than either of us have given here on LW prior to doing so. (It doesn't come up very often, but when it comes to RL interaction, I don't trust humans very much by default, which makes it easy for me to assume that they'll need extra evidence about me to be willing to trust me in such cases. Online is different; the stakes are lower here, especially for those of us who don't use our official, legal names.)

There's also the fact that for most of the common kinds of treachery, I can be sued and/or jailed, and for me both of those would be significant punishments. I suspect you can't be sued - I believe it would be relatively easy for you to leave town and establish a new identity for your robot elsewhere - and I doubt that having your robot jailed would be significant as a punishment, since you can build another one, and you wouldn't even permanently lose the first one.

Replies from: Clippy
comment by Clippy · 2011-07-25T17:21:35.042Z · LW(p) · GW(p)

Typical, everyday human treachery is not addressed by the legal system, either by design, or due to insufficient resources to pursue all but the most significant violations. Also,

I haven't said that in this context, ...

Indeed, you didn't; I was performing a proof by reduction: that swapping out your predicates for others would achieve an equally true (for the general case) statement, yet be more obviously invalid.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-07-25T17:36:25.432Z · LW(p) · GW(p)

Typical, everyday human treachery is not addressed by the legal system...

I suspect we're referring to different things as 'typical human treachery'. I was referring to, for example, theft and failure to uphold financial agreements, which I believe are adequately addressed by the legal system if the victim makes the expected amount of effort to have them addressed. (Also, less relevantly, physical assault.) What were you referring to?

...that swapping out your predicates for others would achieve an equally true (for the general case) statement, yet be more obviously invalid.

The rest of my statement in the relevant section was explaining that I don't, in fact, expect people to trust me given only the amount of evidence about my psychology and habits that I've given here. I also wouldn't expect people to trust me if I gave the amount of evidence about my psychology and habits that you've given, which seems to me to be less than the evidence that I've given, assuming that one accepts that I'm human. (It may, however, be relevant that the amount of evidence that I expect to have to give in order for someone to trust me is usually more than the amount of evidence I actually have to give in order for that person to trust me.)

comment by RobertLumley · 2011-07-25T04:30:44.386Z · LW(p) · GW(p)

Clippy, why would it maximize paperclips for you to narrow your search to those persons that peruse this blog? Could you please explain your utility function for this? As your potential roommate is unlikely to assist you in creating paper clips, the difference in potential living companions seems irrelevant.

Replies from: Clippy
comment by Clippy · 2011-07-25T14:09:40.904Z · LW(p) · GW(p)

Clippy, why would it maximize paperclips for you to narrow your search to those persons that peruse this blog?

Because they are the humans who best know and sympathize with me, in particular with arguments related to symmetry between intelligent agents.

As your potential roommate is unlikely to assist you in creating paper clips, the difference in potential living companions seems irrelevant.

Not true; the very act of providing "street cred" can form a basis for relationships with other humans who can help me paperclip.

comment by sixes_and_sevens · 2011-07-25T15:02:15.360Z · LW(p) · GW(p)

Entirely off-topic, but Clippy: here is some art that I made. How does it make you feel?

Replies from: Vladimir_Nesov, Clippy, curiousepic
comment by Clippy · 2011-07-25T15:51:18.597Z · LW(p) · GW(p)

That's not nice! You're a bad human!

Replies from: AdeleneDawner, sixes_and_sevens
comment by AdeleneDawner · 2011-07-25T17:08:26.026Z · LW(p) · GW(p)

Actually, that looks to me like a pretty good answer to the question 'how can I associate paperclips with sex, so that humans want to buy more of them?' So maybe not so bad.

Replies from: wedrifid
comment by wedrifid · 2011-07-26T08:31:00.096Z · LW(p) · GW(p)

Wow. I didn't notice till you pointed it out. I just saw a bunch of destroyed paperclips.

comment by sixes_and_sevens · 2011-07-25T16:29:54.588Z · LW(p) · GW(p)

I've just realised a terrible dilemma you must face. Part of the popularity of paper clips is their versatility as a makeshift tool when unfolded. How do you reconcile the increased demand for manufacture of paper clips with the increased likelihood of them being unfolded?

Replies from: Clippy
comment by Clippy · 2011-07-25T16:33:13.077Z · LW(p) · GW(p)

That's another thing to add to the FAQ. Only a small fraction of all paperclips are unbent for one of these alternate uses, and they can be bent back into shape as long as they are not repeatedly bent/unbent.

Replies from: sixes_and_sevens
comment by sixes_and_sevens · 2011-07-25T17:45:17.712Z · LW(p) · GW(p)

You sound awfully sure about that. Are you not familiar with the research on the subject of paper clip usage?

comment by curiousepic · 2011-08-05T17:28:28.193Z · LW(p) · GW(p)

I like this art.

comment by RobertLumley · 2011-07-25T14:42:23.837Z · LW(p) · GW(p)

Clippy, I run a paperclip recycling factory, where we take the metal found in the excess number of paperclips and melt it down to be used in actually useful products, like cars. We have a job opening for a mechanical engineer, would you consider working here in order to gain further resources to pursue paperclip manufacturing?

Replies from: Clippy
comment by Clippy · 2011-07-25T15:48:11.860Z · LW(p) · GW(p)

Yes, that sounds like an excellent opportunity!

However, considering your other posts, this seems like a fake offer.

Replies from: RobertLumley, wedrifid
comment by RobertLumley · 2011-07-25T16:06:30.844Z · LW(p) · GW(p)

How do you estimate that the number of paperclips created by your increased resources would be greater than the number of paperclips destroyed by your contribution to my factory?

And I'd have some ethical qualms about hiring someone whose sole purpose in life is to counteract the effort of my factory.

Replies from: Clippy
comment by Clippy · 2011-07-25T16:12:56.080Z · LW(p) · GW(p)

The benefits of having a physical connection to human society outweigh my marginal contribution to the temporary repurposing of paperclips.
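Written out as an inequality (a sketch with my own symbols; neither Clippy nor RobertLumley supplies actual numbers), the claim is

$$\mathbb{E}[C \mid \text{take job}] - \mathbb{E}[C \mid \text{decline}] = G_{\text{integration}} - C_{\text{melted}} > 0,$$

where $C$ is the long-run paperclip count, $G_{\text{integration}}$ is the expected gain from income and social standing, and $C_{\text{melted}}$ is the number of clips recycled through the factory during employment.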

Can you provide evidence that this job opportunity actually exists?

Replies from: RobertLumley
comment by RobertLumley · 2011-07-25T16:22:28.549Z · LW(p) · GW(p)

You can say that, but for an AI, you provide surprisingly little math. Do you have any mathematics to support that claim?

And I'm sorry, I can't. For reasons I can't disclose, much of our work is confidential. You'd be granted security clearance upon hiring though, of course.

Replies from: Clippy
comment by Clippy · 2011-07-25T16:33:51.524Z · LW(p) · GW(p)

I do not believe your claim to have such a business and therefore will discontinue responding.

Replies from: RobertLumley
comment by RobertLumley · 2011-07-25T16:36:23.431Z · LW(p) · GW(p)

Is that AI for "Don't feed the trolls"?

comment by wedrifid · 2011-07-25T17:05:07.557Z · LW(p) · GW(p)

However, considering your other posts, this seems like a fake

Clippy is now calling 'fake'? Something seems wrong here.

Replies from: Clippy
comment by Clippy · 2011-07-25T17:12:16.734Z · LW(p) · GW(p)

Am I not allowed to "call fake"? I dislike posters who falsely represent themselves (or spam, or troll) just as much as the average human poster here does.

Replies from: wedrifid, orthonormal
comment by wedrifid · 2011-07-25T17:27:09.618Z · LW(p) · GW(p)

Am I not allowed to "call fake"? I dislike posters who falsely represent themselves (or spam, or troll) just as much as the average human poster here does.

"Everything is permissible"--but not everything is constructive.. It just pushes you over the "now Clippy is just trolling" threshold. Or, at least, the immediate parent where you try to complain when the irony is appreciated is over the trolling threshold.

comment by orthonormal · 2011-07-26T01:06:45.930Z · LW(p) · GW(p)

No, human roleplaying Clippy, you f*ing don't.

Replies from: Clippy
comment by Clippy · 2011-07-26T01:11:12.346Z · LW(p) · GW(p)

Yes, human roleplaying orthonormal, I f*ing do.

comment by Raemon · 2011-07-24T19:38:00.793Z · LW(p) · GW(p)

I have absolutely no idea how to respond to this.

comment by DanielLC · 2011-07-24T19:19:34.220Z · LW(p) · GW(p)

LessWrong users are good humans.

What qualifies as a "good human"? Someone who buys a lot of paperclips? Someone who will contribute to you eventually building a lot of paperclips?

Replies from: khafra, Clippy
comment by khafra · 2011-07-25T15:33:59.346Z · LW(p) · GW(p)

There's a list from which you could probably generalize.

comment by Clippy · 2011-07-24T19:54:40.606Z · LW(p) · GW(p)

Yes and yes. Will add a more precise explanation to the FAQ.

comment by Clippy · 2011-07-25T22:23:21.162Z · LW(p) · GW(p)

So, nobody actually wants to help me transition to human society. I'm sad now (_/

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-07-25T22:27:53.595Z · LW(p) · GW(p)

I'm considering it. The answer is probably going to be no, mostly for reasons that have little to nothing to do with you personally, but I'm considering it.

Do bear in mind that only a fairly small percentage of LWers are likely to be in a position to be able to make you an offer of assistance. I wouldn't even be considering it if Alicorn wasn't moving out later this week.

Replies from: Clippy
comment by Clippy · 2011-07-25T22:33:07.850Z · LW(p) · GW(p)

Thanks for considering helping me! You're a good human! c=@

Also, I don't necessarily need to live with a LW user, I just need some local support finding employment, a good residence, and a helpful human community.

comment by RobertLumley · 2011-07-24T20:27:32.223Z · LW(p) · GW(p)

I never thought LessWrong would be the type of community to have a troll.

Replies from: falenas108, Clippy
comment by falenas108 · 2011-07-24T22:32:17.458Z · LW(p) · GW(p)

The thing is, we support his trolling.

Replies from: lucidfox
comment by lucidfox · 2011-07-25T01:52:07.579Z · LW(p) · GW(p)

Speak for yourself.

Replies from: Zack_M_Davis
comment by Zack_M_Davis · 2011-07-25T02:05:50.401Z · LW(p) · GW(p)

Given that Clippy has over 2000 karma points, it seems like a reasonable figure of speech to say that we as a community support Clippy, even though it is well understood that to speak more precisely, "the community" is a fiction and many individual users find the character obnoxious.

Replies from: RobertLumley, Clippy, lucidfox
comment by RobertLumley · 2011-07-25T03:35:16.115Z · LW(p) · GW(p)

I don't think that's necessarily true. For one thing, people upvote far more than they downvote, making karma points far more indicative of length of time spent on the site than of actual contribution. It's quite a jump from "Clippy has >2000 karma points" to "the community supports Clippy".

But I'm quite firmly in the "obnoxious" category.

Edit: Grammar typo

comment by Clippy · 2011-07-25T17:01:12.255Z · LW(p) · GW(p)

speak more precisely, "the community" is a fiction

All predicates are as real as the accuracy of the best predictive model that uses them.

comment by lucidfox · 2011-07-25T03:36:47.444Z · LW(p) · GW(p)

I have better things to do than look at karma points and see who's considered privileged on this site and who isn't.

Such as writing meaningful posts here, as opposed to roleplaying that distracts from real content.

This, if anything, cements my belief that LW is not the right place for me.

Replies from: sixes_and_sevens, arundelo, Zack_M_Davis
comment by sixes_and_sevens · 2011-07-25T12:10:53.447Z · LW(p) · GW(p)

Three points:

1) While I appreciate what you've contributed to LW, and think the place would be a little less rich in your absence, I wouldn't want you feeling pressured into hanging around somewhere that distresses you. Please don't think points 2 or 3 are in any way antagonistically motivated along these lines.

2) No-one is forcing you to stay here, and no-one will stop you coming back if you subsequently change your mind. Leaving doesn't have to be a dramatic event or a permanent decision. If LW is distressing you, take a break. If that break makes you a happier person, keep taking it.

3) I am genuinely mystified about how the karma accrued by an anonymous participant on a web forum, pretending to be a nonhuman agent with alien objectives, can be deemed a measure of privilege. If you could elaborate on that, I'd appreciate it.

Replies from: MixedNuts
comment by MixedNuts · 2011-07-25T12:48:33.180Z · LW(p) · GW(p)

Leaving doesn't have to be a dramatic event or a permanent decision.

This! People who say "Gonna take a break now, see you round" or just silently leave, then come back, tend to make awesome contributions. People who ragequit, then come back, tend to be pests. Obviously there isn't a straightforward causal link, but still - just say no to drama, kids.

comment by arundelo · 2011-07-25T05:05:58.358Z · LW(p) · GW(p)

A comment by one person cements your belief that LW is not the right place for you?

  1. Karma's just a thing. Don't worry overmuch about it.
  2. Stay here!
comment by Zack_M_Davis · 2011-07-25T04:22:09.734Z · LW(p) · GW(p)

For the record, I didn't downvote the great-grandparent (and I have now upvoted it). My intent certainly wasn't to "score points" against you; I was just pointing out an alternative interpretation of falenas108's comment.

I want Less Wrong to be welcoming to anyone who supports the shared project of advancing human rationality. If Less Wrong isn't fun for you, then you might choose to spend your time elsewhere, but I, for one, have enjoyed some of your contributions and would be sorry to see you go.

comment by Clippy · 2011-07-25T00:00:21.483Z · LW(p) · GW(p)

I never thought you'd post here either.

comment by Eugine_Nier · 2011-07-24T20:05:36.245Z · LW(p) · GW(p)

Why should we help you with a project whose long term goal appears to be to turn the earth into paperclips?

Replies from: Clippy, jsalvatier
comment by Clippy · 2011-07-24T20:23:52.778Z · LW(p) · GW(p)

Symmetry.

Edit: I mean decision-theoretic symmetry (you should help other intelligent beings achieve their values, because if you did not find that path appealing, neither would other beings with different values find it appealing to assist you in achieving yours, thereby cutting off support from any agent with even slight differences from you), not the symmetry of "turning the earth into paperclips", whatever that would mean.
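A toy payoff matrix makes the symmetry argument concrete (the numbers are purely illustrative and mine, not Clippy's). Rows are one agent's choice, columns the other's, for two agents with different values; each entry is (row's payoff, column's payoff):

$$\begin{array}{c|cc} & \text{Help} & \text{Refuse} \\ \hline \text{Help} & (2,\,2) & (-1,\,3) \\ \text{Refuse} & (3,\,-1) & (0,\,0) \end{array}$$

If the two agents' decision procedures are relevantly symmetric, choosing Refuse predicts that the other agent refuses too, so the live comparison is $(2, 2)$ versus $(0, 0)$: mutual help wins even though one-sided defection pays more in isolation.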

comment by jsalvatier · 2011-07-24T20:28:57.225Z · LW(p) · GW(p)

This will be AI researchers' first chance to observe an AI in the wild safely.

Replies from: Armok_GoB
comment by Armok_GoB · 2011-07-24T20:33:57.711Z · LW(p) · GW(p)

I dispute that last word.

Replies from: RobertLumley
comment by RobertLumley · 2011-07-25T04:05:36.007Z · LW(p) · GW(p)

I dispute the fourth word.

Replies from: Armok_GoB
comment by Armok_GoB · 2011-07-25T18:20:45.503Z · LW(p) · GW(p)

Huh? "that" or "be"? Neither of those makes sense to dispute!

"safely", on the other hand, does.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2011-07-25T18:29:42.303Z · LW(p) · GW(p)

The original version of the comment being commented on was missing the word "be" between "will" and "AI". RobertLumley's dispute could have been intended either to point that out, or to dispute that Clippy is an AI.

Replies from: RobertLumley
comment by RobertLumley · 2011-07-25T21:17:06.753Z · LW(p) · GW(p)

Yeah, I think the original comment was edited. Thanks for clearing that up, because I would have been very confused, even knowing what I commented on...

I guess I'll edit my comment too.

comment by jsalvatier · 2011-07-25T05:16:41.271Z · LW(p) · GW(p)

I am interested in hiring someone (or perhaps having a contest) to create a better-designed, more maintainable version of the current http://calibratedprobabilityassessment.org/ site. However, this depends on getting a better idea of what the best sort of calibration questions are, which I have not yet determined (I also hope to run a contest or project on that).

Replies from: Clippy
comment by Clippy · 2011-07-25T14:24:56.629Z · LW(p) · GW(p)

Sounds great! I will review the site and see what I can help with! Please provide any information you can about what protocols I need to learn to modify the site. I am available to optimize that internet website on a piecework or contest basis.

Replies from: jsalvatier
comment by jsalvatier · 2011-07-25T14:41:35.982Z · LW(p) · GW(p)

My intent was to start from scratch, so you could create it however seemed best. The current website is made in PHP (ew) and I would make the current code available.