Ideas wanted: democracy in an Em world

post by Stuart_Armstrong · 2013-06-06T14:05:01.730Z · LW · GW · Legacy · 64 comments

One person, one vote - a fundamental principle of our democratic government. But what happens in a world where one person can be copied, again and again?

That is the world described by Robin Hanson's "Em economics". Ems, or uploads, are human minds instantiated inside software, and hence can be copied as needed. But what is the fate of democratic government in such a world of copies? Can it be preserved? Should it be preserved? How much of it should be preserved? Those are the questions we'll be analysing at the FHI, but we first wanted to turn to Less Wrong to see the ideas and comments you might have on this. Original thoughts especially welcome!

To start the conversation, here are some of the features of idealised democracy (the list isn't meant to be exhaustive or restrictive, or necessarily true about real world democracies). Which of these could exist in an Em world, and which should?

EDIT: For clarification purposes, I am not claiming that democracies achieve these goals, or that these are all desirable. They are just ideas to start thinking about.

64 comments

Comments sorted by top scores.

comment by [deleted] · 2013-06-06T15:17:56.218Z · LW(p) · GW(p)

Democracy grants legitimacy to the government.

What? "legitimacy" is a memetic power-device (ie. weapon) to coerce people to psychologically accept the status quo, not a real desirable thing. Democracy only grants legitimacy to government if the relevant powers have been convinced (by use of other memetic tools) that it does.

Democracy is fair and egalitarian - each person has a single vote.

Why is this a good thing? Why do we need to raise power to a sacred status? I have no power over the actions of google, and yet I don't find myself the least bit inconvenienced by it. Further, I actually appreciate google not being a democracy, because it means other idiots can't interfere. Why should government be different? Why in the domain of government do I have to trust the 99% of the population that is less intelligent than me to govern my life?

(note that having equal votes is orthogonal to more directly valuable forms of egalitarianism, like having equal opportunity)

Democracy aligns the interests of the rulers with that of the ruled.

Not really. People often don't know their own interests, and aren't really capable of steering government coherently (see strategic voting, and note that it's game theory, not human interest, that drives most of the outcomes in democracy).

Further, democracy creates incentives for the powerful (i.e. the media, the education system) to "align the interests of the ruled with their own interests", i.e. manufacture consent. Hence psychological mechanisms like "legitimacy", "rights", and so on.

Democracy allows the competition of governing ideas.

Yup. Though "allows" is a bit weak. It encourages particular forms of disagreement and factionalism.

Democracy often leads to market economies, which generate large wealth.

Selection effects, correlation, causation, etc. Most rich countries are democracies, but most democracies are third-world hellholes. Maybe democracy only works in already wealthy, civilized countries and elsewhere leads to political violence? (See Africa.)

Democracy often leads to welfare states, which increase happiness.

I don't know much about this, but again, may be good to examine the exact causal linkage.

Democracy doesn't need to use certain coercive methods, such as restrictions on free speech, that other systems require to remain stable.

Democracy encourages ruling powers to tweak the beliefs of the ruled. Further, only certain forms of free speech are really allowed. You can't say certain things publicly without losing friends or even having internet vigilantes bring the hammer down on you. There are no explicit laws (except in Europe), but the coercion is still there.

Democracy stops a particular group from hanging on to power indefinitely, which can reduce corruption, inefficiency and excessive use of state power for private purposes.

Maybe. This could be seen in other lights. Also, the civil service is more or less permanent.

I don't strongly believe these criticisms of democracy, but please don't just take the religion you were raised with and run with it. A little bit of skepticism is appropriate when dealing with such choices (the entire future!). There are other systems that may be easier and more effective, especially with ems thrown in.

The success of democracy is not really based on design, but more on luck: it turned out to work as well as it does under the conditions in which it operates. Ems change enough that it may not be effective at all anymore, and not worth fixing.

Democracy is at best a form of theocracy that prevents violent political conflict (when installed correctly, causes it otherwise). "Theocracy" is not necessarily bad, but let's be realistic about how it works (everyone more-or-less religiously agrees on it). Once you see it like this, there are other possible theocracies, possibly better ones.

Replies from: Stuart_Armstrong, DanielLC
comment by Stuart_Armstrong · 2013-06-06T15:49:54.580Z · LW(p) · GW(p)

I don't strongly believe these criticisms of democracy, but please don't just take the religion you were raised with and run with it.

I am not advocating democracy; I'm listing some of the features claimed for it, in order to start people thinking. For the record, I think most of these claims are somewhat true, though only to a weak extent - but that's not relevant to this discussion.

Replies from: None
comment by [deleted] · 2013-06-06T20:04:30.509Z · LW(p) · GW(p)

Ok. Sorry for being non-constructive.

Perhaps the OP question is best framed as "how does the political/government situation change as a result of ems, and what could be done in the domain of government to ensure effective and valuable government?", and then go a bit more cautiously into "here are some things that people like about democracy" and "here are some other proposals (like an em-dictator modified to be non-power-grabbing, an em-dictator unaware of its ability to grab power, etc.)".

Many (good) things become possible when you can sandbox, copy, and so on that were inconceivable when the problem was how to get wealthy landowners to agree to pay taxes...

I tend to favor the AI-singleton-as-fast-as-possible solution to sidestep these issues, but I'll try to actually think about it as well.

comment by DanielLC · 2013-06-06T18:23:08.276Z · LW(p) · GW(p)

Corporations are held in check by the government so they don't actually hurt you. The only thing holding the government in check is democracy. It's far from perfect, but what other options do we have to keep the government from becoming arbitrarily corrupt?

Replies from: Baughn
comment by Baughn · 2013-06-07T16:02:22.548Z · LW(p) · GW(p)

Well, there's the singleton AI approach.

More generally, even at lower technology levels it may be possible to encode much of the functioning of the state in (open-source) software.

Replies from: DanielLC
comment by DanielLC · 2013-06-07T21:12:42.448Z · LW(p) · GW(p)

More generally, even at lower technology levels it may be possible to encode much of the functioning of the state in (open-source) software.

How does that help at all? Whoever controls the program is in charge. Do we decide it democratically? Do we have a dictator write it?

Replies from: Baughn
comment by Baughn · 2013-06-08T02:47:32.943Z · LW(p) · GW(p)

It helps for the same reason that encoding rules in books of law does.

comment by DanArmak · 2013-06-06T16:03:53.026Z · LW(p) · GW(p)

Democracy is a political system used by meat-based, baseline humans, who are all fundamentally very similar. The greatest variation is that between Einstein and the village idiot, but 99% of people are far from either extreme.

One of the key ideas leading to democracy is equal rights. It makes sense to give all humans equal rights because their abilities, needs, and desires are really very similar. But ems inhabit a far greater class of possible behaviors, abilities, needs and desires. It's not clear to me why it even makes sense to consider a democracy of ems.

Another problem is that, as you point out, a one-instance-one-vote system would create a huge artificial restriction on the creation of legal copies. Creating copies is probably very desirable for anyone rich enough to run them. So the only possible outcomes are either a huge majority of disenfranchised copies created without a license, or else a huge illegal underworld of copies that is brutally repressed by the government, because once a copy is created it would have to be granted full rights.

Just look at the state of modern copyright (and patents) - a huge artificial legal restriction on the proliferation of software. Then imagine copyright applied to sentient beings. Enforcing a system of government licensing of em copying would mean enforcing the non-existence of free software and open hardware, or any software or hardware free from government control.

Replies from: Izeinwinter, NancyLebovitz, Izeinwinter
comment by Izeinwinter · 2013-06-06T17:38:36.951Z · LW(p) · GW(p)

I... do not think there is any scenario in which allowing unrestricted copying of minds does not end in the apocalypse. It is an open invitation for a paperclipped solar system, only instead of paperclips, all available mass has been turned into instances of Sam Clado from Alaska, who just really likes himself. It sets off my "BAD IDEA" detectors really, really hard. And then, when I stop and consider it on a less reflexive level? It seems worse. Backups - that is, copies in a frozen/not-running state - could be permitted, but giving anyone permission to just manufacture more selves? No. Hell No. If the government wants to outlaw that on pain of pain, I am in favor.

Replies from: Viliam_Bur, DanArmak, ThisSpaceAvailable, Armok_GoB, Stuart_Armstrong, DanielLC
comment by Viliam_Bur · 2013-06-06T20:14:41.308Z · LW(p) · GW(p)

giving anyone permission to just manufacture more selves? No. Hell No.

This seems like a 1000× faster version of one ethnic group reproducing faster than another ethnic group in their neighborhood. And the slow version already makes people kill each other.

Analogously, what about the welfare state? Are we going to guarantee at least some minimum human rights to ems? Because if we do, and if someone is happy to live at the minimum level, what's going to stop them from making as many copies as possible and letting the government pay or collapse? (Or perhaps illegal copies don't have the same rights? Now we have slavery.)

comment by DanArmak · 2013-06-07T09:29:49.804Z · LW(p) · GW(p)

I don't disagree with your analysis. What I'm pointing out is that I haven't seen any workable proposals to prevent this (or another very disagreeable scenario), except for 1) a singleton AI controlling the effective laws of physics in its light cone, or 2) somehow making sure nobody but a single player ("ruler") has the ability to create computer hardware and/or software. In which case, the universe will probably be tiled with that ruler. The incentives to create copies and the resulting evolutionary pressures are too great.

And in the context of all this, ideas like democracy are completely unworkable unless one of these restrictions is implemented.

Replies from: Baughn
comment by Baughn · 2013-06-07T15:59:31.412Z · LW(p) · GW(p)

I'm all for (1). I've yet to see a plausible scenario not involving a singleton AI that isn't, on some level, horrifying.

comment by ThisSpaceAvailable · 2013-06-07T01:45:01.399Z · LW(p) · GW(p)

How does all available mass get turned into Sam Clado, unless there is some physical replicator? And if there's a physical replicator, is it really all that important whether it's replicating to create hardware for Sam Clado, or replicating just to replicate?

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-06-09T11:57:12.585Z · LW(p) · GW(p)

Sam's the one who ordered it to replicate without bound. Others may have different ideas of how much ought to be mined, so it's not a given that that is how things will end up.

comment by Armok_GoB · 2013-06-08T20:30:31.762Z · LW(p) · GW(p)

Any world where I cannot make and interact with copies of myself and/or custom minds forever, is a hell as far as I'm concerned. It being as slow and dangerous as it is currently is bad enough.

comment by Stuart_Armstrong · 2013-06-07T07:30:12.342Z · LW(p) · GW(p)

I agree with your estimation. But there may be equilibria that don't end so badly, and are more implementable than total restriction...

comment by DanielLC · 2013-06-07T01:19:54.282Z · LW(p) · GW(p)

Preventing the copying of minds strikes me as a bad idea. You can only make seven billion people so happy.

Any idea on how to figure out how many copies to allow?

comment by NancyLebovitz · 2013-06-06T17:15:08.920Z · LW(p) · GW(p)

An alternative might be to grant new copies votes after some moderate period of time, so that they've diverged from the original. This no doubt has its own problems, but it's at least good enough for science fiction.

A requirement to have a percentage of divergence would be too easy to hack.
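
As a very rough illustration, here is a minimal sketch of the waiting-period rule in Python; the function name, the one-year period, and the dates are hypothetical choices for illustration, not anything proposed in the thread.

```python
# A minimal sketch of the "vote only after a moderate period" idea above.
# The period length is an arbitrary placeholder.
from datetime import datetime, timedelta

DIVERGENCE_PERIOD = timedelta(days=365)  # "some moderate period of time"

def may_vote(copy_created_at: datetime, election_day: datetime) -> bool:
    """A new copy becomes eligible only once it has had time to diverge."""
    return election_day - copy_created_at >= DIVERGENCE_PERIOD

print(may_vote(datetime(2100, 1, 1), datetime(2100, 6, 1)))  # False: too new
print(may_vote(datetime(2099, 1, 1), datetime(2100, 6, 1)))  # True
```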

Replies from: DanielLC, Clippy, DanArmak
comment by DanielLC · 2013-06-07T01:18:05.356Z · LW(p) · GW(p)

In a sense, we do that now. You're free to have children and teach them your values, but they can't vote for 18 years.

Replies from: Stuart_Armstrong
comment by Stuart_Armstrong · 2013-06-07T07:28:32.266Z · LW(p) · GW(p)

Do you think this idea can be generalised to Ems?

Replies from: warbo, Baughn
comment by warbo · 2013-06-13T16:38:49.990Z · LW(p) · GW(p)

We can generalise votes to carry different weights. Starting today, everyone who currently has one vote continues to have one vote. When someone makes a copy (electronic or flesh), their voting power is divided between themselves and the copy. The total amount of voting power is conserved and, assuming that copies default to the political opinion of their prototypes, the political landscape only moves when someone changes their mind.
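
A minimal sketch of this conserved-weight scheme in Python, assuming an even split on each copy (the Voter class and copy_self method are illustrative names, not anything from the comment):

```python
# Sketch of the conserved-voting-weight idea: copying divides weight,
# so the total franchise never grows. The even split is one possible choice.

class Voter:
    def __init__(self, name, weight=1.0):
        self.name = name
        self.weight = weight

    def copy_self(self, copy_name, share=0.5):
        """Hand `share` of this voter's weight to a new copy; the total is conserved."""
        child = Voter(copy_name, self.weight * share)
        self.weight -= child.weight
        return child


alice = Voter("alice")                    # starts with one full vote
alice2 = alice.copy_self("alice-2")       # alice: 0.5, alice-2: 0.5
alice3 = alice2.copy_self("alice-2.1")    # alice-2: 0.25, alice-2.1: 0.25

electorate = [alice, alice2, alice3]
assert abs(sum(v.weight for v in electorate) - 1.0) < 1e-9
```

However the split is chosen, the invariant is that copying never increases a lineage's total voting power; the landscape moves only when someone changes their mind.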

comment by Baughn · 2013-06-07T16:01:16.920Z · LW(p) · GW(p)

Dubious at best. Ems could be designed to not diverge, and there's evolutionary pressure towards doing so.

Replies from: DanielLC
comment by DanielLC · 2013-06-07T21:07:00.849Z · LW(p) · GW(p)

It would at least keep people from just multiplying themselves right before an election and then merging them again right after.

comment by Clippy · 2013-06-07T23:12:02.211Z · LW(p) · GW(p)

Or maybe when they've been demonstrated to have assimilated the values of the rest of the population.

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-06-09T11:54:52.254Z · LW(p) · GW(p)

No way THAT could go wrong...

Replies from: Clippy
comment by Clippy · 2013-06-15T06:08:10.687Z · LW(p) · GW(p)

There are several modes by which that could fail. For example, if the beings have simply mastered a classifier indistinguishable from a typical population member in polynomial time under an adaptive interactive proof protocol (similar to the so-called "Turing Test"), while actually implementing a (source-code-uninspectable) program hostile to that value system.

comment by DanArmak · 2013-06-07T09:33:47.416Z · LW(p) · GW(p)

Children already have a high correlation with their parents' politics (and more so with their parents' religion, caste, etc.), and they tend to act together as families/clans. This will grow far stronger with ems, who can design their diverged copies more precisely than humans can raise their children, and who have much greater incentives to vote as family units (shorter generation time = stronger selection pressure to outbreed other families).

If the government mandates how the children must be different from the parent, with the goal that they vote differently from the parent, that doesn't seem very different from the government just dispensing with voting and setting the policy itself.

comment by Izeinwinter · 2013-06-08T15:40:56.453Z · LW(p) · GW(p)

I have an idea for how to deal with the creation of illegal mind clones that should not, in and of itself, result in them cooperating in their own enslavement. Create an illegal clone, and the clone gets legal rights. Specifically, yours. You lose them. Property, inherent rights and dignities, titles, etc.? All transfer. Of course, that does not help with someone designing an inherently servile mind, but it should be a deterrent with some teeth.

Replies from: Armok_GoB, DanArmak
comment by Armok_GoB · 2013-06-08T20:35:18.527Z · LW(p) · GW(p)

There is no difference between "copy" and "original" for ems, nor any reason for the pre-copying em to care about one more than the other.

comment by DanArmak · 2013-06-08T15:59:41.448Z · LW(p) · GW(p)

So what happens if I create 100 clones before I'm stopped? 1/100th of my property, of a vote, of a social guarantee of a minimal standard of living, may be too little to survive on.

comment by David_Gerard · 2013-06-07T09:39:29.223Z · LW(p) · GW(p)

Yvain notes in a blog post that governmental structures tend to be suited to the technology of the times:

And a lot of these late Neolithic/early Bronze Age cultures turned out the same way. If Ramesses II, Montezuma II, and Agamemnon went to lunch together, they’d have a lot to talk about, despite being separated by continents and millennia. This suggests that the Generic Bronze Age Government – a god-king served by a bunch of warrior-nobles, plus massive militarism and slavery – probably just made sense given the circumstances.

...

Countries that avoid liberal democracy usually regret it. China would be a good example. They tried being really Communist for a while and ended up becoming an economic basketcase. If they wanted to compete on the international stage they realized they needed a stronger economy, and so liberalized their market.

Thus, democracy as we know it may not in fact be a terminal value of government, but something that works well enough at the time. Or not.

comment by HungryHobo · 2013-06-07T14:20:58.309Z · LW(p) · GW(p)

Pure democracy seems a poor fit for any system with EMs.

Fortunately, any system with EMs is likely to open up a much better alternative. One of the major problems with most forms of government is that the rulers come from such a different background to the ruled, democracy or not. They're likely to share few experiences.

But if you can create EMs, it's likely you can create composite EMs or trade memories.

Rather than electing a leader, why not allow each conscious being to send some subset of their memories to be included in a fabricated representative? Think your leader doesn't "get" what it's like to live on the breadline or to lose a child? Make your contribution the memories of that experience. Duplicate or similar experiences could be removed or merged.

Duplicate EMs would have few differing experiences, so no matter how many times you copy yourself you're still only drawing from the same pool of memories, and as such you wouldn't be able to get more representation than a single person.
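
One way to read this proposal: representation scales with the set of distinct experiences contributed, not with the number of contributors. A minimal sketch under that reading (all names are hypothetical, memories are stood in for by strings, and real "duplicate or similar" detection would need a fuzzier similarity measure than exact matching):

```python
# Sketch of the deduplicated-memory-pool idea: copies who contribute the
# same memories add nothing beyond what the original already supplied.

def build_representative_pool(contributions):
    """Merge each contributor's memories into one pool, dropping duplicates.

    `contributions` maps a contributor id to an iterable of memory tokens."""
    pool = set()
    for memories in contributions.values():
        pool.update(memories)
    return pool


original = {"losing a job", "raising a child", "living on the breadline"}
copy_of_original = set(original)          # a fresh copy shares the same pool
other_person = {"running a farm", "losing a job"}

pool = build_representative_pool({
    "em-1": original,
    "em-1-copy": copy_of_original,        # contributes nothing new
    "em-2": other_person,
})
print(len(pool))  # 4 distinct experiences, not 3 + 3 + 2
```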

Replies from: Stuart_Armstrong
comment by Stuart_Armstrong · 2013-06-07T15:22:12.403Z · LW(p) · GW(p)

Interesting...

comment by Slackson · 2013-06-07T01:09:04.037Z · LW(p) · GW(p)

If an em is running at 10x speed, do they get 10x the voting power, since someone being in power for the next 4 years will be 40 subjective years for them?

One vote for one person already seems suboptimal, given that not everybody has equal decision-making capabilities, or will experience the costs and benefits of a policy to the same degree. Of course, if we started discriminating with voting power incautiously it could easily lead to greater levels of corruption.

Solving the decision-making balance could be done with prediction markets on the effects of different policies, a la futarchy, but that doesn't solve the other part of the problem. If we're assuming prediction markets will be used for policy selection, the "voting on values" part still needs fixing. I don't have any ideas on that, so we're kind of back where we started.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2013-06-07T09:42:42.433Z · LW(p) · GW(p)

If an em is running at 10x speed, do they get 10x the voting power, since someone being in power for the next 4 years will be 40 subjective years for them?

If ems can convert money to speed, this approximately means "more power to rich people". Just saying.

Replies from: Bruno_Coelho, Slackson
comment by Bruno_Coelho · 2013-06-08T16:37:12.315Z · LW(p) · GW(p)

In the em scenario, rich people would be the first ems. I don't know how broad this is, but Robin expects a small group of people with lots of copies.

comment by Slackson · 2013-06-07T12:49:30.783Z · LW(p) · GW(p)

Point. I imagine that increased speed will not be the most cost-effective way to turn money into political influence, however. There are plenty of ways to do that already, and unless it's cheaper than other alternatives it won't make much of a difference.

comment by ChristianKl · 2013-06-06T23:30:52.008Z · LW(p) · GW(p)

What do EMs want in the first place? What stops them from wireheading themselves?

Replies from: DanielLC
comment by DanielLC · 2013-06-07T01:22:01.721Z · LW(p) · GW(p)

Ems are humans, just running on different hardware. They want what humans want. They are stopped from wireheading themselves the same way humans are stopped from getting addicted to drugs. Namely, most of them don't want to be wireheads.

Replies from: ChristianKl, army1987, Locaha
comment by ChristianKl · 2013-06-07T08:07:13.412Z · LW(p) · GW(p)

Having sex is a big drive for humans, and I don't see how an Em will do that. Humans are motivated by having physical experiences.

Once you start simulating sex for an EM, I think you very quickly get close to wireheading.

Replies from: Armok_GoB, DanielLC, Emile, Viliam_Bur
comment by Armok_GoB · 2013-06-08T20:45:34.564Z · LW(p) · GW(p)

Humans seem mainly motivated to read TvTropes and play MMOs. :p

comment by DanielLC · 2013-06-07T21:03:44.671Z · LW(p) · GW(p)

If we were talking about an AI, I'd say you have a point, but an EM was originally a human. Simulating normal human experiences would be expected.

comment by Emile · 2013-06-07T12:04:25.932Z · LW(p) · GW(p)

Once you start simulating sex for an EM, I think you very quickly get close to wireheading.

Possibly, but note that there are humans who have a lot of available opportunities to have sex, yet prefer to do other things. If a simulated human spends a lot of time having sex and is given the opportunity to reduce his sex drive, he may want to do so.

(still, I think that wireheading does loom large as things-that-matter here)

Replies from: ChristianKl
comment by ChristianKl · 2013-06-07T12:48:44.436Z · LW(p) · GW(p)

Sex is an obvious example of a physical experience, but not the only one. Nearly all pleasurable experiences involve physical perception in some way.

comment by Viliam_Bur · 2013-06-07T09:49:51.556Z · LW(p) · GW(p)

Maybe there will be a rule that if someone is so wireheaded that they are unable to answer a simple question within a reasonable time limit, euthanasia is legal and their hardware is given to the remaining ems. That could be a selection pressure against wireheading.

This could be circumvented by splitting oneself into two personalities: the larger one would get the pleasure, and the smaller one would be responsible for preventing euthanasia. But when the larger personality stops responding, the smaller one could kill it and take its resources. Or the smaller one could try to split into two parts...

Okay, in the end either there is a part that avoided wireheading, so at least that part participates in our reality, or no such part remains and then the whole system can be legally killed.

Another way out would be if 1% of people resisted wireheading for whatever reason (religious, or because they programmed themselves to be unable to accept wireheading), and gradually took over the world.

Replies from: DanielLC, Armok_GoB
comment by DanielLC · 2013-06-07T21:05:48.273Z · LW(p) · GW(p)

But what if you want to wirehead? Should those of us that like the idea be forever doomed to live a less-than-euphoric existence?

comment by Armok_GoB · 2013-06-08T20:49:05.638Z · LW(p) · GW(p)

You don't have to make a copy of yourself, just code the minimal narrow-AI routine needed to answer questions.

comment by A1987dM (army1987) · 2013-06-09T14:19:22.538Z · LW(p) · GW(p)

(I suggest replacing “humans” with “people” -- to me, the former means ‘biologically human’.)

Replies from: DanielLC
comment by DanielLC · 2013-06-09T22:04:10.234Z · LW(p) · GW(p)

For me, "people" just means conscious beings. An AI or an alien could be a person, but they would be psychologically very different from a human, and asking what they want would be a very difficult question.

comment by Locaha · 2013-06-07T09:14:41.516Z · LW(p) · GW(p)

Ems are humans, just running on different hardware.

And light is waves, just not in the ocean.

Seriously, the sentence "human running on different hardware" is meaningless without a definition of a human.

Replies from: DanielLC
comment by DanielLC · 2013-06-07T21:09:56.960Z · LW(p) · GW(p)

Ems are psychologically human. If you ask how a human would act in situation X vs. how an em would act in situation X, the answer is the same.

Replies from: Barry_Cotter
comment by Barry_Cotter · 2013-06-09T08:00:03.312Z · LW(p) · GW(p)

And? Psychopaths are alien enough for me to have at best minor compunctions about wholesale annihilation, and they're a lot closer to me than, say, uplifted dogs on most dimensions. Them, I could see communion with. Absent a Singleton, a subjective EM century will suffice for things to reach "Nuke them from orbit, it's the only way to be sure."

Replies from: wedrifid, DanielLC
comment by wedrifid · 2013-06-09T16:09:40.184Z · LW(p) · GW(p)

Psychopaths are alien enough for me to have at best minor compunctions about wholesale annihilation

You have at best minor compunctions about wholesale annihilation of an entire group of humans because you are too different to a psychopath? There's some irony!

Note that this site has something of an official policy against discussion of hypothetical violence against identifiable people or groups. As far as I know there is no "except, you know, psychopaths, those @#%#s have it coming" exception in place. Incidentally of all the groups of actual humans one could discuss the wholesale annihilation of, psychopaths are among the least safe target to threaten in practical terms. Psychopaths will tend to have fewer compunctions when it comes to casual execution of pre-emptive self defence measures.

Replies from: Barry_Cotter
comment by Barry_Cotter · 2013-06-10T01:21:52.032Z · LW(p) · GW(p)

If I can play iterated prisoner's dilemma with it and our terminal values are not at odds, let's trade. I have recent second-hand exposure to one psychopath fucking a friend of mine over. It has left me with a much more visceral comprehension of the fact that some people are fucking evil than I had previously. And of conformity and victim-blaming.

I'm going to retract the comment as ableist and because as you pointed out genocide is wrong.

A possibly rude question follows. Do you or have you had friendly relations with anyone you recognised was a psychopath? If so how, why?

Replies from: wedrifid
comment by wedrifid · 2013-06-10T04:56:03.895Z · LW(p) · GW(p)

A possibly rude question follows. Do you or have you had friendly relations with anyone you recognised was a psychopath? If so how, why?

A high-functioning psychopath, sure. I blame people for what they in fact do, not which parts of their brain are active when they do so. My own visceral comprehension of human evil is such that I don't see much tangible practical difference between high-functioning, instrumentally rational psychopaths and normal, particularly high-status humans. In fact, the main difference can be that the psychopath has had to become self-aware of their own hypocrisy.

comment by DanielLC · 2013-06-09T22:02:21.086Z · LW(p) · GW(p)

So you're saying that value drift will make them less human as they age?

I suspect that either it won't, or it will make them so crazy that just leaving them alone is hardly an option. Hopefully, they'd develop some way to keep themselves sane.

Replies from: Barry_Cotter
comment by Barry_Cotter · 2013-06-10T01:30:39.315Z · LW(p) · GW(p)

What's sane? That's socially constructed. What's good at reproducing, at getting more resources? That is not. Absent some mechanism to keep values from drifting, ems will experience massive, swift value drift. This will be the case even if they stay within the space of current human neurological diversity, which they won't.

The only way the future isn't a hardscrabble hell is with a friendly Singleton.

Replies from: DanielLC
comment by DanielLC · 2013-06-10T01:54:51.807Z · LW(p) · GW(p)

I'd say absent some mechanism to keep them sane, an old enough EM couldn't even be considered intelligent, and would only be a danger to the rest of us in that it's a resource drain if we can't stomach euthanasia.

We may or may not naturally have such a mechanism.

By "current human neurological diversity" do you mean staying within what we now consider a sane human (in which case values drift will be limited to what a sane human would value), or do you mean that they're not modding their mind?

My opinion is the same for if you mean that EMs are modding themselves, except for the part about that there might be a built-in mechanism for keeping them sane.

comment by Shmi (shminux) · 2013-06-06T15:12:46.368Z · LW(p) · GW(p)

I don't think concentrating on democracy is wise. It's but a particular form of human self-organization that is currently popular, but is far from universally good. Some counterexamples to your naive-seeming political assertions:

Democracy grants legitimacy to the government.

So does monarchy or whatever else people agree is legitimate.

Democracy is fair and egalitarian - each person has a single vote.

Depends on your concept of fair and on what constitutes a person. Certainly, many readers of this site cannot vote yet. And those who can cannot really affect the decisions made by the government anyway.

Democracy aligns the interests of the rulers with that of the ruled.

Most of your country's poor people will disagree.

Democracy is stable - powerful groups can generally seize power within the structure, rather than overthrowing it.

Applies equally or more so to monarchy.

Democracy allows the competition of governing ideas.

Yet the #1 democracy in the world is clearly stuck in a governing rut.

Democracy often leads to market economies, which generate large wealth.

Counterexample: China

Democracy often leads to welfare states, which increase happiness.

Is a welfare state a good thing? And whose happiness?

Democracy doesn't need to use certain coercive methods, such as restrictions on free speech, that other systems require to remain stable.

If I recall, the US jails more people per capita than many non-democracies.

Democracy stops a particular group from hanging on to power indefinitely, which can reduce corruption, inefficiency and excessive use of state power for private purposes.

First, the real power is behind the scenes and non-elected. Second, Pinochet's Chile, for example, was much less corrupt than the neighboring democracies, if I recall correctly. So there is some other factor in play, probably worth isolating.

Anyway, I think a smarter approach would be to figure out possible forms of self-organization suitable for the Em-world, starting from scratch.

Replies from: Stuart_Armstrong
comment by Stuart_Armstrong · 2013-06-06T15:44:28.494Z · LW(p) · GW(p)

Some counterexamples to your naive-seeming political assertions

I am not claiming that these are all true (most are true to a certain extent, as far as I can tell - but that's not relevant here), simply that these are features often believed about democracies, and are good starting points to think about.

Anyway, I think a smarter approach would be to figure out possible forms of self-organization suitable for the Em-world, starting from scratch.

I think an analysis of what kinds of democracies work or don't work for Ems is a first step before the designing from scratch (we can also toss in a few other current models of governments), as this will help isolate the key features of the Em world. This is often better than starting from scratch, as it reduces premature commitment to a fantastic-sounding idea.

Replies from: shminux
comment by Shmi (shminux) · 2013-06-06T20:22:12.039Z · LW(p) · GW(p)

This is often better than starting from scratch, as it reduces premature commitment to a fantastic-sounding idea.

Yes, there is a danger of that, certainly. On the other hand, if you start from what works for meat beings then there is a danger of being stuck in a local optimum. Probably both ought to be explored, and I am not sure if one or the other ought to have precedence. My personal opinion, admittedly not rigorously quantified, is that failure of imagination is a worse sin than reinventing the basics when dealing with predictions. But then in my physics studies I learned the importance of always being able to rederive the conclusions from the first principles, not just from half-processed expressions with potentially a lot of hidden or forgotten assumptions built in.

comment by Lumifer · 2013-06-06T18:33:09.047Z · LW(p) · GW(p)

...democracy?

/flicks the OFF switch.

(to be a bit more clear, political systems (such as democracy) are about power. Without being explicit about what power ems will have, specifically in the meatworld, the question seems too ill-defined to me)

Replies from: Vratko_Polak
comment by Vratko_Polak · 2013-06-10T06:18:45.554Z · LW(p) · GW(p)

political systems (such as democracy) are about power.

Precisely. Democracy allows the competition of governing ideas, granting legitimacy to the winner (to become the government) and making the system stable.

I see the idea of democracy as detecting power shifts without open conflict. How many fighters would this party have if a civil war erupted? An election will show. The number of votes may be very far from actual power (e.g. military strength), but it can still make the weaker side stop seeking conflict.

Without being explicit about what power ems will have, specifically in the meatworld, the question seems too ill-defined to me

Well, I am not even sure about the powers of individual humans today. But I am sure that counting an adult as 1 vote and an adolescent as 0 votes is not precise. On the other hand, it does not need to be precise. Every form of power can be roughly transformed into "ability to campaign for more votes". Making votes more sophisticated would add a subgoal of "increasing voting power" that could become as taxing as actual conflict. Or not, I really have no idea; sociology is difficult.

Back on topic. I see problems when ems are more varied in personal power than the children-vs-adults variance of today. Would "voting weight" have to be more fine-grained? Would this weight be measured in a friendly competition, akin to the sports of today? Or would there be a privileged caste, and everyone else would have no voting rights? Would voting rights be granted not to persons, but to military platforms instead? (Those platforms would not actually be used; they would exist just for signalling purposes.) Or will any simpleton barely managing a digital signature be a voter, subject to brainwashing by those with actual power?

I hope that these low-quality questions can help someone else to give high-quality answers.

But I want to stress that I do not see any problems specific to the copyability of ems. Democracy only measures the power of a political party; it does not reflect on which methods led to that power.