As an upload, would you join the society of full telepaths/empaths?
post by Shmi (shminux) · 2013-10-15T20:59:30.879Z · 120 comments
I asked this question on IRC before and got some surprising answers.
Suppose, for the sake of argument, you get cryo-preserved and eventually wake up as an upload. Maybe meat->sim transfer ends up being much easier than sim->meat or meat->meat, or something. Further suppose that you are not particularly averse to a digital-only existence, at least not enough to specifically prohibit reviving you if this is the only option. Yet further suppose that sim-you is identical to meat-you for all purposes that meat-you cared about (including all your hidden desires and character faults). Let's also preemptively assume that any other attempts to fight this hypothetical have been satisfactorily resolved, just to get this out of the way.
Now, in the "real world", or at least at the simulation level we occupy, there is no evidence that telepathy of any kind exists or is even possible. However, in the sim-world there is no technological reason it cannot be implemented in some way, for just thoughts, or just feelings, or both. There is a lot to be said for having this kind of connection between people (or sims). It gets rid of or marginalizes deception, status games, and miscommunication-based biases and fallacies. On the other hand, your privacy disappears completely, and so do any advantages over others the meat-you might want to retain in the digital world. And what you perceive as your faults are out there for everyone to see and feel.
As a new upload, you are informed that many "people" decided to get integrated into the telepathic society and appear to be happy about it, with few, if any, defections. There is also the group of those who opted out, and it looks basically like your "normal" mundane human society. There is only a limited and strictly monitored interaction between the two worlds to prevent exploitation/manipulation.
Would you choose to get fully integrated or stay as human-like as possible? Feel free to suggest any other alternative (suicide, start a partially integrated society, etc.).
P.S. This topic has been rather extensively covered in science fiction, but I could not find a quality online discussion anywhere.
comment by Lumifer · 2013-10-16T00:39:59.527Z · LW(p) · GW(p)
I really would like more information before committing.
In particular, I would be interested in the power structures in that new society. Who decides things? Who runs things? What happens to the malcontents? How do propaganda and various forms of ideological manipulation work?
The obvious reason is that as an entirely open empath/telepath you cannot hide anything from the authorities.
As an aside, I would also expect normal unfrozen/uploaded people who chose the telepathic society to be severely stressed, at least in the beginning.
Replies from: None, Strange7↑ comment by [deleted] · 2013-10-16T14:49:01.442Z · LW(p) · GW(p)
I second the desire to gather information. A priority question I might ask: can I find out whether any of the friends, family, or other people I currently know are in either society? Can I find out what their reasoning was in choosing that society? How have their interactions been, through the monitored communication lines, with family members and friends (if any) who chose the other society?
For instance, if the answer is 'Everyone you know is dead, and their descendants (if any) don't find an old fogey like you particularly relevant, so none of your above questions grant you any information,' I might pick the telepathic society, because the loneliness of current society minus all current social contacts seems incredibly depressing.
But there are so many possible answers to those questions that I find it difficult to think through my responses to any particular scenario without narrowing things down, and I'm not sure what I should consider the unfought-hypothetical baseline.
↑ comment by Strange7 · 2014-01-28T02:23:53.659Z · LW(p) · GW(p)
The authorities also cannot hide anything from the general public. Police brutality depends on the fact that police are authorized to use force under certain circumstances, and have some latitude in determining when those circumstances apply. If the exact rationale for escalating a given conflict was a matter of public record, it would become much more difficult for such behavior to persist.
comment by Slider · 2013-10-16T19:44:26.954Z · LW(p) · GW(p)
Even with telepathy, status games and indirect hiding might still happen. It also depends on what kind of access is granted. If everybody feels every thought of everybody else, it could easily be overwhelming, and pure cognitive economy could favour opting out.
If others can choose at will whether to access your mental state, you can adopt strategies that make them not want to do so at critical moments. If you build yourself a reputation for violent thoughts, people might want to avoid your mind for their own comfort, granting you a degree of privacy (similar effects hold for boringness etc.). You could also attach meanings to your thoughts that other people would not associate with them, in essence encrypting your mind so that they can't natively read it in full clarity even if they have full cryptotext access (which may or may not be synonymous with going mad).
If everybody has intimate psychological contact, your mind could form dependencies on parts of other minds without which you could not psychologically function. So people might want to opt out in order to remain individuals and not fuse with the hive mind. Even if only others formed such dependencies, it could be bothersome, because you would no longer think only for yourself but potentially for others as well.
It seems to open a can of worms that would need to be dealt with somehow, and those solutions would matter far more to the decision to opt in than whether it would be cool.
comment by knb · 2013-10-17T03:14:16.353Z · LW(p) · GW(p)
In the real world, this would lead to tremendous efforts to develop the ability to crimestop.
The mind should develop a blind spot whenever a dangerous thought presented itself. The process should be automatic, instinctive. Crimestop, they called it in Newspeak. He set to work to exercise himself in crimestop. He presented himself with propositions -- 'the Party says the earth is flat', 'the party says that ice is heavier than water' -- and trained himself in not seeing or not understanding the arguments that contradicted them.
D.H. Lawrence did a better job than I ever could of describing my personal objection to living in the society of universal surveillance:
"That I am I.
That my soul is a dark forest.
That my known self will never be more than a little clearing in the forest.
That gods, strange gods, come forth from the forest into the clearing of my known self, and then go back.
That I must have the courage to let them come and go.
That I will never let mankind put anything over me, but that I will try always to recognize and submit to the gods in me and the gods in other men and women."
comment by WalterL · 2013-10-17T15:50:43.625Z · LW(p) · GW(p)
I'd join up for Team Overshare. I expect that the version of myself that I recognize as myself would not long survive full disclosure. I'd probably end up changing to adapt, but I don't see anything terrible about what I'd change into (probably a person who holds the opinions which are the mean of the mass mind's values).
comment by AngryParsley · 2013-10-16T08:35:15.669Z · LW(p) · GW(p)
I defy your assertion that both societies are similarly happy. Unless the telepath society is extremely accepting of fringe thoughts, it's going to be worse. Knowing that others will read your thoughts and judge you for them causes you to censor yourself. But at that point, it's already too late. People will know that you thought of something objectionable and suppressed it out of fear of judgement.
Really though, the two options are silly. Ems allow for so many more possibilities. A society in which people could voluntarily expose their thoughts would have quite a few advantages. Ditto for a society with perfect (optional, voluntary) lie detection.
Replies from: mwengler, shminux↑ comment by mwengler · 2013-10-20T15:58:22.343Z · LW(p) · GW(p)
Once fringe thoughts are visible, our conception of what is human, what is acceptable, expands a lot. If there's one thing that internet searches and rule 34 have taught me, it's that there are a lot more people on fringes similar to the one I am on than I thought as a young adult (in the pre-internet days of the 1980s).
What is talking and mirror neurons and empathy other than an expression of the value to the species of having a hive mind? When evolution finds something valuable, like sexual reproduction for instance, it seems to build into organisms powerful drives to make that happen, the satisfaction of which can be very, well, satisfying to the organism. I would take my fringey mind and take my chances on deep satisfaction, especially if there is an opt-out choice as implied in the original post.
Replies from: Moss_Piglet↑ comment by Moss_Piglet · 2013-10-20T18:11:14.372Z · LW(p) · GW(p)
What is talking and mirror neurons and empathy other than an expression of the value to the species of having a hive mind?
In terms of human evolution the big hypothesis I've seen for our extreme language skills and empathy/sympathy is actually deception, which makes a twisted sort of sense; even a dog can communicate well enough to coordinate complex hunting tactics and function socially in a pack, but lying in any kind of sophisticated way means accurately predicting someone else's knowledge and emotional state all while 'spinning' accurate information about your environment into deceptive statements. Especially since so much of our deception is automatic signaling behaviors, it wouldn't be surprising if we still used most of our brainpower just on tricking each other.
I'm not sure a telepathic society would be a dystopian hellscape, it actually sounds like it might be kind of relaxing if you're not particularly deviant, but it's absolutely not a natural extension of human nature.
Replies from: mwengler↑ comment by mwengler · 2013-10-21T14:52:17.210Z · LW(p) · GW(p)
In the face of the evidence, I find it implausible that deception accounts for anything even close to 50% of the brain power we devote to language. My reasons:
1) A cellphone must require hundreds of thousands of pages of documentation, not to mention a million lines of code or something like that. Deception is completely useless, even counterproductive, in writing code: you HAVE to get it right for the code to even reach the point where you can start to debug it, let alone improve it. The documentation for building a cell phone must also contain a tremendous amount of correct information if the incredibly complex phone is to have a prayer of working when it is put together.
2) I don't believe you can study the history of philosophy and the sciences without marvelling at the extent to which brilliant humans extracted truth from confusion, and reported it.
3) Even in sex and seduction, where there can be deception (of course I love you, let me just put the tip in...), the bulk of the signalling is pretty accurate, with the more desirable women generally ending up with the more desirable men.
4) The person being deceived, on average, must derive more value from communication than harm, or else communication would have been bred out of the species. That is, if deception outweighed the benefits, people who couldn't communicate would have had an advantage: their immunity to deception would have been worth more than their disadvantage in not being able to coordinate. That language continues to exist, and appears to be if anything wildly more valuable now than when it first evolved, makes it clear to me that the engine is good information. Yes, there are lots of deceptive strategies that thrive, but only as long as they remain LESS successful than the underlying engine.
It is incredibly valuable to understand how systems are exploited in devious ways. But I think it is important to keep in context that deviousness can never exceed value creation in any kind of equilibrium.
Replies from: Strange7↑ comment by Strange7 · 2014-01-28T02:04:38.210Z · LW(p) · GW(p)
The engineers working on various parts of the phone need to be profoundly honest in technical matters, yes. What about the middle-managers? Sales and advertising? A CEO attempting to justify his latest increases in pay to the board of directors?
Replies from: mwengler↑ comment by mwengler · 2014-01-28T10:14:00.411Z · LW(p) · GW(p)
My overriding point is that for communication to have evolved, it had to provide much more truth than falsehood. My thinking is that the net value communication provides the group comes from true communications, while false communications are a corruption, a way for individuals to take an excess share of value, hurting the group by more than they themselves benefit.
But if the damage done by communication used to take from the group ever came even close to the value it provides to the group, communication would have been selected away. Meanwhile, we live in an era of unprecedented productivity on almost every scale, all brought about by humans working in groups, so there can really be no doubt that currently communication provides WAY more positive value to the group than opportunity for stealing/cheating to individuals. You can just see the net wealth piled up all over the place.
Replies from: Strange7↑ comment by Strange7 · 2014-02-03T12:04:22.497Z · LW(p) · GW(p)
Antlers aren't mutually beneficial.
Replies from: tut, mwengler↑ comment by mwengler · 2014-02-03T16:28:37.700Z · LW(p) · GW(p)
Antlers aren't mutually beneficial.
Perhaps, perhaps not. But they are truthful.
Replies from: Strange7↑ comment by Strange7 · 2014-02-03T18:17:07.393Z · LW(p) · GW(p)
Antlers are a mechanism by which bucks compete, injuring each other. Growing the antlers in the first place is a significant metabolic cost. This is selected for, even though it's not remotely a net benefit for the group in itself, because it allows stronger individuals to exclude weaker ones from mating. Is it so inconceivable that a communication channel full of mostly lies could serve some similar function, and thereby persist despite the obvious societal costs?
Replies from: mwengler↑ comment by mwengler · 2014-02-03T21:45:47.047Z · LW(p) · GW(p)
Intriguing ideas. I'm having to do mental gymnastics to think about the metaphor between antlers and communication.
Your comment does make me realize I have implicitly assumed that taking resources through deceitful communication must be bad for the group, but this is not necessarily true. Perhaps bringing resources to the cleverest liars does raise the group's fitness in a variety of plausible ways.
The kind of antler analogy I think of is a very clever but not very strong moose who learned how to stick a log in his antlers, making them more effective than they were without the log, and thus got to mate in place of a more physically fit moose. It is quite possible that this cleverness only helps him get laid, and does not make up for the lack of strength he passes on to his offspring in terms of avoiding predators, and so the moose population is weakened by such cleverness: tricking girl mooses into sleeping with your gimpy self just makes mooses gimpier. But of course that kind of cleverness might also help a moose against predators; it would be up to the environment to decide that one, and so clever lying mooses (meese?) would just be another variation coming out to be tested in the population. Sexual selection always results in some variations with an obvious cost to the group winning over alternatives which are obviously better for the group: presumably sexual selection wins out because it speeds the process of variations being tested. Look at the world and the non-sexual reproducers are SO MUCH SIMPLER than the sexually selected.
In any case, if I were to seriously pursue my truth/useful-falsehood/parasitism ideas about the evolution of language, I would need to rethink them, fleshing them out in light of these antler-induced thoughts. Thanks!
Mike
Replies from: Strange7↑ comment by Strange7 · 2014-02-04T16:35:49.863Z · LW(p) · GW(p)
You're quite welcome.
However, you seem to be overthinking the metaphor. An "honest" moose lacks antlers altogether; the metabolic cost of synthesizing antler-material is the cognitive cost of developing a coherent "story," antler-related musculature is the social skill necessary to "sell" it, exhaustion and injuries inflicted by mating duels are the externalities. With no mating duels and the same resource inputs, an antler-less moose could be better at every other aspect of being a moose, including gathering additional food and running away from predators or trampling them to death, but would have no pressure to improve beyond subsistence.
Replies from: mwengler↑ comment by mwengler · 2014-02-05T23:12:22.351Z · LW(p) · GW(p)
This is selected for, even though it's not remotely a net benefit for the group in itself, because it allows stronger individuals to exclude weaker ones from mating.
The evidence is overwhelming that sexual selection results in species that are better adapted to live in our world, and that fighting or otherwise competing for mates is a winning strategy for a broad range of species. What is that overwhelming evidence? It is that the species that exist carry this feature to a very large extent. A moose population or subpopulation could easily have sampled less selective mating; it is unlikely that that natural experiment has not been run many times during the species' lifetime of antlered competitors. And what is the result of those experiments? 10-point bucks!
Sexual selection is unambiguously good for the group according to the evidence: all the natural experiments out there not only still carry antlers, but carry pretty big racks.
Just as central planning is better than free economies "in theory," but actually it is only better if you ignore important aspects of reality, non-sexually-competitive meese are only better than the antlered "in theory" where the theory ignores the apparently overwhelming value of sexual selection.
As to exactly how best to torture the metaphor, I think I have abandoned the idea that lying and gaming the system must occur at a lower rate than direct, honest communication in order for there to be survival value in communication. I think that reflected a prejudice that lying is "bad" and truth and building things are "good" -- a prejudice I try to avoid and, I think, had been failing to. Now that lying is just another adaptation to consider, the complexity of the system could well wind up with a communication system filled with nothing but lies which still served the group. If in no other way, it could be a way for the clever to successfully compete against the beautiful for mates.
↑ comment by Shmi (shminux) · 2013-10-16T15:08:31.063Z · LW(p) · GW(p)
I defy your assertion that both societies are similarly happy.
I never asserted that. All I said is that telepaths appear to be happy, and the rest appear to be "normal".
Ems allow for so many more possibilities.
Indeed. I was simply trying to come up with a scenario where there is an option to live in a "thoughts and feelings set on public" society, and sims seem like a decent model of it.
comment by Alicorn · 2013-10-16T02:41:28.897Z · LW(p) · GW(p)
What kind of exploitation/manipulation is being prevented by limiting and "monitoring" interaction, exactly?
Replies from: Manfred, shminux↑ comment by Shmi (shminux) · 2013-10-16T06:16:44.660Z · LW(p) · GW(p)
What Manfred suggested, or the opposite, the mundanes with their thoughts and feelings set on "private" taking advantage of the freely sharing telepaths with no immunity against hidden agendas.
comment by JoshuaZ · 2013-10-15T21:10:18.005Z · LW(p) · GW(p)
I'm not sure this should get its own topic but seems like it should be in the open thread. It seems more like general transhumanist speculation than rationality focused.
I suspect that any world I wake in where we've advanced to the point of having sims will likely be very different from our current one, to the point where issues I can't anticipate now could easily affect the answer to this question.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-15T21:25:33.040Z · LW(p) · GW(p)
I'm not sure this should get its own topic but seems like it should be in the open thread. It seems more like general transhumanist speculation than rationality focused.
I don't disagree. Unfortunately the open thread is ill-suited to longer posts and has limited visibility, at least in my experience.
comment by Luke_A_Somers · 2013-10-17T18:26:54.070Z · LW(p) · GW(p)
It all depends on what other interactions are possible besides mind-reading. It depends on whether the telepaths can communicate with normals, what everyone spends their days doing, whether someone can get kicked out of telepath-land. It also depends on the details of the telepathy - whether telepaths can format their thoughts for easy browsing, how powerful mind searches are, etc.
Not fighting, just indicating dependencies.
comment by smk · 2013-10-16T08:14:37.063Z · LW(p) · GW(p)
I probably would not join, but I would try to research it to figure out why people who join usually like it. Depending on what I learned, I could change my mind.
What I would prefer is to have the option of sending/receiving thoughts/emotions/memories when and with whom I choose, with consent of those involved. Other mental abilities would of course have to be implemented as well, to allow this kind of telepathy to be manageable.
comment by ChristianKl · 2013-10-15T21:48:41.018Z · LW(p) · GW(p)
I don't think it's easy to define what it means for one neural net to be directly interfaced with another neural net. Even in today's world I can imagine technology that scans a person's emotions and displays them to other people.
There is also the group of those who opted out, and it looks basically like your "normal" mundane human society.
I think the idea that a society of uploads will look like a "normal" mundane human society is severely misguided. Not having a body changes much about society.
Replies from: Ishaan↑ comment by Ishaan · 2013-10-16T03:27:11.879Z · LW(p) · GW(p)
Not having a body
You'll have to simulate the body and its environment to some extent anyway, because it's the only interface brains really understand. Imagine how terrified many people would be if 1) their last memory involved imminent death, and 2) they found that they could think but couldn't see, feel, or hear anything. And the long-term effects of sensory deprivation are not fun. That's without even getting into phantom pains and other super important things like the off-brain portions of the neuroendocrine system (for the most obvious example, spaying/neutering alters much more than just sexual behavior). The brain is not the only thing doing information processing; it's just the main hub.
If you can simulate brains, there is no reason you can't simulate the rest more or less faithfully if you choose. Or for true realism, make a sensory-motor robot body and wirelessly link it up to the brain-hosting server. If you don't do so, you have to alter brain function to make the absence of body-environment acceptable, which generally seems less favorable.
Replies from: ChristianKl↑ comment by ChristianKl · 2013-10-16T13:18:24.503Z · LW(p) · GW(p)
You'll have to simulate the body and its environment to some extent anyway, because it's the only interface brains really understand.
Yes, but I would assume that the simulation doesn't try to mimic physical reality 100%. People probably will be able to teleport around instead of taking a 10 minute walk.
Engaging in hobbies like martial arts or dancing will be much different than it is at the moment.
Look at women wanting to keep a certain weight to be beautiful and attract a good mate. Look at how guys go to the fitness studio to build big muscles to attract women. All those physical effects are important to the way our society is made up.
comment by Eugene · 2013-10-24T09:23:04.806Z · LW(p) · GW(p)
Not only would I decline the invitation, I would be extremely suspicious of the fact that very few have defected, and also extremely suspicious of those who have. What you're describing goes beyond telepathy. It's effectively one mind with many personalities. I could never trust any guarantee of safe passage through such a place. It would be trivial for a collective mind to rob a single mind of choice, then convince it that it made that choice. It would also be slightly less trivial but still plausible for a collective to convince that mind - and fool other independent minds - that it chose to defect when it actually didn't.
On the other hand, if after observing the political landscape of the time period I came to realize that this entity is clearly taking over, then I would jump on board as a self-preserving strategy, knowing that at some point the non-connected independent minds would become marginalized enough to feel threatened and lash out violently, at which point the faster-thinking collective mind wins the fight. Being caught in a collective is less horrible than being caught in the crossfire.
comment by mwengler · 2013-10-20T15:40:44.977Z · LW(p) · GW(p)
With an opt-out possibility I would give it a try. I suspect that I am suboptimally secretive and ashamed of a lot of my thoughts and feelings and have a "natural" instinct to lie and shield myself from being known accurately. I'd like to try a society that seemed to be working for others even if I didn't stick with it. It reminds me of what I did in Second Life years ago. I entered as a woman with powerful secondary sexual characteristics. After a guy more or less fell in love with me, I spent about two days on that, talking with him while he tried to seduce me, and then blew my own cover on purpose. My sense is I learned a tremendous amount about the interaction between men and women by doing that, understanding how attractive women are driven to be aloof and superior.
I'd like to learn more about my own secretiveness and shame as I am positive that very little of it is in my conscious mind at this point.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-20T17:11:08.418Z · LW(p) · GW(p)
have a "natural" instinct to lie and shield myself from being known accurately
Right. When this is not an option you might feel liberated and like it, who knows.
comment by chaosmage · 2013-10-16T10:35:23.899Z · LW(p) · GW(p)
With full telepathy, the cognitive illusion of "one self per body" would shatter. To join a telepathic society would be to melt into a hive mind. And that hive mind wouldn't regret the melting of its individual members any more than our bodies regret the loss of individuality of its cells.
Replies from: Viliam_Bur, shminux↑ comment by Viliam_Bur · 2013-10-16T14:22:56.038Z · LW(p) · GW(p)
Are the members of the hive mind analogous to unthinking cells, or rather to different subsystems of the brain, which sometimes do have conflicting goals?
Even if the individualities are lost, there could still be clusters in mutual disagreement. One part of the hive mind wanting to explore the outside world, another part desiring more wireheading...
↑ comment by Shmi (shminux) · 2013-10-16T15:05:04.349Z · LW(p) · GW(p)
I did not suggest shared sensory experiences, precisely to avoid this mind meld. You are also not forced to read other people's thoughts and feelings.
Replies from: chaosmage↑ comment by chaosmage · 2013-10-16T15:45:04.606Z · LW(p) · GW(p)
Sensory experiences cannot be clearly distinguished from "thoughts, or just feelings, or both".
You are also not forced to read other people's thoughts and feelings.
I'm not forced to use the internet, either. I'll just be outcompeted if I don't.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T16:22:32.796Z · LW(p) · GW(p)
Sensory experiences cannot be clearly distinguished from "thoughts, or just feelings, or both".
I don't think that's quite true. There is some overlap, sure. I imagine reading thoughts like reading a book about something (or maybe watching a movie) rather than actually doing that something. You can be quite immersed, but still not confuse your own thoughts with those on screen.
comment by Viliam_Bur · 2013-10-16T07:48:27.203Z · LW(p) · GW(p)
However, in the sim-world there is no technological reason it cannot be implemented in some way, for just thoughts, or just feelings, or both.
The technical details of the implementation may be critical. How specifically are the thoughts shared?
Right at this moment I am commenting on LW from work. My colleague in the same room has been loudly speaking on the phone for the last half hour, so I am completely unable to focus on any work anyway - hardly able to focus even on reading LW. If telepathy were anything like this, I would avoid it like hell.
Loss of privacy is one aspect, loss of silence (which I consider critical for my sanity) would be another one.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T15:10:54.653Z · LW(p) · GW(p)
How specifically are the thoughts shared?
Facebook-like: your thoughts and/or feelings are set on "public", but you are not forced to read everyone's timeline. So, loss of privacy: yes. Loss of silence: no.
comment by kalium · 2013-10-16T06:41:24.730Z · LW(p) · GW(p)
I expect I would opt out. I cannot feel completely free to do anything that many people can easily observe. I become deeply uncomfortable if someone looks over my shoulder while I am reading, and I am simply not able to listen to music while someone else is in the same room. I expect that if I lived in a fully telepathic society I would feel forced to try to stop thinking entirely. Not sure how exactly that would end, but I expect not well.
Aside from matters of my own personal comfort, like Lumifer I would be very concerned about the power structures in this society. The worst case is so vastly worse than the worst case in a non-telepathic society. And if contact is so limited I have little reason to believe the claim that most people who opt in are truly satisfied with their choice.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T06:53:24.130Z · LW(p) · GW(p)
I become deeply uncomfortable if someone looks over my shoulder while I am reading, and I am simply not able to listen to music while someone else is in the same room.
I am curious, can you unpack the reasons for this a bit?
Replies from: kalium↑ comment by kalium · 2013-10-16T22:03:12.793Z · LW(p) · GW(p)
I can speculate, but there's no particular reason to think I know the "true" reason.
My ability to logically defend my preferences is not strongly related to their importance to me. So if I particularly like or am emotionally affected by some piece or genre of music and someone notices me listening to it and argues with me or explains why I am wrong to like it, I may find that when I listen to it later I only feel wrong and cannot enjoy it anymore.
If I am reading material that is not interesting enough, I may be judged unintelligent as a result. If I am reading material that is interesting enough, I may be expected or demanded to produce insightful commentary, and when I fail (which I will---if you put me on the spot you're lucky to get audible words out of me let alone anything sensible) I may be judged both unintelligent and unaware of it since I was obviously reading material beyond my comprehension and couldn't even tell that this was so.
My value system is under construction and not especially stable, and if my reading material is relevant to this I risk failing to defend my values and being forced to admit the superiority of the value system of whoever happened to catch me reading. I don't want to end up in the position of, say, admitting that utilitarianism is superior and, since you're whinier, you obviously care more about X than I do and therefore I must go along with your position on X. (Sounds like a strawman, but nope. I've actually been argued into that one before. I seem to have a particularly weak will.) Therefore I try to avoid anything that might lead to discussions relating to my value system. However if I only hide my reading when it's related to my value system, that's practically telling you when to harass me, so I have to hide all my reading in order not to give away that information.
In a telepathic society I would feel required to restrict my thoughts to ones I could defend in an argument, which in practice means I could not get away with developing any new thoughts unless they spring into place fully formed.
Replies from: NancyLebovitz↑ comment by NancyLebovitz · 2013-10-16T23:13:05.132Z · LW(p) · GW(p)
That all makes sense. However, in an ideal telepathic society, people would learn about those who need time to think and space to appreciate what they like, and you'd be cut some slack and not be argued with about things that aren't urgent.
For that matter, people who like to argue would always have someone available to argue with.
Replies from: kalium, Viliam_Bur↑ comment by kalium · 2013-10-17T00:48:13.137Z · LW(p) · GW(p)
Your model of a typical human mind seems fundamentally more charitable than mine.
Replies from: Luke_A_Somers↑ comment by Luke_A_Somers · 2013-10-17T18:19:29.934Z · LW(p) · GW(p)
If you're telepathic, you can feel their pain as your own. That's a game-changer.
Replies from: Lumifer, kalium↑ comment by Lumifer · 2013-10-18T00:16:42.390Z · LW(p) · GW(p)
Public torture of undesirables pour encourager les autres becomes a REALLY effective technique, don't you think?
Replies from: Luke_A_Somers↑ comment by Luke_A_Somers · 2013-10-18T16:32:59.352Z · LW(p) · GW(p)
I was thinking more along the lines of 'Oh shoot. I really hurt them, didn't I?'
Replies from: Lumifer↑ comment by Lumifer · 2013-10-18T16:52:43.215Z · LW(p) · GW(p)
Yes, I know.
However another likely thought would be "Good, it *should* hurt, you're a bad person".
Replies from: Luke_A_Somers↑ comment by Luke_A_Somers · 2013-10-18T17:34:09.680Z · LW(p) · GW(p)
Maybe. However, we have been told that this society is said to be happy and not devolved into a shit-flinging match, so I suspect that the way it usually works out is more benign. Or they kill/expel the infidels and our happiness sample is biased.
Replies from: Lumifer↑ comment by Lumifer · 2013-10-18T18:28:26.932Z · LW(p) · GW(p)
we have been told that this society is said to be happy and not devolved into a shit-flinging match
Leaving aside the credibility issues of such a claim, I can easily imagine a happy society where most people feel good about themselves and each Sunday they all gather in the public square to cheer at the burning of the witches who in their evil little hearts doubted the greatness and the benevolence of X. Pick X to suit.
↑ comment by Viliam_Bur · 2013-10-17T06:55:22.514Z · LW(p) · GW(p)
Just as normal people play a lot of signalling games, the telepathic society would probably invent a new layer of them. (Which does not prove that those games would be worse than those we have now.)
For example, people who need more time to think may be given more space and at the same time could be perceived as, e.g., less intelligent -- just as we automatically feel now about slow-speaking people.
comment by Manfred · 2013-10-16T03:27:21.484Z · LW(p) · GW(p)
I'd want to try the telepathic society before deciding, but would probably opt out. I like games, and I like some game-like social structures, and acting in ignorance of the other person's plans is an important ingredient in games. If I specifically want someone to know what I'm thinking, then we have a variant of telepathy called "talking" that I can employ.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T06:18:58.541Z · LW(p) · GW(p)
For the duration of the game, you can voluntarily not read other people's thoughts/feelings, just like you voluntarily don't peek into other players' hands in a card game. Well, at least I don't.
Replies from: Manfred
comment by TheOtherDave · 2013-10-15T23:29:00.036Z · LW(p) · GW(p)
If you're serious about the opt-outs looking like my current society, I go for the telepathic society in a heartbeat. Something is very very wrong with the opt-outs, because that just ought not be happening.
If you just mean that the opt-outs are an unmarked case, I want to explore the opt-out society for a while, and I might decide to stay there, depending on what it's like... though the telepathic society is still tempting.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T00:15:07.189Z · LW(p) · GW(p)
Something is very very wrong with the opt-outs, because that just ought not be happening.
I don't follow, why not?
Replies from: TheOtherDave↑ comment by TheOtherDave · 2013-10-16T00:27:11.142Z · LW(p) · GW(p)
Given all of the degrees of freedom that the scenario implies, these guys chose to replicate early-21st-century society with all their limitations? That suggests either a near-pathological lack of creativity, or a degree of attachment to my current society that feels unhealthy to me.
Admittedly, it's just a prejudice on my part, but you asked what I would do...
Replies from: kalium↑ comment by kalium · 2013-10-16T00:52:32.531Z · LW(p) · GW(p)
"normal mundane human society" has looked like a lot of different things. OP never specified a particular version.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2013-10-16T01:41:54.498Z · LW(p) · GW(p)
I took "your [..] society" to mean my society, but this is also why I gave two answers, depending on whether it actually meant my society or not.
comment by DanielLC · 2013-10-15T22:41:44.924Z · LW(p) · GW(p)
I don't see any reason why it would be easy to implement in the sim world. There can be apps that broadcast your thoughts, or even communication standards that require you to broadcast your thoughts, but if you don't give away your thoughts, and there's no exploiting security holes, then there's no telepathy.
I think I would be fine with telepathy, so long as I can preserve anonymity. That is, if I don't like what happened at the telepathic meeting, nobody can trace that to my non-telepathic identity.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-15T23:31:05.043Z · LW(p) · GW(p)
Sorry, I wasn't clear. To use a Facebook analogy, by opting in you designate your thoughts/feelings as public. So does everyone else in the opt-in society. Anyone can access anyone's inner state, and possibly its history, at any time they wish. There is no option to hide anything unless you opt out.
Replies from: knb, DanielLC, BaconServ↑ comment by knb · 2013-10-16T02:32:53.079Z · LW(p) · GW(p)
I would be amazed if any significant percentage of people opted in. I find the idea horrifying, like living in the 1984 world.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T16:36:03.765Z · LW(p) · GW(p)
How is it 1984 if there are no secrets? You have full access to the thoughts and motivations of whoever happens to be interested in performing a politician's job, assuming such a job still exists.
Replies from: Lumifer↑ comment by Lumifer · 2013-10-16T17:02:51.988Z · LW(p) · GW(p)
"Power grows out of the barrel of a gun".
If I have the power to switch you off, I don't care that you have access to my motivations. You still have to do what I say.
And don't forget:
He gazed up at the enormous face. Forty years it had taken him to learn what kind of smile was hidden beneath the dark moustache. O cruel, needless misunderstanding! O stubborn, self-willed exile from the loving breast! Two gin-scented tears trickled down the sides of his nose. But it was all right, everything was all right, the struggle was finished. He had won the victory over himself. He loved Big Brother.
Replies from: Luke_A_Somers, Strange7
↑ comment by Luke_A_Somers · 2013-10-17T18:21:06.886Z · LW(p) · GW(p)
Nowhere in this scenario is the possibility of getting turned off presented.
Replies from: Lumifer↑ comment by Lumifer · 2013-10-17T18:39:52.202Z · LW(p) · GW(p)
That's implicit in being an em. Your computing substrate loses power and you're no more.
Replies from: Luke_A_Somers↑ comment by Luke_A_Somers · 2013-10-18T16:35:43.578Z · LW(p) · GW(p)
It's implicit that someone can turn you off, but you were talking about the telepaths. If they get life and death power over you, that's an important thing to point out about the scenario.
Replies from: Lumifer↑ comment by Lumifer · 2013-10-18T16:56:36.452Z · LW(p) · GW(p)
Somebody has life-and-death power over you. Are you envisioning a scenario where the community of telepaths is "closed" and no information leaks out of it..? That doesn't sound likely.
Replies from: Luke_A_Somers↑ comment by Luke_A_Somers · 2013-10-18T17:32:45.594Z · LW(p) · GW(p)
The person who already has life-and-death power over you has up to now declined to kill you. That's a good sign.
If the community of telepaths would have that power over you, you lose that degree of assurance.
↑ comment by Strange7 · 2014-01-28T02:37:33.751Z · LW(p) · GW(p)
If I have the power to switch you off, I don't care that you have access to my motivations. You still have to do what I say.
No, I have to do what you want. If you tell me to do something but it shows in your thoughts that you don't actually want that particular thing badly enough to risk whatever consequences you'd face for killing (or, say, torturing... which is a lot harder to justify as a matter of limited resources) me if I don't comply, I can still afford to resist.
comment by polymathwannabe · 2013-11-01T23:15:23.874Z · LW(p) · GW(p)
Unless everyone has an equal ability to effortlessly switch telepathy on and off, separately for inbound and outbound communication, at any moment, I'm not for it.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-11-01T23:29:16.320Z · LW(p) · GW(p)
Why not?
Replies from: polymathwannabe↑ comment by polymathwannabe · 2013-11-02T00:06:12.929Z · LW(p) · GW(p)
If I'm set to be permanently "on," I can't control who reads what from my mind. If I'm set to be permanently "off," I'm not able to send special thoughts to my favorite people. I'd treat the question of whom I mind-talk to the same as the way I treat in real life the question of whom I mouth-talk to: case by case.
comment by Shmi (shminux) · 2013-10-20T14:54:33.525Z · LW(p) · GW(p)
SMBC take on shared experiences.
comment by Kyre · 2013-10-18T05:20:16.716Z · LW(p) · GW(p)
In the telepath society, is there a "polite" subculture where people don't look at each other's thoughts unless they're explicitly marked public (as a subcultural norm) ? If so I probably would opt in. I might opt in anyway, but I'm not sure.
I would probably opt in to a society where one could optionally publish verified, authenticated, non-repudiatable thoughts and/or feelings.
(In shminux's further specified "mandatory write / voluntary read Facebook stream" model, access to your thoughts would be effectively logged, because the accesses would appear in the accessor's stream. I presume that you could also be alerted to people accessing your stream.)
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-18T16:39:46.507Z · LW(p) · GW(p)
My model is indeed "mandatory write / voluntary read", possibly with an optional reading alert. As for this "polite subculture", given that "impolite" people cannot be forced to follow the public/private tag, I don't know how likely it is to emerge. Possible, of course, in the same way that passers-by in large cities or passengers in a public elevator don't meet each other's eyes. But that habit arises from some basic level of privacy, which simply does not exist to begin with among telepaths.
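(For concreteness, here is a minimal illustrative sketch of this "mandatory write / voluntary read, optional reading alert" access model; it is not part of the original thread's hypothetical, and all class and method names are made up for illustration.)
```python
# A minimal sketch (illustrative only) of the access model discussed above:
# every opted-in mind's stream is always readable ("mandatory write"),
# reading is voluntary, and each read can optionally alert the mind being read.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Mind:
    name: str
    thoughts: List[str] = field(default_factory=list)     # always "public" once opted in
    read_alerts: List[str] = field(default_factory=list)  # optional record of who read you

    def think(self, thought: str) -> None:
        # Mandatory write: every thought lands on the public stream.
        self.thoughts.append(thought)


class TelepathSociety:
    def __init__(self, alert_on_read: bool = True):
        self.members: Dict[str, Mind] = {}
        self.alert_on_read = alert_on_read

    def join(self, mind: Mind) -> None:
        self.members[mind.name] = mind

    def read(self, reader: str, target: str) -> List[str]:
        # Voluntary read: nothing is pushed; a member must actively look.
        stream = self.members[target].thoughts
        if self.alert_on_read:
            # Optional reading alert: the target learns who accessed their stream.
            self.members[target].read_alerts.append(reader)
        return list(stream)


# Usage: reads are voluntary but, with alerts on, they leave a visible trace.
society = TelepathSociety(alert_on_read=True)
a, b = Mind("A"), Mind("B")
society.join(a)
society.join(b)
a.think("I hope nobody notices this.")
print(society.read("B", "A"))   # ['I hope nobody notices this.']
print(a.read_alerts)            # ['B']
```
Under this model a "polite subculture" cannot be enforced, but with reading alerts it is at least auditable: anyone can see who has been looking at whom.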
Replies from: Kyre↑ comment by Kyre · 2013-10-19T02:02:57.123Z · LW(p) · GW(p)
I don't mean that everyone else respects the privacy of the polite subculture, just that they respect it internally, and they don't invade the privacy of those outside. Does that put them at a disadvantage? Of course; but so does not carrying a firearm in places with very permissive gun control laws, and yet lots of people choose not to.
I guess your hypothetical includes universal telepathy and evaporation of the idea of privacy. In which case I am fighting the hypothetical (sorry). But I think that while a universal end of privacy may well follow from the technical capability of universal telepathy, I can imagine other plausible, less socially uniform, outcomes.
comment by JDelta · 2013-10-17T14:12:58.605Z · LW(p) · GW(p)
A lot of the discussion in both the original post and the comments seems to be stuck in '21st century' (present day social) mindsets.
Firstly, the idea of there being only one or two societies for people to participate in seems highly unlikely in a simulation environment. Computation of the mind is much more complex than computation of the environment, meaning it's difficult to foresee a reason why engineers in charge of the 'sim' would limit the number of environments to anything less than whatever the individuals decide to dream up.
Of course it's likely that over time certain environments (IE 'planets' or 'spaceships' or whatever scenery/location/laws of physics) will be appealing to a large number of people, leading them to become much more popular and almost like a 'real life' for members who spend much of their life there.
Assuming two of the most popular of these environments were "telepath" and "non-telepath" mirrored versions of reality, many members would indeed spend time deciding which of these societies to live in.
This decision is much less final than decisions on Earth, however. If a person gets sick of one of these, they could presumably switch into the other environment with very little effort.
I believe you would see a bit of a split between those who enjoy the more 'primitive' enjoyments in life, who would choose to go with non-telepath, and those with a more spiritual/philosophical interest who would be more likely to start on the telepath route.
Sports, competition, games, sex, and many other recreational activities are, one would imagine, more fun in a non-telepath environment. However, I can only imagine the incredible rush of knowledge, wisdom and compassion that would come with being able to hear everyone's thoughts.
To a degree, one could assume a telepath society to be calm, happy, esoteric, and relatively laid back. Since everyone has dark secrets and nasty character flaws, only those with enough courage to lay bare their darkest secrets, and enough wisdom not to judge others would be able to enjoy a true uncensored mental link with someone else.
There is a middle ground, however, that I believe is most likely.
In a simulation environment:
- instant travel is trivial
- there are virtually unlimited resources
- any idea or concept, whether social or political, etc., will ultimately stand or fall on how many fellow players want to experience a reality where your idea is implemented in some way. I'm sure some Neo-Nazi player might try to create a Fourth Reich, and no doubt it would be a beautiful imitation of 1930s Germany, but it's hard to imagine many players being excited to join him in his new Nazi Germany locale, as free as he is to create and promote it.
One could easily imagine a scenario where the most popular 'reality' would be something like:
Clone of real earth except:
- positive thoughts are broadcast, negative or embarrassing ones are not. (presumably trivial to a computer that can simulate a planet and billions of minds)
- there are numerous large land masses where people can build their own cities and businesses instantly accessible by anyone.
- due to unlimited resources, lots of different venues exist, each with different 'rules' (ie program state) in regards to telepathy.
↑ comment by NancyLebovitz · 2013-10-18T23:01:24.936Z · LW(p) · GW(p)
I assume sex would be more fun in a telepathic environment, and likewise for dancing and making music. Competition probably wouldn't work.
I'd be delighted for Neo-Nazis to have their own virtual environment, with nothing to kill but NPCs, at least so long as I could trust they'd stay there.
Replies from: JDelta↑ comment by JDelta · 2013-10-23T00:06:42.546Z · LW(p) · GW(p)
True, I was referring more, I suppose, to the 'romantic' or 'chase' element of a sexual relationship which a lot of people find exciting.
And, yes, in a simulation environment, one can reasonably assume most desired realities would be implemented, as well as a massive degree of crossover allowable to the player, IE the ability to interact in a (sometimes limited) fashion with players who have chosen different environments. The possibilities are endless.
comment by Baughn · 2013-10-16T19:17:25.212Z · LW(p) · GW(p)
Pretty sure I'd opt out. I would instead try to start a society where skills, not feelings or thoughts, are the shared elements. Doing the necessary research if required.
I realise this may lead to a Hansonian hell-world, but if by this time there isn't a reliable means of blocking that outcome we're doomed anyway.
comment by kilobug · 2013-10-16T07:30:15.555Z · LW(p) · GW(p)
I would probably like it, at least if there is a way to keep a few thoughts private and/or to temporarily "disconnect". Like, I'm a big fan of Gaïa/Galaxia in the Foundation cycle.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T15:12:26.540Z · LW(p) · GW(p)
You are not forced to listen to everyone's thoughts and feelings. Or anyone's, for that matter. This part is voluntary.
if there is a way to keep a few thoughts private
What would be your reason for it?
comment by Shmi (shminux) · 2013-10-16T06:24:38.813Z · LW(p) · GW(p)
Huh, I'm surprised that some of the commenters expect to want to opt out. I'd love to live in a society free of status games, constant guessing of thoughts, feelings and intentions, relationship screwups, embarrassment... I am having trouble thinking of disadvantages of full telepathy+empathy. Well, maybe becoming too complacent or something. Assuming it's a bad thing.
Replies from: Lumifer, TheOtherDave, NancyLebovitz↑ comment by Lumifer · 2013-10-16T16:07:18.362Z · LW(p) · GW(p)
Huh, I'm surprised that some of the commenters expect to want to opt out.
We have interestingly different baselines.
I am surprised anyone wants to opt in.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T16:27:45.935Z · LW(p) · GW(p)
I'm still waiting for anyone here to articulate why not opt in.
Replies from: Lumifer↑ comment by Lumifer · 2013-10-16T16:53:31.718Z · LW(p) · GW(p)
Several reasons.
One is that you are basically defenseless. Your society of ems looks completely benevolent. May I suggest that in reality humans possess very considerable capability for malice and evil? You are making the assumption that future societies will enjoy considerable civil liberties, be democratic, etc. -- I see no good reasons for this assumption.
On a trivial level consider someone like a slightly psychotic disgruntled ex. She wants to make your life hell -- and sure, you can see that in her thoughts, she's not hiding that. But you can't hide from her either -- she can fine-tune her application of pain to you by watching real-time feedback: she wants you to suffer and she can see precisely what makes you suffer more.
On a less trivial level, consider what peer pressure looks like in the society without the privacy of your own thoughts. What if that society, for example, turns out to be religious (like the current societies)? Are you ready to be part of a shunned minority?
You're imagining a society which consists of people who look like the circle of your friends, just more of them. And sure, all your friends are kind and reasonable people, there isn't much to fear from them. But that's not the society you will get -- go read some popular media, some tabloids for a bit and imagine inhabiting the same mind space with these people.
Replies from: TheOtherDave, shminux, TheOtherDave, TheOtherDave↑ comment by TheOtherDave · 2013-10-16T17:52:14.613Z · LW(p) · GW(p)
On a trivial level consider someone like a slightly psychotic disgruntled ex. She wants to make your life hell -- and sure, you can see that in her thoughts, she's not hiding that. But you can't hide from her either
Sure.
In the real world, my psychotic disgruntled ex has the physical ability to stand outside my front door and sing Barry Manilow songs all day, has the physical ability to throw bags of flaming poop at my head, has the physical ability to blow my limbs off with a shotgun. In shminux' telepath-world, they also have the ability to fine-tune their abuse based on an accurate telepathic perception of my reactions, so they're able to get even nastier than that.
But the thing is, in the real world, my ex is not actually limited by their inability to fine-tune their abuse based on an accurate telepathic perception of my reactions. Long before they get to that point, we've collectively stepped in and done something about it.
So the question becomes, can we do anything about it in shminux' telepath-world? E.g., can incorrigible flaming-poop-throwers be exiled or otherwise intervened with?
You seem to be assuming they can't, but I don't see why that should be true.
Perhaps the problem is scale? I would agree, for example, that if it turns out that basically everyone is an incorrigible flaming-poop-thrower to everyone else in the privacy of our own minds and the only reason we don't notice in the real world is that we're ignorant of each other's true thoughts... well, sure, in that case the telepathic society would suck incorrigibly.
I don't think that's likely, though.
↑ comment by Shmi (shminux) · 2013-10-16T17:10:59.041Z · LW(p) · GW(p)
Interesting. I will have to think more about it. My immediate reaction is that many of the situations you describe will not have a chance to occur at all, but I don't have a good argument at this point, beyond "everyone can see your malicious intentions".
Replies from: knb↑ comment by knb · 2013-10-17T02:35:56.083Z · LW(p) · GW(p)
Most people who act evilly have their evil motives opaque even to themselves. The overwhelming majority of people think they are good and believe their motives are pure.
Replies from: FeepingCreature↑ comment by FeepingCreature · 2013-10-18T08:07:52.760Z · LW(p) · GW(p)
There might be a whole new class of conformance pressure due to being able to clearly perceive how everybody else sees you.
↑ comment by TheOtherDave · 2013-10-16T17:43:52.500Z · LW(p) · GW(p)
On a less trivial level, consider what peer pressure looks like in the society without the privacy of your own thoughts. What if that society, for example, turns out to be religious (like the current societies)? Are you ready to be part of a shunned minority?
I think about this a lot, but my expectations are radically different from yours. Peer pressure depends critically on maintaining the illusion of a uniform mainstream position. The awareness of just how much variance there actually is in real populations tends to destroy its effectiveness.
I expect a telepathic society to experience much less in the way of peer pressure than my current society, where 90% of the population can claim to believe X, even though they really don't, because they see that 90% of their neighbors are claiming X and they don't want to be singled out for defection.
But sure, if I'm wrong and the telepathic society turns out to be the kind of narrow-minded thoughtpolice scenario you have in mind, I will be surprised and regret my choice.
↑ comment by TheOtherDave · 2013-10-16T17:29:19.189Z · LW(p) · GW(p)
go read some popular media, some tabloids for a bit and imagine inhabiting the same mind space with these people.
It's possible that the reason I don't find this sort of reasoning compelling is that I'm just an unusually unsavory person, but there's nothing I find in popular media that doesn't resonate with some part of my own psyche, or that I expect doesn't have its analogs in my friends' minds.
Those parts aren't dominant, and I don't endorse them, but they're certainly there and I'm aware of them.
So the idea that it would be some kind of novelty to share a mind with such awful thoughts strikes me as sort of odd. I already do, I always have, I always will.
Sure, some of the awful thoughts will be novel... just as some of the brilliant, kind, and lovely thoughts will be novel... but I doubt it will be as much as a full sigma out from where I already am. I'm not some kind of unsullied snowflake whose purity ought not be besmirched.
And honestly, though I have no real way of knowing for sure, I doubt I'm all that unusual in this regard. I'm with Solzhenitsyn here: "the line dividing good and evil cuts through the heart of every human being."
↑ comment by TheOtherDave · 2013-10-16T14:15:34.975Z · LW(p) · GW(p)
FWIW, I'm entirely with you here, but I'm unsurprised by the responses. I've had variations of this conversation with people for years, and the "that would be awful!!!" reaction is by far the most common one I get from people who think seriously about it at all.
I'm not really sure what the difference is, though I have some theories.
From my perspective, my mind is already a cobbled-together collective constructed from lots of distinct and frequently opposed subunits (set A) that reside in my brain, and which interact in various ways both with each other and with subunits in other brains (set B).
Moving to a mode of living where the interactions between A and B are as high-bandwidth as the interactions within each set probably means I would stop identifying so much as set A. In the limit, that implies that the construct "I" currently refers to would stop existing in any particularly important way. All of me would instead be participating in a vast number of different constructs, including but not limited to "I".
That seems like a win to me, but I can sort of understand why people are averse to the idea.
Replies from: Viliam_Bur↑ comment by Viliam_Bur · 2013-10-16T14:27:27.346Z · LW(p) · GW(p)
You risk losing not only your identity but also your values. Merging with people whose values differ from yours (imagine: psychopaths) is not the same as merging with people who are very similar to you, just in different bodies.
Your old values could simply dissolve: "meh, that's some nonsense one of my old bodies used to believe, but I don't care about such stupid things anymore".
Replies from: TheOtherDave, TheOtherDave↑ comment by TheOtherDave · 2013-10-16T14:44:20.236Z · LW(p) · GW(p)
I'll add to the above that I run this risk in the no-telepathy scenario as well. I run that risk every day in the real world. I am not a well-designed intelligence with fixed values; I am a human being, and my brain experiences value drift in response to various inputs. This is perhaps a bad thing, but it's nevertheless true.
Yes, the risk intensifies as the number of interactions increases, and as the bandwidth of those interactions increases... it's easier to preserve the values I had as a child if I live in a small town with people who mostly share my values than if I move to a large heterogeneous city, and the risk would intensify still further in a scenario like a telepathic collective.
But I choose to live in a heterogeneous city rather than a homogeneous small town. I choose to read blogs written by people who don't share my values. Why would I not choose to live in a telepathic collective if that option were available? Is there some absolute threshold of acceptable risk of value drift I should avoid crossing?
↑ comment by TheOtherDave · 2013-10-16T14:34:59.740Z · LW(p) · GW(p)
Yes, I risk coming to identify as a mind that has different values than I currently identify with.
This doesn't really change anything, though. Those other values exist out there already, instantiated in running brains, and they are already having whatever effects they have on the world. The only difference is that currently they are tagged as "other", and that is enforced by the insulation between skulls. In the new world, they might get tagged as "me".
While I appreciate the fact that for many people this distinction is incredibly important, it just doesn't seem that important to me. To the extent that the existence of bad values in N distinct nodes of a system has bad consequences, it has the same bad consequences whether I tag one of those nodes as "me" or not.
Replies from: Viliam_Bur↑ comment by Viliam_Bur · 2013-10-17T07:07:11.284Z · LW(p) · GW(p)
These are very good points I wouldn't have thought about.
I guess my preference for 10% of people holding opinion X (while 90% hold non-X), over a hive mind that is 10% convinced of X and 90% not, has two sources:
1) Overconfidence about the ability of those 10% of people to somehow outsmart the remaining 90%. For example, if we speak about rationality, I hope the rationalists are able to win.
2) An intuition that if you mix food with crap, the result is not half-food-half-crap but crap. Again, specifically for rationality, having a few rational and many insane people might be better than having everyone mostly insane, or even waist-deep in the valley of bad rationality.
But both of these objections are pretty dubious.
In other words, I believe that rationality (or any other value) can somehow benefit from being kept separate, from having local power, as opposed to being a tiny minority in a large place.
Replies from: TheOtherDave, TheOtherDave↑ comment by TheOtherDave · 2013-10-17T15:51:41.173Z · LW(p) · GW(p)
if you mix food with crap, the result is not half-food-half-crap but crap.
To the extent that this is true, it follows that there are no rational humans, nor even half-rational humans, but only irrational humans. After all, every human mind is a mixture of rational and irrational elements.
↑ comment by TheOtherDave · 2013-10-17T15:48:58.734Z · LW(p) · GW(p)
FWIW, I agree that rationality (or any other value) can benefit from being densely concentrated rather than diffuse, which seems to be what you're getting at here.
To say that a little differently: consider a cognitive system S, comprising various cognitive agents. Let us label Sv the set of agents that are aligned with value V, and Snv the set of agents that oppose V. If I draw a graph of all the agents in S, how they interact with one another, and how strong the connections between them are, and I find that Sv has strong intra-set connections and weak inter-set connections with Snv, I expect S's judgments and behaviors to be more aligned with V than if Sv has weak intra-set connections and strong inter-set connections with Snv.
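For what it's worth, that intuition can be sketched numerically with a toy opinion-dynamics model in Python/NumPy. This is a minimal illustration under made-up assumptions (the 10/90 split, the connection strengths, the "anchoring" weight that keeps each agent partly tied to its own initial disposition, and the helper names are all invented for the example), not a claim about how real minds or telepathic links would work:

    import numpy as np

    def settled_stances(weights, initial, anchoring=0.3, steps=200):
        # Each round, a node's stance becomes a mix of its own initial
        # disposition (weight `anchoring`) and the weighted average of its
        # neighbours' current stances (weight 1 - anchoring).
        w = weights / weights.sum(axis=1, keepdims=True)  # row-normalise influence
        stance = initial.astype(float)
        for _ in range(steps):
            stance = anchoring * initial + (1 - anchoring) * (w @ stance)
        return stance

    n_aligned, n_opposed = 10, 90     # 10% of agents hold value V, 90% oppose it
    n = n_aligned + n_opposed
    initial = np.array([1.0] * n_aligned + [-1.0] * n_opposed)  # +1 = aligned with V

    def block_weights(intra, inter):
        # Connection strengths: `intra` within each camp, `inter` across camps.
        w = np.full((n, n), float(inter))
        w[:n_aligned, :n_aligned] = intra
        w[n_aligned:, n_aligned:] = intra
        return w

    # Concentrated: V-holders strongly connected to each other, weakly to the rest.
    concentrated = settled_stances(block_weights(intra=1.0, inter=0.01), initial)
    # Diffuse: everyone equally connected to everyone (one undifferentiated hive).
    diffuse = settled_stances(block_weights(intra=1.0, inter=1.0), initial)

    print("V-holders' settled stance, concentrated:", concentrated[:n_aligned].mean())
    print("V-holders' settled stance, diffuse:     ", diffuse[:n_aligned].mean())

Run as written, the concentrated case leaves the V-holders' settled stance clearly positive (roughly +0.7), while the fully connected case pulls them negative (roughly -0.3): the same agents and the same values, differing only in connectivity. The anchoring term stands in for the fact that agents aren't pure averagers; drop it and any connected graph eventually collapses to a single consensus, which would wash the effect out entirely.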
I just don't think it matters very much whether those connections are between-mind connections or within-mind connections. It matters enormously in the real world, because within-mind connections are much, much stronger than between-mind connections. But the whole point of telepathy is to make that less true.
And I think it matters even less where the label "Dave" gets attached within S, though in practice in the real world I tend to attach that label to a "virtual node" that represents the consensus view of the set of agents instantiated in my brain, thanks to that same within/between distinction. And again, telepathy makes that distinction less important, so where "Dave" gets attached within S is less clearly defined... and continues not to matter much.
Replies from: Viliam_Bur↑ comment by Viliam_Bur · 2013-10-17T16:31:14.870Z · LW(p) · GW(p)
Yes, it's about concentration. I imagine that some things are multiplicative; for example, traits like "learns a lot about X" and "spends a lot of time doing X" give better output if they happen to be the traits of the same person (as opposed to one person who learns a lot but does nothing, and another person who does it a lot but doesn't understand it).
It's not just about agents, but about resources like memory. I don't know how well and how fast the telepaths could use each other's memory, or habits, or mental associations, or things like this. It seems more efficient if "caring about X" and "remembering many facts about X" are in the same person; otherwise there are communication costs.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2013-10-17T17:11:18.140Z · LW(p) · GW(p)
I don't know how well and how fast the telepaths could use each other's memory, or habits, or mental associations, or things like this.
Sure. To the degree that telepathy does not successfully bridge the distance between minds, so that between-mind operations remain less efficient than within-mind operations, I agree with you... in that case, a telepathic society is more like the real world, where minds are separate from one another, and the between/within-mind distinction matters more.
But (for me) the important issue is the degree of internode connectivity. Whether those nodes are in one mind or two is merely an engineering detail.
traits like "learns a lot about X" and "spends a lot of time doing X" give better output if they happen to be the traits of the same person (as opposed to one person who learns a lot but does nothing, and another person who does it a lot but doesn't understand it).
Similarly to the above, I agree completely that they give better output if they are tightly linked than if they are loosely linked or not linked at all. I would say that whether this tight linkage occurs within one person or not doesn't matter, though. Again, in the real world we can't separate them, because tight linkage between two people is not possible (I can't use your knowledge to do things), and if telepathy doesn't help us do this then we also can't separate them in the OP's hypothetical.
↑ comment by NancyLebovitz · 2013-10-16T07:39:24.523Z · LW(p) · GW(p)
Imagine being in a telepathic linkage with people who are habitually very angry (perhaps especially at people like you), depressed, cruel, or unfocused, or who have whatever mental/emotional traits most get on your nerves.
Now, it's possible that the telepathic society has ways of moderating those effects-- this is suggested by the fact that most people who join it stay there, though it's also conceivable that it has really strong propaganda. It may also be that a lot of mental dysfunction is caused by fear of being alone, and a telepathic society alleviates that.
It's also possible that the telepathic society is good for most people, but there are some people who are just cross-grained to it. It actually wouldn't surprise me if there's more than one telepathic culture in addition to people who want to be like 21st century humans.
In short, it isn't obvious to me that there's something wrong with people who don't want to live in the telepathic society.
Replies from: shminux↑ comment by Shmi (shminux) · 2013-10-16T16:31:55.987Z · LW(p) · GW(p)
Imagine being in a telepathic linkage with people who are habitually very angry (perhaps especially at people like you), depressed, cruel, or unfocused, or who have whatever mental/emotional traits most get on your nerves.
I was unclear. Your own thoughts are public, but you are not forced to read everyone's thoughts.
Replies from: NancyLebovitz↑ comment by NancyLebovitz · 2013-10-16T16:53:18.891Z · LW(p) · GW(p)
That's intriguing, but I wonder how it would play out in practice. Suppose you're concerned that some interaction is going badly because of malice towards you. You can check, and the good news might be that it was an honest mistake on someone's part.
On the other hand, how would a limited telepathic society of the kind you describe handle malice?
Replies from: TheOtherDave, shminux↑ comment by TheOtherDave · 2013-10-16T17:31:10.372Z · LW(p) · GW(p)
Well, let's start with what we know and build out from there: how would you characterize how our current societies handle malice?
↑ comment by Shmi (shminux) · 2013-10-16T17:07:52.436Z · LW(p) · GW(p)
On the other hand, how would a limited telepathic society of the kind you describe handle malice?
Presumably malicious people would naturally be shunned, as most others would recoil in disgust from their thoughts. They would also be unable to cause any serious harm, since their intentions are open to scrutiny. And I imagine that in a society where people come to rely on routinely going through each other's mental states, being ignored is a big downside. Tight feedback loop.
Replies from: Lumifer, NancyLebovitz↑ comment by NancyLebovitz · 2013-10-16T17:41:58.903Z · LW(p) · GW(p)
Suppose you have some strong preference that is generally hated-- imagine that the telepathic society had started before the efforts to make homosexuality socially acceptable.
There's still quite a bit of prejudice, but it's not universal.
It's possible that telepathy would lead to the mainstream realizing that there's nothing especially wrong with homosexuality, but that's hardly guaranteed.
The thing you're missing is that malice directed against people one doesn't like can be quite a strong pleasure.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2013-10-16T18:04:37.057Z · LW(p) · GW(p)
How would you characterize the process that resulted in homosexuality becoming socially unacceptable in the first place? And how would you characterize the process that resulted in homosexuality becoming increasingly socially acceptable?
In my experience, an important part of the former process is marginalizing the unacceptable minority and encouraging the "mainstream" to think of them as basically alien. "Othering" them, to use a bit of popular jargon. And an important part of the latter process is getting people to acknowledge the actual perspectives of the unacceptable minority.
I expect the former to be a lot more difficult and the latter easier when we can all experience their thoughts.
So, yeah, I expect it to be a lot harder in the shminux-telepathy scenario to get these sorts of arbitrary strong hatreds started in the first place, and a lot easier to get rid of them. Is it guaranteed? No, of course not. But I like my odds a lot better than in the "normal" society, where harmful prejudice is demonstrably possible. (To put it mildly.)
The thing you're missing is that malice directed against people one doesn't like can be quite a strong pleasure.
Sure, of course it is, agreed.
Smashing people's windows in the real world can be a hoot, too.
And yet, despite the fact that we all have the physical ability to smash each other's windows, it somehow turns out that most windows stay unsmashed.
Why do you think that is?
For my part, I think it's because most people are capable of abstaining from an act that would be pleasurable if the act is sufficiently antisocial, and generally choose to do so.
Replies from: NancyLebovitz↑ comment by NancyLebovitz · 2013-10-16T23:14:50.371Z · LW(p) · GW(p)
You're giving reasons why it might work-- I still think my reasons are strong enough that it's reasonable not to be an early adopter.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2013-10-16T23:59:18.399Z · LW(p) · GW(p)
No, I'm not just saying "it might work". As I said, I like my odds a lot better in the telepathic society than in the "normal" society, for the reasons I gave.
If you disagree with me, and think your odds are better in the normal society, that's a good enough reason to opt out. Which is fine. But I've made a claim and you disagree with it.
I have no idea where "early adopter" comes from here; in this scenario both societies have existing members.