How to notice being mind-hacked

post by shminux · 2019-02-02T23:13:48.812Z · LW · GW · 22 comments

Epistemic status: quite sure, but likely nothing new; I have not done the requisite literature search.

The human mind is not designed with security in mind. It has some defenses against the basic adversaries that would otherwise have prevented our survival as a species, but not much more than that. It is also necessarily open to external influences, because humans are social animals and cooperation is essential for survival. So any security expert would be horrified at how vulnerable humans are to adversarial mind hacking. We generally do not like to accept how easy we are to sway, and how often it happens to us, but we can definitely see other people being easily influenced, and most of us aren't special in terms of mind security.

Another common term for it is "manipulation," but there is a slight difference: manipulation generally presumes that the interests of the manipulator are detrimental to the mind being manipulated. Mind hacking does not have to carry this negative connotation.

So, given that our minds are security sieves, that we live in a world where influencing others (yet another term for mind hacking) is ubiquitous, and that we have certainly been mind-hacked by others over and over again, how does one notice a hack (an unauthorized breach of mind security), whether it is about to happen, is in progress, or has already taken place? I am limiting the scope to just noticing. I am not implying that one has to try to stop a mind hack in preparation or in progress, or to undo it after the fact. Descriptive, not prescriptive.

Let's start with a few obvious examples.

Your friend, noticing your distress, invites you to their church event, just to get your mind off things. A month later, you have converted to their faith, quote scriptures, believe in salvation and dedicate your life to spreading the gospel.

Or you come across a book, say, HPMoR or From AI to Zombies (I take partial credit/blame for the latter name), learn about rationality, get blown away by Eliezer's genius, and, next thing you know, you are at a local x-risk meetup worrying about an unaligned AI accidentally paper-clipping the universe and donating 10% of your income to an EA cause.

Or you pick up a Siouxsie and the Banshees CD in a record store (back when CDs and record stores were a thing), and soon you are a part of the goth subculture, deathhawk up every weekend, your carefully crafted Rihanna mixtape (another anachronism) gathering dust in the back of the bottom drawer.

Or maybe you end up at a kink munch, seemingly out of idle curiosity, then at a play party; then you discover your submissive side, dump your vanilla partner, go on a sub frenzy, and eventually settle down as a slave to a Master/Mistress.

Not all mind hacks are as striking. But these somewhat extreme, yet also mainstream, examples are a good place to start the analysis. Some salient features:

- A significant shift in identity, one that your pre-hack self would likely not have endorsed.
- Intense feelings accompanying the shift, whether of insight and enlightenment or of guilt, shame, and remorse.

The above suggests how to notice the event post hoc (post hack?). The identity disconnect and the feelings around it are a telltale sign.

Noticing a hacking attempt or a hack in progress is probably harder. When skillfully executed, it never rises to the conscious level. You don't necessarily consciously notice your identity changing. Instead, you may be swept up in feelings of insight, of being wowed or enlightened, or the opposite, intense guilt, shame, and remorse, and often some combination of both. And even if we do recognize it for what it is, these same intense feelings can be too addictive for us to break the spell; we crave them more and more and rationalize away what is happening. So, to provisionally answer the title non-question: watch out for the mind-hack-associated feelings.

What have been your experiences with noticing being mind-hacked, intentionally or accidentally, or with doing it to others, whether on purpose or not?

22 comments

comment by mako yass (MakoYass) · 2019-02-03T00:38:24.991Z · LW(p) · GW(p)

I don't see how you can frame these as exploits or value shifts. If someone had told me I was going to get really into AGI alignment I would have said "uh I don't know about that" (because I didn't know about that), but I would not have said "that would definitely be bad, and it shouldn't be able to happen".

As far as I can tell, most cultural conversion processes are just boundedly rational updates in response to new evidence.

Goths are just people who have realised that they need to be able to operate amid gloom and sadness. It is an extended confrontation of the world's most difficult aspects. They clothe themselves in gloom and sadness so that others recognise that they are serious about their project and stop saying unhelpful things like "cheer up" and "stop being so weird". They have looked around and seen that there are many problems in the world that no one will face, so they have decided to specialise and give voice to these things. There isn't really anything wrong with that. Many societies had witches. They're probably a crucial morph in the proper functioning of a tribal superorganism.

Kinks are just distorted reflections of unmet needs, and exploring them can help a person to work through their problems.

If you are afraid of potential future identity shifts, that might be a problem. You should expect profound shifts in your worldview to occur as you grow, especially if there are (and there probably still are) big holes in your theory of career strategy, metaphysics, or self-knowledge. I know there are still holes in mine.

I didn't address the converting to religion example. It is a correct example, probably... Maybe. I can think of plenty of adaptive reasons an epistemic agnostic might want to be part of a church community. But even if you can get me to agree that it's correct, conversions like that are fairly rare and I have no idea what it would feel like from the inside, so it doesn't seem very informative. I'm sure there are books we can read, but... I must have looked at accounts of naturalist→Christian conversions in the past and I couldn't make much sense of them. Maybe that means I should look closer, and try harder to understand. Maybe I should be more terrified by those stories than I am.

Replies from: Pattern
comment by Pattern · 2019-02-05T19:10:22.694Z · LW(p) · GW(p)

If identity shifts are good, can an identity shift to an unchanging state be bad?

Replies from: MakoYass
comment by mako yass (MakoYass) · 2019-02-06T03:16:27.042Z · LW(p) · GW(p)

Judging by the kinds of attitudes I see in myself and in elders, I think humans evolved to get stuck somewhere eventually. We did not evolve to live through so much change and adjust to it. Presumably there are some design benefits to this: specialisation, commitment. In this era those are probably outweighed by the costs.

comment by avturchin · 2019-02-03T09:28:19.941Z · LW(p) · GW(p)

This seems to be true, but it changes the nature of truth: truth is just an effective hack in which you start to believe.

Side note: I once took acid, and afterwards I became hyper-suggestible. A person near me said that he had an allergy to strawberries, and I started to have panic attacks when I ate strawberries, despite knowing that I don't have this allergy.

Replies from: shminux
comment by shminux · 2019-02-03T09:47:51.603Z · LW(p) · GW(p)
> truth is just an effective hack in which you start to believe.

Exactly. That's one reason I dislike using the terms "true" and "fact" and instead prefer something like "a good model of..."

Replies from: Richard_Kennaway, avturchin
comment by Richard_Kennaway · 2019-02-04T13:44:48.250Z · LW(p) · GW(p)

What makes a model good, or, to allude to a much-quoted aphorism of Box that I find rather irritating, useful? What do you want to do with a model, such that you can rate the model on its fitness for that purpose?

Replies from: shminux
comment by shminux · 2019-02-04T16:06:29.279Z · LW(p) · GW(p)

A useful model "pays rent", in Eliezer's words. Its predictions match future observations. It does not need to be related to any deep underlying truth, assuming one existed. It is also contextual, not absolute. It varies between people and cultures. Epicycles worked for astronomy and astrology for some centuries. Belief in God among religious people gets you to socialize and be accepted by the community, with all the associated perks, and so is useful, and therefore "good", if thriving in your community is what you value. If, on the other hand, self-consistency is what you are after, faith would not pay rent and you need to find a "better" way to make sense of the world.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2019-02-05T13:41:11.504Z · LW(p) · GW(p)
> Epicycles worked for astronomy and astrology for some centuries.

Nothing works for astrology.

> Belief in God among religious people gets you to socialize and be accepted by the community, with all the associated perks, and so is useful, and therefore "good", if thriving in your community is what you value.

In practice, you are expected to actually believe, not merely pretend to believe -- that is, lie your way through every religious ritual.

> If, on the other hand, self-consistency is what you are after, faith would not pay rent and you need to find a "better" way to make sense of the world.

Not a "better" way, but a better way. Reality has no inverted commas.

As it happens, my bicycle has developed a couple of mechanical problems. I already have a rough idea of what needs to be done, but the first thing I need to do when I have the time is examine it to discover exactly what needs to be done -- to discover what is true about the faults, and so be able to replace exactly the parts that need to be replaced, clean the parts that need cleaning, lubricate what needs lubricating, and adjust what needs adjusting. This talk about usefulness is an evasion of reality. What is useful to me regarding my bicycle is the truth about it, nothing less.

Whatever you find useful, if you are serious about it, you will find that you need to know the truth about it, to know what will achieve your purposes and what will not.

Replies from: shminux
comment by shminux · 2019-02-05T16:16:56.715Z · LW(p) · GW(p)

Seems we are talking past each other. You cannot imagine a way of thinking in which "reality" and the map/territory distinction are not primal, and I am tired of explaining why, in my view, this way of thinking is nice to start with but eventually harmful; I've been doing it on this blog for over five years. Take care.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2019-02-06T15:44:18.812Z · LW(p) · GW(p)
> Seems we are talking past each other.

It seems to me that we are talking directly to each other's statements.

> and I am tired of explaining why, in my view, this way of thinking is nice to start with but eventually harmful; I've been doing it on this blog for over five years.

And I've been contesting it even longer, although less frequently.

> Take care.

You too. I am happy to leave a conversation as soon as it appears that everything has been said, whether or not anyone has been convinced of anything. But I would still like to know how you would set about diagnosing and repairing a faulty bicycle. It is in simple, everyday matters like these that one can best see what pays rent and what does not.

comment by avturchin · 2019-02-03T10:30:56.220Z · LW(p) · GW(p)

The problem is that a "good model" is a "viral" model, not a predictive one. Being predictive helps a model to be more viral, but it is not necessary.

Replies from: shminux
comment by shminux · 2019-02-03T18:27:17.832Z · LW(p) · GW(p)

Right, certainly there are models that go viral without being accurate/predictive in scientific terms. They are nonetheless useful, at least in the context in which they proliferate. They often attain the status of "truth" and "fact", and in that way they are indeed "good models".

comment by Ustice · 2019-02-03T20:58:31.062Z · LW(p) · GW(p)

I'm willing to accept your meaning when you say "mind hack," but each of your examples reads as a personal epiphany. From the inside, I think it feels like, "Wow! Yeah!" It's generally preceded by similar, smaller moments.

I have worried about this when I encountered some of the neoreactionary ideas here and in related communities. I could see myself -- given that I have seen people who have had such shifts in thought -- being swayed by reasonable arguments, adopting what I currently believe to be repugnant conclusions, and thus spreading darkness in the universe.

Ultimately, I decided that if I change my mind, it will have been because of some very good evidence, and I can only trust that my future self carefully considered the evidence and was persuaded. Every idea that can be banished by the truth should be.

I believe that being open to ideas is the optimal state, but this leads to the question, "Should we be open to the idea that we shouldn't be open to ideas?" Are there ideas so repugnant that, no matter how much evidence indicates they describe a more optimal state of the universe, it is better not to know them?

I think there aren't, but there are ideas that one should be cautious with. Evidence can be misleading. That's why I talk to my son about the things I see as right and wrong in the world. The Dunning-Kruger effect can be nasty. It's sometimes hard to know that you're wrong.

So for me, when I have "Wow! Yeah!" moments about something I don't like, I have others who can point out flaws that I miss, but I won't guard against it further than that.

Replies from: Pattern
comment by Pattern · 2019-02-05T19:07:41.905Z · LW(p) · GW(p)

Suppose we are open to ideas for a reason.* Then we would need a greater reason still to not be so.

*This practice is associated with an idea about ideas, and might be applied only to lesser ideas. (Or it might apply to a degree inversely proportional to the idea's level. For instance, proving that all actions are equally useful requires much more evidence than proving that one action is more than, less than, or equal in value to another.)

comment by ChristianKl · 2019-02-03T12:19:50.490Z · LW(p) · GW(p)

It sounds like you classify every event where somebody else has an effect on you that makes you change your identity as mind hacking. Why do you find that classification useful?

Replies from: shminux
comment by shminux · 2019-02-03T18:21:30.413Z · LW(p) · GW(p)

By analogy to actual hacking, where the hacked system behaves significantly differently from the original, in ways neither intended nor desired by the original system, and often to the hacker's benefit.

Replies from: ChristianKl
comment by ChristianKl · 2019-02-03T19:58:57.048Z · LW(p) · GW(p)

This suggests that people have an intention to be closed-minded and to not change their self-identity in the face of any evidence that doing so might be beneficial to them.

I myself don't have any intention of preventing changes to my self-identity, and it seems foolish to categorically reject them.

comment by Viliam · 2019-02-06T02:05:20.511Z · LW(p) · GW(p)

Another frequent feature of a mind hack is that suddenly there is an important authority, which wasn't important before (probably because you were not even aware of its existence).

In the case of manipulation, the new authority would be your new guru, etc.

But in the case of healthy growth, for example if you start studying mathematics or something, it would be the experts in the given area.

comment by Elo · 2019-02-04T05:54:30.340Z · LW(p) · GW(p)

Identity is supposed to shift. If your identity is fixed, why would that be a good thing? It seems like that's the self-delusion bug.

comment by Pattern · 2019-02-02T23:57:04.000Z · LW(p) · GW(p)

What makes "conversion" different from "deconversion"? (Aside from a Life of Pi scenario where someone converts to three religions.)

Replies from: Viliam
comment by Viliam · 2019-02-06T01:59:33.283Z · LW(p) · GW(p)

It doesn't always have to be like this, but it seems to me that the process of conversion often includes installing some kind of threat: "If you stop following the rules, all these wonderful and friendly people will suddenly leave you alone, and you will also suffer horrible pain in hell." So the mind of a converted person automatically adds a feeling of danger to sinful thoughts.

The process of deconversion would then mean removing those threats. For example: being gradually exposed to sinful thoughts and seeing that there are no horrible consequences; realizing that you have close friends outside the religious community who won't leave you if you stop going to church; etc.

More generally: less freedom vs more freedom. (An atheist is free to pray or visit a church, they just see it as a waste of time. A religious person can skip praying or church, but it comes with a feeling of fear or guilt.)

comment by Richard_Kennaway · 2019-02-04T13:48:35.947Z · LW(p) · GW(p)

What you do changes who you are.

That includes whatever you do to avoid this happening.