How would not having free will feel to you?

post by Shmi (shminux) · 2013-06-20T20:51:33.213Z · LW · GW · Legacy · 70 comments


Given the recent spike in free-will debates on LW (blame Scott Aaronson), and the usual, potentially answerable meta-question "Why do we think we have free will?", I am intrigued by a sub-question: "What would it feel like to have/not have free will?" The positive version of this question is not very interesting; almost everyone feels they have free will almost all the time. The negative version is more interesting, and I expect the answers to be more diverse. Here are a few off the top of my head, not necessarily mutually exclusive:

Epistemic:

  • Knowing that someone out there already predicts my behavior perfectly
  • Knowing that I am in a simulation

Psychological:

  • Observing myself act in ways I never intended to act, whether beneficial to me or not
  • Feeling unable to complete thoughts I would like to think through, as if someone censored them

Physical:

  • Observing my arms/legs/mouth move as if externally controlled, and being unable to interfere
For me personally, some of these are closer to the feeling of "no free will" than others, but I am not sure whether any single one crosses the boundary.

I am sure that there are different takes on the answers and on how to categorize them. I think it would be useful to collect some perspectives and maybe run a poll or several afterward.

70 comments

Comments sorted by top scores.

comment by Jack · 2013-06-20T22:04:38.704Z · LW(p) · GW(p)

Traditionally, the philosophical literature recognizes a distinction between fatalism and causal determinism. Causal determinists don't say that human actions and thoughts don't or can't affect the future, just that those actions and thoughts are themselves causally dependent on prior conditions and the laws of nature. Fatalism is generally defined as something like "no matter what you think or do, the future cannot be altered." It's Oedipus killing his father despite doing everything he can to avoid it.

The psychological and physical answers above sound to me more like what it would be like if fatalism were true than if hard determinism were true. I mean, hard determinists think we don't have free will; it would be odd if they thought not having free will involved novel psychological or physical experiences.

Anyway, I think the only kind of free will that makes sense is compatibilist free will. So when it comes to what it would be like not to have free will, I'm concerned with what it would be like if my decisions weren't made according to my wishes and reasons.

I would expect to have a very hard time predicting my behavior or explaining it after the fact with any kind of rational model. If I did have reasons and preferences to back up my decisions they would probably be invented after the decision had been made. Post-facto rationalizations instead of genuine reflection about the decision. Also, I would probably feel like I had a lot of reasons and preferences that contradicted each other and would often find myself doing x even though I wanted to want to not do x.

Rest assured, I have never experienced any of that.

Replies from: komponisto, shminux
comment by komponisto · 2013-06-21T01:33:55.802Z · LW(p) · GW(p)

Fatalism is generally defined as something like "no matter what you think or do, the future cannot be altered." It's Oedipus killing his father despite doing everything he can to avoid it.

Interestingly, fatalism in that sense is compatible with free will as I would understand it: Oedipus can make any choice he likes, it's just that some specific future event is guaranteed to happen (the particular way in which it happens depending on the choice he makes).

Perhaps one could distinguish between "local" and "global" free will, or -- even more sensibly -- among degrees of ability to influence the future. (We're all in Oedipus's position to some extent, since no matter what we do, the sun is going to rise tomorrow, etc.)

comment by Shmi (shminux) · 2013-06-20T22:27:41.675Z · LW(p) · GW(p)

I understand, sort of, the positions in the free will debate. I am mostly interested in the cognitive part of it, what would it feel like to not have one, in order to pinpoint what people really mean when talking about free will, as opposed to what they say/think they mean. I found almost no discussions of this among philosophers, not surprisingly.

I would expect to have a very hard time predicting my behavior or explaining it after the fact with any kind of rational model. If I did have reasons and preferences to back up my decisions they would probably be invented after the decision had been made. Post-facto rationalizations instead of genuine reflection about the decision. Also, I would probably feel like I had a lot of reasons and preferences that contradicted each other and would often find myself doing x even though I wanted to want to not do x.

Rest assured, I have never experienced any of that.

Of course not, neither have I. That's just crazy talk.

Replies from: None
comment by [deleted] · 2013-06-21T20:44:29.656Z · LW(p) · GW(p)

I found almost no discussions of this among philosophers, not surprisingly.

Not surprisingly because there are so many thousands of articles and books on free will that it's hard to sort through them, or not surprisingly because you found that philosophers (as you expected) did not discuss the phenomenology of free will in those many thousands of articles and books?

In my experience, it is almost impossible to find any (not obviously false) idea that hasn't been pretty thoroughly discussed at some point. One of the greatest things about, say, the idea of FAI is that friendliness as a formal decision procedure is a genuinely new idea. That's extremely, fantastically hard to do. I expect the problem isn't going to be finding philosophers who discuss this very question, but sorting through the mountain of such discussion for anything good or interesting.

PhilPapers has a pretty okay search engine. I found this article by John Searle searching for 'free will phenomenology'. I didn't read it, but the abstract leads me to believe it has some discussion of the phenomenology of free will.

Replies from: shminux
comment by Shmi (shminux) · 2013-06-21T21:59:20.003Z · LW(p) · GW(p)

Uh, sorry, I should have phrased it differently. What I meant was not just that this angle is probably not very popular, but also that it is hard to find, given that the specific language philosophers would use is unfamiliar and non-obvious to someone outside the field. Additionally, it is a topic more likely to be studied in neuroscience, psychology or even psychiatry than in philosophy of mind. Routine paywalling doesn't help, either. But yes, I also admit to a certain prejudice against a discipline with multiple warring schools arguing opposite points and no ability to reconcile them. It's as if physics were mostly about arguing over interpretations of QM.

Anyway, thanks for the links, I'll see if I can find something relevant. Feel free (as in "free will") to link anything else you come across, as well. I'm looking at the Searle article you linked (pdf), and it has one working definition of the feeling of free will:

I did not sense the antecedent causes of my action as setting causally sufficient conditions. I did not sense the reasons for making the decision as causally sufficient to force the decision, and I did not sense the decision itself as causally sufficient to force the action. In typical cases of deliberating and acting, there is, in short, a gap, or a series of gaps between the causes of each stage in the processes of deliberating, deciding and acting, and the subsequent stages. If we probe more deeply we can see that the gap can be divided into different sorts of segments. There is a gap between the reasons for the decision and the making of the decision. There is a gap between the decision and the onset of the action, and for any extended action, such as when I am trying to learn German or to swim the English Channel, there is a gap between the onset of the action and its continuation to completion. [...] Without the conscious experience of the gap, that is, without the conscious experience of the distinctive features of free, voluntary, rational actions, there would be no problem of free will.

There is some more here.

I found no descriptions like "perception of lack of free will may manifest in the following ways..." As a result, the definition above is directly contradicted by some of the no-free-will accounts posted in the comments to the OP. That it takes only one post by an amateur on an online forum to poke holes in a well-cited paper by a renowned professional philosopher is not very encouraging.

Replies from: None
comment by [deleted] · 2013-06-21T22:31:38.271Z · LW(p) · GW(p)

I dunno, that description seems to me to capture in a general way most of what people have pointed to as the experience of (or lack of) free will. Searle might say that the experience of the lack of free will is the experience of there being no such gaps where we generally expect them. That is, the experience of antecedent causes or reasons being causally sufficient for an action in the way perceptions and the causes of perceptual experiences are causally sufficient to make me believe that there's a tree in front of me.

I mean, in some sense anyone who gives you an answer to the question 'how does it feel to have/not have free will', where 'free will' is understood as metaphysical free will (the kind that's at stake in discussions about determinism, say) is confused. Metaphysical free will or lack thereof can't feel like one thing or another. We can however distinguish between free will (in a non-metaphysical sense) and coercion, or free will in action and the kind of non-free relationship we have with our perceptual beliefs. And the 'gap' thing is a fair account of that phenomenological distinction.

Replies from: shminux, shminux
comment by Shmi (shminux) · 2013-06-21T23:14:28.783Z · LW(p) · GW(p)

TheOtherDave gave one first-hand contradicting account. There the experience of "no free will" came from too large a gap, not from not having a gap. Alternatively, one can think of the feeling of being compelled and unable to resist some perceived external or internal force as "lacking free will", like an addict in the movie Flight both dialing her dealer and praying he wouldn't answer. The gap is still present, but what is absent is, in Searle's words, the stages of deliberating and deciding.

We can however distinguish between free will (in a non-metaphysical sense) and coercion, or free will in action and the kind of non-free relationship we have with our perceptual beliefs.

I am not sure what this "non-metaphysical sense" is. Perceptual? Then it seems like a tautology.

And the 'gap' thing is a fair account of that phenomenological distinction.

I don't see how the 'gap' disappears in the above examples.

comment by Shmi (shminux) · 2013-06-22T03:34:39.257Z · LW(p) · GW(p)

Eh, I wasn't fair in my other reply. The idea of a gap seems like a neat one, and probably matches some of the free-will experiences, just not all or even a majority of them.

comment by Leonhart · 2013-06-21T12:23:39.738Z · LW(p) · GW(p)

It's all about the physical components. Not having free will feels like sleep paralysis; it's a disconnection from your muscles.

Have you ever been playing an immersive video game, for example a story-driven FPS, and become completely used to looking around, moving etc. with the controls; and then hit a point where the developers take that control away to do something narrative? Suddenly your head turns to look at something and you didn't tell it to? The fraction of a second of vertigo and confusion, before you remember you're playing a game? That's no-free-will.

Feeling unable to complete thoughts I would like to think through, as if someone censored them

I can't make sense of this one. How would you even tell? You can't have an endless tower of meta-thoughts monitoring the first thought to see if it halted.

Replies from: TheOtherDave, Luke_A_Somers
comment by TheOtherDave · 2013-06-21T15:01:47.808Z · LW(p) · GW(p)

How would you even tell?

I've had the subjective experience of feeling unable to complete thoughts I would like to think through, mostly as a consequence of post-traumatic stress.

It most often manifests as something like "A, therefore B, therefore C, therefore oh look a bird!" repeating over and over, along with an intuitive feeling that all the aborted chains are pointing roughly in the same direction. It sometimes manifests as "A, therefore NOT B" in cases where if I abstract over the particulars sufficiently it's obvious that A -> B, and I can't figure out why on earth I concluded NOT B in any particular instance. (Often followed by "A, therefore NOT B" in the particular instance again.)

I typically describe both of those experiences as my mind skittering off the surface of a thought rather than engaging with it, though the actual experience isn't kinesthetic like that at all.

That said, I don't experience any of that as being at all related to the sensation of "free will" the OP is talking about.

Replies from: NancyLebovitz, Leonhart
comment by NancyLebovitz · 2013-06-21T16:05:25.207Z · LW(p) · GW(p)

This points at the question of how much free will people who believe in free will think they have. There might be a difference between free will and omnipotence over one's own thoughts and decisions.

comment by Leonhart · 2013-06-21T15:23:00.492Z · LW(p) · GW(p)

Yikes. Another thing I hadn't realised could actually happen in real life.

comment by Luke_A_Somers · 2013-06-24T14:02:13.131Z · LW(p) · GW(p)

I don't remember that, but I do remember gorging my character on food because I was hungry.

Took me a minute to figure that one out.

comment by gjm · 2013-06-20T23:19:16.857Z · LW(p) · GW(p)

Your question seems ambiguous between these two:

  • If we didn't have free will, what would it feel like?
  • What sorts of feelings-like would give us the impression of not having free will?

Of course the answer to the first might turn out to be "exactly like we feel now, since in fact we don't have free will" or "exactly like we feel now, since having or not having free will, as such, makes no difference to what it's like to be us".

Replies from: shminux
comment by Shmi (shminux) · 2013-06-20T23:43:30.347Z · LW(p) · GW(p)

I only care about answerable questions, and the first one isn't without first defining free will. The second one is about subjective experiences, and so is perfectly answerable.

Replies from: gjm
comment by gjm · 2013-06-20T23:51:11.874Z · LW(p) · GW(p)

Understood and (on the whole) agreed. But I think the question, as phrased, is liable to suggest the first at least as much as the second.

Replies from: Error
comment by Error · 2013-06-21T11:39:22.803Z · LW(p) · GW(p)

Datapoint: I first interpreted it as #1, as gjm suggests.

(And I failed to notice my confusion when the example answers didn't seem to match up well with the question. That irritates me.)

comment by Brendan · 2013-06-21T10:47:52.310Z · LW(p) · GW(p)

When I read Eliezer's posts on free will, and then spent time thinking it through myself, I came to the conclusion that the question was (non-obviously) incoherent, and that this is what Eliezer also thinks. More specifically, when you Taboo free will you find yourself trying to say that physics doesn't control your brain, rather "you" do, where "you" has to be something outside physics, which is really just "magic". I started thinking further about questions and realised that there are whole classes of questions for which neither "Yes" nor "No" is the correct response. The most basic example is a contradiction: P = "A is true & A is false".

"What would it feel if we actually had P?" "Uhh, hold on a second, that is nonsense." "I know we can't have P but what if we did! What would it feel like??"

I find this understanding incredibly valuable, because it is a traditional philosopher's downfall that they try to answer every question no matter what. You can fully explain the problem with the question yet still feel like there is a question. So the problem becomes psychological instead of philosophical and can be attacked by figuring out what our brain is doing when it tries to tackle the question.

Additionally, I think it is a Mind Projection Fallacy to ask a question like "What would infrared look/feel like if we could see it?" Stimuli don't necessitate particular perceptions. You could rewire your blue and red photoreceptors in order to feel blue, I mean, 420nm light, as red [untested]. I'm mildly confident that this can be extended to asking how free/not-free will would feel. It's likely you can run with certain aspects of the idea without going loopy, but I still think the overall idea is wobbygobby.

Another mistake is thinking that being unpredictable gives you more free will. Either you are controlled by predictable atoms or controlled by unpredictable dice-rolling mechanics, or whatever. Neither case gets you any closer to having "you" in charge. I even prefer being predictable to being randomised. That shit's crazy.

Anyway, I expect I'm going to get a lot of disagreements stemming from us having different ideas about free will. Everyone struggles to formalise from the informal in different ways, so if you see something I've said as nonsense, ask yourself if it is just that we are using the same string of characters to point to different ideas.

Replies from: None
comment by [deleted] · 2013-06-21T20:24:17.170Z · LW(p) · GW(p)

I find this understanding incredibly valuable, because it is a traditional philosopher's downfall that they try to answer every question no matter what.

Just a nitpick, but this is true of no philosopher, living or dead (by which I mean I can give you examples of every significant philosopher rejecting a question at some point). The idea that we should always keep the 'reject the question' door open is good advice, but we shouldn't frame it with some historically false claim.

comment by CAE_Jones · 2013-06-21T04:48:18.203Z · LW(p) · GW(p)

The absence of free will seems like a pretty good description of how I've felt for the past 8 years or so. "And now, I'm going to do X. ... Right? Other half of brain that hears this part giving orders? X? Any day now? You realize that not listening to me means we'll have to put up with getting yelled at by people who will Fundamental Attribution Error us into oblivion, right?" Though maybe Fundamental Attribution Error isn't quite right, since there's a strange enduring "Not much decision-making power" quality rather than the "So you think you're better than everyone and just blowing everyone off and choosing to do other things?" qualities that I've been accused of.

This always reminds me of an incident from first grade, that probably isn't relevant to the discussion, so I'll probably post a summary elsewhere.

The trouble seems to be a matter of perspective. From an outside view, with all the relevant data, an intelligent agent should be able to predict the decisions of most humans under most conditions. We have such a phrase as "out of character" for when our models of people do not match their behavior. Out of character moments are considered a sign of poor writing in fiction, and something to fear in human society. When we fail to predict someone's actions correctly, we usually try to find out what was missing from our model and update accordingly (consider reactions to every mass shooting in America in the past 20 years).

I kinda feel like, by the time "free will" has been reduced to something logical, attaching the word "free" to it ceases to be meaningful. Free from overt coercion? I wouldn't even go that far; would we claim that a financially secure middle-class member of the local most-privileged class has free will, whereas a prisoner of an abusive, manipulative overlord does not?

I find it interesting to think about how many of one's actions trace back to conscious decisions, and how many conscious decisions lead to the decided upon actions. Once college brought my inability to do what I want to the forefront, and I'd put up with people yelling nonsense about it for a few years while fumbling around in search of a solution, I wound up talking to some psychologists. One of them mentioned the emotions -> thoughts -> actions pyramid, with regards to what we can control. The general idea is that we have most control over our actions, and least over our emotions, and some over thoughts. Apparently this feels true enough for enough people that it translates into something resembling the idea of free will.

Maybe I'm drawing ridiculously large conclusions from one example, but if I had to put those three experiences into order of control, I'd say I have way more control over my thoughts than my actions. I might call it "freedom of thought" and not "freedom of choice", but "freedom of thought" is as much a misnomer as "free will". It's free because it feels like I'm doing it, not like it is the inevitable outcome of the particular arrangement of matter in my vicinity? Even though I'm often frustrated when there is a lot of noise coming from nearby, or loud conversations, etc., because it's hard not to let them influence my thoughts so as to continue thinking about what I initially wanted to think about?

Something tells me I've missed the point of all the other comments, here. :(

[edit]: Looking over the comments again, I feel like I've spent way too much time arguing about whether or not the concept of free will is meaningful. The more important part is that I've felt persistently like I am not in control of what I wind up doing for quite a while, and I can retrospectively identify similar moments from the past that I didn't think about in such terms at the time.

comment by TheOtherDave · 2013-06-20T22:56:20.006Z · LW(p) · GW(p)

Some earlier related thoughts here, in lieu of actually thinking about your question in any detail, which I might do later.

Replies from: shminux
comment by Shmi (shminux) · 2013-06-20T22:58:29.147Z · LW(p) · GW(p)

Right, I recall that it was discussed before, but could not find the link. Thank you!

comment by leplen · 2013-06-23T01:24:03.057Z · LW(p) · GW(p)

Slightly off-topic, but I don't want to start another free-will thread...

Would free-will represent an evolutionary cost?

If free will means that your decisions are not driven solely by your stimulus inputs, then it seems to me that a creature with free will is by definition less responsive to its environment than a creature without it. A creature that is less responsive to its environment should be out-competed by a creature that is more responsive to its environment, ceteris paribus.

Even assuming that free-will is possible, is it likely, or would we expect "free-will" genes to get eliminated from the gene pool?

Replies from: OccamsTaser, army1987
comment by OccamsTaser · 2013-06-23T02:25:41.889Z · LW(p) · GW(p)

If we assume being responsive to one's environment is purely advantageous (with no negative effects when taken to the extreme), then yes, it would have died out (theoretically). However, freedom to deviate creates diversity (among possibly other advantageous traits) and over-adaptation to one's environment can cause a species to "put all its eggs in one basket" and eventually become extinct.

Replies from: gwern
comment by gwern · 2013-06-23T03:36:12.237Z · LW(p) · GW(p)

However, freedom to deviate creates diversity (among possibly other advantageous traits) and over-adaptation to one's environment can cause a species to "put all its eggs in one basket" and eventually become extinct.

You seem to be ascribing magical properties to one source of randomness. What special 'diversity' is being caused by 'free will' that one couldn't get by, say, cutting back a little bit on DNA repair and error-checking mechanisms? Or by amplifying thermal noise? Or by epileptic fits?

(Bonus points: energy and resource savings. Free will and no DNA error checking, two great flavors that go great together!)

Replies from: OccamsTaser
comment by OccamsTaser · 2013-06-23T04:28:36.574Z · LW(p) · GW(p)

You seem to be ascribing magical properties to one source of randomness.

Free will is not the same as randomness.

What special 'diversity' is being caused by 'free will' that one couldn't get by, say, cutting back a little bit on DNA repair and error-checking mechanisms? Or by amplifying thermal noise? Or by epileptic fits?

Diversity that each individual agent is free to optimize.

Replies from: gwern
comment by gwern · 2013-06-23T04:34:09.247Z · LW(p) · GW(p)

Free will is not the same as randomness.

In your usage, there is nothing distinguishing free will from randomness.

Diversity that each individual agent is free to optimize.

Huh? How is your link at all related?

What freedom to optimize does 'free will' give you that a RNG or PRNG of any kind, from thermal fluctuations to ionizing or cosmic radiation, does not?

comment by A1987dM (army1987) · 2013-06-23T09:38:44.474Z · LW(p) · GW(p)

Here's a problem where deterministic agents cannot win.

Replies from: gwern
comment by gwern · 2013-06-25T00:40:46.712Z · LW(p) · GW(p)

Deterministic agents without memory or access to randomness. You also don't need free will to implement a 'mixed' strategy rather than a 'pure' strategy, as that problem calls for.
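(As a minimal sketch of that point, in Python, and assuming the linked problem is something like matching pennies against a predictor, which is my guess rather than anything stated here: a perfectly ordinary deterministic program can play a mixed strategy simply by drawing on a random source.)

```python
import secrets

def mixed_strategy_move():
    """Play heads/tails uniformly at random.

    The code itself is fully deterministic; only the draw from the
    entropy source varies. Against any predictor, the expected score
    in matching pennies is exactly 50%, which is the best any agent,
    'free' or not, can guarantee.
    """
    return "heads" if secrets.randbits(1) else "tails"

print([mixed_strategy_move() for _ in range(5)])
```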

Replies from: army1987
comment by A1987dM (army1987) · 2013-07-13T18:18:38.954Z · LW(p) · GW(p)

Given what leplen appears to mean by “free-will”, I wouldn't call any agent with access to randomness “deterministic” in this context.

comment by CronoDAS · 2013-06-21T03:41:17.265Z · LW(p) · GW(p)

Hmmm...

I suppose if I had the feeling that I could predict my own actions with certainty - as though I were able to compute my own input-output table - that would be like feeling like I didn't have free will.

I don't like introspecting, in general. It consistently makes me feel bad; my train of thought inevitably turns to the things about me that make me upset. Better to stay distracted and keep my mind on more pleasant things.

Replies from: Richard_Kennaway, shminux, Locaha
comment by Richard_Kennaway · 2013-06-21T16:48:37.119Z · LW(p) · GW(p)

I suppose if I had the feeling that I could predict my own actions with certainty - as though I were able to compute my own input-output table - that would be like feeling like I didn't have free will.

Yet people ordinarily predict their own actions all the time, quite reliably. For example, I predict that in a few minutes I will turn off the computer and go home, buying groceries at the supermarket on the way, and I already have a rough idea of what I will be doing this evening and tomorrow morning. There are factors that could interfere with this, but they rarely do, and the unpredictability comes from external sources (e.g. the supermarket is unexpectedly closed), not from me.

It is also interesting that people facing up to some hard moral choice will often, afterwards, talk in such terms as "I could not have done other than I did" (here is a random example of what I mean), or "Here I stand, I can do no other." (The attribution of the last to Luther is disputed, but whoever wrote it, it was undoubtedly written.)

I can also predict that unless your mind has gone awry by contemplating the conundrum of free will, you are not going to deliberately step in front of a bus. Are you "free" to do so when you have (I hope) compelling reasons not to?

Replies from: Richard_Kennaway, torekp
comment by Richard_Kennaway · 2013-06-21T19:13:55.401Z · LW(p) · GW(p)

For example, I predict that in a few minutes I will turn off the computer and go home, buying groceries at the supermarket on the way

And that is exactly what I did. And for my next trick, I ate because I was hungry, and tonight I will sleep when I am tired.

The sensation of free will is the experience that our acts seem to us to come out of nowhere. But they do come out of somewhere; a part of us that is inaccessible to experience. The sensation is real, but to interpret it at face value is like imagining that your head has no back because you cannot see it.

comment by torekp · 2013-06-22T19:39:08.068Z · LW(p) · GW(p)

David Velleman's concept of epistemic freedom provides a way to agree with both CronoDAS and Richard here. We can "predict" our acts in the broad sense of forming correct expectations. But we also know that we could form the opposite expectation in many cases and be correct in that case too. Last time I bought ice cream, I expected to say "chocolate" to the person behind the counter, and I did. But I could have expected to say "raspberry" instead, and if I had, that's what I'd have said.

Some prophecies are self-fulfilling. When I said "I'll have chocolate", that not only correctly predicted the outcome, but caused it as well. Self-fulfilling prophecies often allow multiple alternative prophecies, any of which will be fulfilled if made. Velleman says that intentions for immediate actions are typically self-fulfilling prophecies. There may be more to intention than that, but there is at least this much: that intentions do involve expectation, and the intention itself (and/or closely associated psychological processes) tends to bring it about.

comment by Shmi (shminux) · 2013-06-21T03:56:31.456Z · LW(p) · GW(p)

Thank you for not dodging the question with philosophical considerations!

I suppose if I had the feeling that I could predict my own actions with certainty - as though I were able to compute my own input-output table - that would be like feeling like I didn't have free will.

Interesting, so just going with the flow and not knowing what might happen next would feel like more free will to you? That seems almost like the opposite of what kalium suggests.

Replies from: CronoDAS
comment by CronoDAS · 2013-06-21T04:16:12.057Z · LW(p) · GW(p)

Interesting, so just going with the flow and not knowing what might happen next would feel like more free will to you? That seems almost like the opposite of what kalium suggests.

::follows link::

the main difference is that I would do things without a need to exert "willpower," and with less internal monologue/debate.

"Willpower" and "internal monologue/debate" seem like processes that reflect uncertainty about future actions - there's a subjective sense that it's possible that I could have chosen to do something else. I'm not sure I see any difference, really.

Replies from: DSherron
comment by DSherron · 2013-06-21T21:47:17.930Z · LW(p) · GW(p)

It's explicitly opposed to my response here. I feel like if I couldn't predict my own actions with certainty then I wouldn't have free will (more that I wouldn't have a will than that it wouldn't be free, although I tend to think that the "free" component of free will is nonsense in any case). Incidentally, how do you imagine free will working, even just in some arbitrary logically possible world? It sounds a lot like you want to posit a magical decision making component of your brain that is not fully determined by the prior state of the universe, but which also always does what "you" want it to. Non-determinism is fine, but I can't imagine how you could have the feeling of free will without making consistent choices. Wouldn't you feel weird if your decisions happened at random?

Replies from: CronoDAS
comment by CronoDAS · 2013-06-22T22:52:46.752Z · LW(p) · GW(p)

I sort of think of "agent with free will" as a model for "that complicated thing that actually does determine someone's actions, which I don't have the data and/or computational capacity to simulate perfectly." Predicting human behavior is like predicting weather, turbulent fluid flow, or any other chaotic system: you can sort of do it, but you'll start running into problems as you aim for higher and higher precision and accuracy.

Does that make any sense? (I'm not sure it does.)
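(As a toy illustration of the chaos analogy, nothing specific to free will: the logistic map, a standard chaotic system, shows how tiny errors in the initial state ruin long-range prediction even when the rule itself is perfectly known.)

```python
def logistic(x, r=4.0):
    """One step of the logistic map, which is chaotic at r = 4."""
    return r * x * (1 - x)

a, b = 0.400000, 0.400001  # two almost identical starting states
for step in range(1, 26):
    a, b = logistic(a), logistic(b)
    if step % 5 == 0:
        # The gap between the two trajectories grows rapidly.
        print(f"step {step:2d}: difference = {abs(a - b):.6f}")
```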

Replies from: DSherron
comment by DSherron · 2013-06-23T01:01:37.581Z · LW(p) · GW(p)

I don't think it's particularly meaningful to use "free will" for that instead of "difficult to predict." I mean, you don't say that weather has free will, even though you can't model it accurately. Applying the label only to humans seems a lot like trying to sneak in a connotation that wasn't part of the technical definition. I think that your concept captures some of the real-world uses of the term "free will" but that it doesn't capture enough of the usage to help deal with the confusion around it. In particular, your definition would mean that weather has free will, which is a phrase I wouldn't be surprised to hear in colloquial English but doesn't seem to be talking about the same thing that philosophers want to debate.

Replies from: CronoDAS
comment by CronoDAS · 2013-06-23T01:48:39.077Z · LW(p) · GW(p)

I don't mean to imply that being difficult to predict is a sufficient condition for having free will... I'm kind of confused about this myself.

comment by Locaha · 2013-06-21T16:48:43.187Z · LW(p) · GW(p)

Hmmm....

Anytime you decide to do something and then act upon that decision, couldn't you say you predicted your own action?

I'm going to move my hand!

*moves hand*

Ha! No free will!

Replies from: CronoDAS
comment by CronoDAS · 2013-06-22T22:44:06.143Z · LW(p) · GW(p)

Well, yes, sometimes I can predict fairly accurately. Other times, it's harder.

comment by defectbot · 2013-06-21T01:14:29.139Z · LW(p) · GW(p)

Perhaps we should also ask "Why do we feel we have free will?" The simplest answer, of course, is that we actually do, although it wouldn't be beyond the scope of human biases to believe that we do even if we don't. Ultimately, if we were certain that we couldn't feel more like we have free will than we already do, then we could reduce the question "Do we have free will?" to "Would someone without free will feel any differently than we do?"

Replies from: DSherron
comment by DSherron · 2013-06-21T21:59:22.805Z · LW(p) · GW(p)

Taboo "free will" and then defend that the simplest answer is that we have it. X being true is weakly correlated to us believing X, where belief in X is an intuition rather than a conclusion from strong evidence.

comment by JQuinton · 2013-06-25T15:34:46.290Z · LW(p) · GW(p)

I really can't imagine any subjective difference between having free will and not having free will. For example, "Observing myself act in ways I never intended to act" seems to be the conception we have of free will from video games/movies/etc. (e.g. in Dragon's Dogma, your pawns can get possessed by a dragon and then they run towards you trying to kill you while telling you that they're not in control of themselves) but I don't think we should generalize from fictional evidence.

comment by Richard_Kennaway · 2013-06-22T18:19:57.490Z · LW(p) · GW(p)

"what would it feel like to have/not have free will?"

I imagine the stereotypical idea of hypnosis: the subject finds themselves unable to resist the suggestions implanted by the hypnotist.

I do not know if this phenomenon actually exists.

Replies from: shminux
comment by Shmi (shminux) · 2013-06-22T20:28:56.323Z · LW(p) · GW(p)

I'm pretty sure it does, and yes, it would probably feel like not having free will to me.

comment by DSherron · 2013-06-21T17:38:38.820Z · LW(p) · GW(p)

I suspect that a quick summary of people's viewpoints on free will itself would help in interpreting at least some answers. In my case, I believe that we don't have "free will" in the naive sense that our intuitions tend to imply (the concept is incoherent). However, I do believe that we feel like we have free will for specific reasons, such that I can identify some situations that would make me feel as though I didn't have it. So, not actually having free will doesn't constrain experience, but feeling like I don't does.

Epistemically:

If I discovered that I was unpredictable even in principle; if randomness played a large role in my thought process, and I sometimes gave different outputs for the same inputs, then I would feel like I did not have free will.

Psychologically:

I have no consistent internal narrative to my actions. On reflection I discover that I could not predict my actions in advance, and merely rationalized them later. I notice that my actions do not tend to fulfill my preferences (this one happens in real life to varying degrees). I notice that I act in ways that go against what I wanted at the time.

Physically:

None. I am tempted to say that losing complete control of my body constitutes a loss of free will, but in reality it seems to reflect more closely that my will cannot be executed, not that I don't have it (or feel like I have it).

Note: much of this is also heavily tied into my identity. It would be interesting to examine how interlinked the feelings of identity and free will really are.

comment by Ronak · 2013-06-21T16:02:07.202Z · LW(p) · GW(p)

Epistemic:
-> finding out that I can't, even given an exponentially bigger universe to compute in, be predicted.
It would also potentially destroy my sense of identity. Even if I can be predicted, I can do anything I want: it's just that what I'll want is constrained. However, if the converse is true, any want I feel has nothing to do with me (and my intuitive sense of identity is similar to 'something that generates the wants and thoughts that I have') and I'm not sure I'll feel particularly obliged to satisfy them.
(Warning: writing it out made it sound to me like post-hoc rationalisation. But far as I can make out, I believe it.)

-> finding out that I'm being controlled by direct neurological tinkering or very thorough manipulation.
Manipulation to some degree is very common, and the line between influence and manipulation is not very clean, but there is definitely an amount of manipulation that will make me go, 'no free will.' Basically, if you can make it clear that my wants are being constrained by intentions.

Psychological and physical: I find it hard to come up with anything that won't look normal. If my wants are being constrained, it'll just feel like stronger wants acting contrariwise. However, I can imagine something ineffable keeping me away from certain thoughts and wants... but that happens anyway too - Yudkowsky's written shitloads about that feeling (also, in a very different context, me).
Physically, involuntary motions happen all the time, and often I can't move quite the way I want to. Physical constraints, I dismiss them as.

comment by Brillyant · 2013-06-21T15:36:29.817Z · LW(p) · GW(p)

Well, since I suspect I don't have free will, I imagine not having free will would feel a great deal like how I feel right at this moment.

comment by ChristianKl · 2013-06-21T09:20:37.843Z · LW(p) · GW(p)

Knowing that someone out there already predicts my behavior perfectly

The halting problem shows that perfect prediction is impossible if you don't simulate the whole system. On the other hand there are plenty of cases where pretty good prediction is possible.

Observing myself act in ways I never intended to act, whether beneficial to me or not

When it comes to acting in ways you don't intend to, flirting is a good example. I remember an experience where I was sitting next to a beautiful girl in a lecture.

We had good "chemistry". One time I looked at her breast and she immediately touched my arm and then automatically said "sorry". She had no conscious knowledge of the fact that she touched me in response to my looking at her breast, so she excused her behavior.

I also caught myself one time touching her and immediately excusing myself, where I didn't consciously intend to touch her.

I don't think that such actions we take outside of our conscious awareness mean that we don't have free will.

Observing my arms/legs/mouth move as if externally controlled, and being unable to interfere

If you cut the nerves that control my arm and put in electrodes to control it, I don't think that would mean that I lack free will.

When it comes to psychological control, part of free will is the ability to give up control over your arm. I get more annoyed by my personal inability to give up some forms of control, as I would see that as giving up free will.

comment by kalium · 2013-06-21T03:44:00.503Z · LW(p) · GW(p)

When I try to imagine what it would be like to feel a specific absence of free will, the main difference is that I would do things without a need to exert "willpower," and with less internal monologue/debate.

comment by [deleted] · 2013-06-21T01:53:56.132Z · LW(p) · GW(p)

Infants appear to have a mental life that is significantly different from children or adults. They may not have what a child or an adult would call free will, but they lack the means to tell us and they forget what it was like as they become children and adults.

Infants, children and adults who are asleep appear to have a mental life that is significantly different from that when they are awake. People who are asleep may not have what a person who is awake would call free will, but people who are asleep lack the means to tell us and they forget what it was like as they wake up. People who sleep uncontrollably (due to narcolepsy, being drugged or for other reasons) might have even less free will during that time.

People who take drugs inducing psychosis, again, appear to have a mental life that might not be what a straight person would call free will. Some mental illnesses involving compulsion, same thing. And again, people on drugs or with mental illnesses may not be able to convey to others what it feels like.

People with brain damage or especially low IQ - same.

So in each of these cases, experiencing a lack of free will includes the inability to convey what it is like. Thus, it would feel like not being able to convey what it is like. Which I'd guess might be frustrating or sad.

Replies from: kalium
comment by kalium · 2013-06-21T04:15:18.755Z · LW(p) · GW(p)

Drug-induced mental states may be hard to describe, but people try pretty hard. Below are a few quotes from Erowid.org "experience reports" in which people describe not feeling free will. I don't see much of a common thread between them.

It is a very zen thing; everything seems so simple and just so. The beautiful, illusory nature of ego consciousness was just so obvious, so plain to see and easy to understand. In the absence of time, the paradox of free will and determinism vanishes. Life is a wonderful game, a grand, extraordinary drama and although we tend to get overly caught up in our roles, that's exactly what it's all about. The forgetting and the remembering, the getting lost and the coming home, over and over again.

I can’t remember what I else I tried, but every time everything would happen just the way it happened before, and I couldn’t change anything. It was like I (as God) was always one step ahead of me. Each time the universe was created, everything happened exactly the same, exactly the way it was supposed to happen.

My place in the universe is like that of a water molecule bouncing between sticks and boulders in a mountain stream. I don't know where we're going, I don't know why there's this force 'gravity' that pulls me and influences my motion, and I'm not sure I have any free will in choosing which boulder I will interact with, but for some reason, I don't really care. If this is what the universe is, then I will gladly partake.

The trip then became some sort of awful parody of my own existence, taking on a dark and demonic funhouse sort of feeling, where everything was distorted and mutated in some sort of horrifyingly comedic way. I looked at K and said 'dude, what the fuck?...' and pointed at K, then he pointed at P and P made a strange dismissive sort of gesture and then I said 'dude, what the fuck?...' and pointed at K and he pointed at P and, horrifyingly, the cycle continued. I found myself stuck in an incredibly long timeloop and I thought myself that, if I didn't play my part in the loop, it would end, but every time it came around to my turn to make my gesture and, thus, continue the loop, I would find myself pointing at K. I had somehow lost my free will it was horrifying. I truly believed I would be stuck in this loop eternally, but something strange happened: K's third eye opened up and he bent down and vomited out his consciousness.

I got up and paced around for a while, feeling as if I were outside of my body, yet still able to direct it, almost unconsciously. It was as if my free will was somehow usurped, yet I was still more or less in control, it is hard to describe that feeling.

comment by Shmi (shminux) · 2013-06-29T07:59:25.084Z · LW(p) · GW(p)

Here is one from reddit, tangentially related:

Jean-Paul Sartre is sitting at a French cafe, revising his draft of Being and Nothingness. He says to the waitress, "I'd like a cup of coffee, please, with no cream." The waitress replies, "I'm sorry, Monsieur, but we're out of cream. How about with no milk?"

comment by elharo · 2013-06-27T11:07:53.927Z · LW(p) · GW(p)

Is there a standard definition of "free will" that everyone is assuming, and for some reason I don't know? If so could someone please point to or summarize it?

I confess that the more I try to think about this and related questions, the more I find myself confused. At this point I'm not even sure what the question is. "How would not having free will feel to you?" is a grammatically correct English sentence; but it seems no more meaningful to me than a sentence such as "How would not having open feet feel to you?"

That is, I am not sure what we are even talking about here. What is the difference between free will and will?

Replies from: shminux
comment by Shmi (shminux) · 2013-06-27T16:45:44.361Z · LW(p) · GW(p)

Is there a standard definition of "free will" that everyone is assuming, and for some reason I don't know?

I think you nailed it, actually: there is none. The whole point of my post was to highlight this problem with the definition. Because having free will feels so natural and intuitive to most people, they tend to assume that everyone means basically the same thing by it, even if they have trouble spelling it out. But the complement of free will is not intuitive at all, and requires some thinking and introspection. As you can see from the replies, the results are all over the map. Some feel that the complement of free will feels indistinguishable from free will, meaning that the whole term is vacuous for them. Others think that the lack of free will feels like knowing what you will do and being unable to change it. Or even wanting to change it. Yet others feel that not feeling the connection between your thoughts and actions feels like lacking free will. Some argue that my tentative classification in the OP is flawed to begin with.

That's my whole issue with the free will debates: people think that they are talking about the same thing, but they are not, so the debate is pointless until at least some basic definitions are agreed upon.

comment by mwengler · 2013-06-24T15:09:47.459Z · LW(p) · GW(p)

Free will is subtler than most of the suggested points in the post would suggest.

Randomness. If we can act randomly, that hardly proves or even suggests free will. My determinism could be probabilistic, resulting in things like "in this case I will take action A 20% of the time and action B 80% of the time." It is easy enough to implement something like this in simple computer code, although the objection might be made that it is a pseudorandom number generator we use to roll the dice. However, a random number generator can be built by measuring, to picovolt accuracy, the voltage across a room-temperature resistor; in that case it is the Brownian motion of a bunch of electrons interacting with all sorts of other things around the resistor that produces the random number. This is of the same order of "true" randomness as actually rolling a fair die would be.
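(A minimal sketch in Python of the kind of thing I mean, with os.urandom standing in for the resistor-noise generator; the 20/80 split is just the example above.)

```python
import os

def choose_action(p_a=0.20):
    """Take action A with probability p_a, otherwise action B.

    The entropy comes from os.urandom here, a stand-in for a hardware
    noise source such as the resistor-voltage generator described
    above; a pseudorandom generator would look identical to the caller.
    """
    r = int.from_bytes(os.urandom(8), "big") / 2**64  # uniform in [0, 1)
    return "A" if r < p_a else "B"

# The policy is fixed and fully specified, yet individual outcomes are
# unpredictable without access to the noise itself.
print([choose_action() for _ in range(10)])
```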

Predictability. Even something which is completely deterministic may require a full detail "simulation" to predict fully. I believe this is the point of bringing up the halting problem in other posts. And the predictor would need to simulate a large chunk of the entire universe including all the 10^23! degrees of freedom of whatever random number generator we might be using to generate probabilistic answers.

The feeling that "I" am controlled by forces or consciousnesses outside of "me" wouldn't prove a lack of free will. My "chooser" might not be completely 1:1 and onto with my "consciousness." Then there would be things that surprised "me" and felt out of "my" control even though the locus of agency was completely within me.

For a long time I believed, very strongly, that I had free will. I thought: why would I even discuss the issue with someone who didn't think they had free will? After all, they were just arguing with me because they had to, not out of a choice about belief.

Now I don't think I have free will. Randomness doesn't make free will. Quantum uncertainty isn't a guarantee of free will by any means, just a guarantee of a kind of randomness that might be particularly difficult or even impossible to predict. Our only hope for "free will" is a fleeting one: that there is a "physics" of consciousness in which the particles are conscious and the rules of interaction are called will. That just turns the magic into physics, and makes me wonder why we would call rule-described interactions in this new physical sphere any freer than Newton's laws of billiard balls in the presence of thermal noise.

Replies from: mwengler
comment by mwengler · 2013-06-24T15:11:33.484Z · LW(p) · GW(p)

I would add: I feel much worse since losing my "faith" in free will. And I may even be behaving more poorly. At least my language describing my behavior makes me sound like a worse guy, more amoral, more calculating. Of course it may just be my signalling that has gone downhill. It could be that we need free will so that we can signal to other humans that we are willing to drink the same kool-aid they drink, an important thing to know when taking collective action.

Replies from: None, Richard_Kennaway, TheOtherDave
comment by [deleted] · 2013-06-24T16:49:42.785Z · LW(p) · GW(p)

At least my language describing my behavior makes me sound like a worse guy, more amoral, more calculating.

This seems like a strange result. Being a 'worse guy', 'calculating' at least, and 'being amoral' if we regard that as a privative term, are all terms we would use only for things we generally treat as having free will. No one talks this way about things we all accept are governed entirely by physical law, like rocks.

comment by Richard_Kennaway · 2013-06-24T22:04:06.477Z · LW(p) · GW(p)

A Christian priest might say that you are in mortal danger of losing your soul.

A Buddhist priest might tell this story.

comment by TheOtherDave · 2013-06-24T16:11:16.685Z · LW(p) · GW(p)

Nothing obligates you to describe your behavior to others in language that makes them reluctant to trust you, even if you're a compatibilist. Admittedly, if you genuinely aren't trustworthy, then doing so is in others' best interests, since it causes them not to trust you... but if you're motivated to act in others' best interests in the first place, it's not clear to me in what sense you aren't trustworthy.

OTOH, if we're talking about the way you describe yourself to yourself, it may be worth asking whether your "worse-guy" self-description is more or less accurate than the less amoral, less calculating, better guy you previously made yourself sound like.

comment by ChristianKl · 2013-06-21T08:45:46.172Z · LW(p) · GW(p)

Knowing that I am in a simulation

How would you rephrase that if we taboo simulation?

Replies from: shminux
comment by Shmi (shminux) · 2013-06-21T16:02:46.770Z · LW(p) · GW(p)

Proving something like this?

comment by brilee · 2013-06-20T23:52:00.377Z · LW(p) · GW(p)

Your comments seem like they answer a slightly different question: "What would it feel like for a person who has free will to not have free will?" The right question is, "What would it feel like for a person who doesn't have free will to not have free will?" (brushing all concerns about what 'free will' is under the carpet for now)

Replies from: shminux
comment by Shmi (shminux) · 2013-06-21T00:06:26.814Z · LW(p) · GW(p)

I don't mean either of those. You may have a feeling of having or not having free will regardless of what some future agreed upon definition of free will might turn out to be. I'm asking what (their own) subjective experiences would people classify as "feeling of not having free will". TheOtherDave linked to his personal experience, which seems to match at least one on my list. I make no assertions of whether he does or does not actually have free will, and neither does he. In fact, I don't believe that a reasonable definition of free will can be made without people first agreeing on the answers to the question I asked.

comment by aelephant · 2013-06-20T22:44:47.890Z · LW(p) · GW(p)

Another question that occurred to me is, "Should we try to feel as if we don't have free will?" It seems like people would behave differently if they felt as if they didn't have free will; they would act less responsibly. So even if it is true that we do not have free will, might it not be better for philosophers to convince people that we do?

Replies from: torekp
comment by torekp · 2013-06-22T19:50:50.966Z · LW(p) · GW(p)

You are correct about the free-will-belief and responsibility connection:

Prior to taking the math test, half the group (15 participants) were asked to read the following passage from Francis Crick’s book The Astonishing Hypothesis (Scribner):

‘You,’ your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. Who you are is nothing but a pack of neurons … although we appear to have free will, in fact, our choices have already been predetermined for us and we cannot change that.

In contrast, the other 15 participants read a different passage from the same book, but one in which Crick makes no mention of free will. And, rather amazingly, when given the opportunity this second group of people cheated significantly less on the math test than those who read Crick’s free-will-as-illusion passage above.

There have been a number of similar studies, and the result holds up. Meanwhile, experimental philosophers have done a fair amount of work to understand what the general public understands "free will" to mean. There are some resources there that have some bearing on this thread.

comment by Decius · 2013-06-21T03:00:48.014Z · LW(p) · GW(p)

On Monday, I have free will.
On Tuesday, I don't have free will.
I feel better on Tuesday, because I choose to get up very early on Monday to go to work and on Tuesday I have to sleep in.

I discount any effect by which someone else seems to control me as dissonance or terror; I define "me" to be the entity which controls me, which need not be the same as the entity which I might think perceives that control. (Terror comes from the thought that I might end up concluding that I am just a perception and cognition engine, without the interaction engine to alter or explore the world independently... or rather, that "I" am not such a thing, but that the perception and cognition isn't actually "me", and that terrifies it. I have the strangest waking nightmares.)

comment by ThrustVectoring · 2013-06-21T01:08:24.481Z · LW(p) · GW(p)

Not having free will feels exactly like how everything feels like right now. Having free will is not well-defined enough to feel like anything.