Do you believe in consciousness?
post by natural_number · 2010-10-03T12:10:17.662Z · LW · GW · Legacy · 10 comments
Do you believe in consciousness?
If you do, what exactly would you define it as?
And what evidence do you have for its existence?
Comments sorted by top scores.
comment by Scott Alexander (Yvain) · 2010-10-03T15:47:31.752Z · LW(p) · GW(p)
I feel like I am conscious. If consciousness doesn't exist, then there must be some sort of illusion of consciousness to make me feel like it does. I see no reason not to call this illusion of consciousness "consciousness", since it fully explains what we mean when we say "consciousness".
Therefore, consciousness exists.
This sounds like some sort of horrible misuse of Dark Arts, and it would be if you plugged in any other term (for example, "God"), but since "consciousness" has no properties except "a certain experience people perceive themselves as having", whatever it is that makes them perceive that experience satisfies the definition of "consciousness".
This is not an explanation of consciousness or at all helpful in the formal study of consciousness, but it's why I don't think "consciousness doesn't exist" solves anything or is particularly useful. Okay, fine, it doesn't exist, now there's still exactly the same amount of mysterious experience to explain as before.
comment by Emile · 2010-10-03T17:38:19.014Z · LW(p) · GW(p)
Do you believe in consciousness?
If you do, what exactly would you define it as?
What's the point of asking whether I believe in X if I'm allowed to come up with my own definition for X?
Say you're talking to people who just came back from exploring the wilderness. Isn't your question equivalent to asking them "Is there an item on your map labeled 'Mount Snowtop'"? People can come up with whatever labels they want for their map; the most important issue is how well the labels reflect the territory. Having labels that 'carve reality at the joints' (i.e. similar labels for similar things and different labels for different things) is a nice second. Whether any label has a specific name like 'consciousness' is way less important.
Replies from: natural_number
↑ comment by natural_number · 2010-10-03T17:42:58.630Z · LW(p) · GW(p)
What's the point of asking whether I believe in X if I'm allowed to come up with my own definition for X?
You use your own definition of consciousness, the one you used before reading this post, not some bizarre definition invented just to exploit some technical hole in the question.
comment by David_Allen · 2010-10-03T19:19:44.988Z · LW(p) · GW(p)
Do you believe in consciousness? If you do, what exactly would you define it as? And what evidence do you have for its existence?
Why are you asking this group instead of performing your own research?
There is plenty of online literature on this topic. When you ask such open-ended questions with no supporting material, your questioning comes across as parasitic.
Instead of answering your questions, I'll suggest a way of thinking about consciousness that you might be able to use to direct your research.
I'll start with the context principle: context creates meaning, and in its absence there is no meaning. So to identify consciousness, you must adopt the right context. Also, different contexts may result in different meanings of consciousness. You should not just identify different definitions of consciousness; you should also identify which contexts they are valid within.
The ideal would be to find the context that allows you to define the essential entities and relationships of consciousness. This will give you a domain model for consciousness. This domain model is an abstraction layer, hiding its implementation details. This abstraction will almost certainly be leaky, which would help explain the origin of qualia as lower-level abstractions leaking through to the consciousness abstraction.
The separation of the consciousness domain model from its context will be tricky. For example, the qualia of the color red is a lower-level abstraction than consciousness. The label "red" for this qualia is culturally dependent and is a higher-level abstraction than consciousness. Somewhere in the middle is our ability to "think" about red and its qualia in a conscious way.
comment by NihilCredo · 2010-10-04T03:31:58.163Z · LW(p) · GW(p)
While we wait for a better understanding of related subjects, treating "consciousness" as a verbose synonym for "I" has served me quite well.
comment by JohnDavidBustard · 2010-10-03T15:29:44.339Z · LW(p) · GW(p)
Each time a question like this comes up, it seems to get downvoted as a bad question. I think it's a great question, just one for which there are no obviously satisfactory answers. Dennett's approach seems to be to say: if you just word things differently, it's all fine, nothing to see here. But to me this is a weird way of avoiding the question.
We feel there is a difference between living things and inanimate ones. We believe that other people and some animals are feeling things that are similar to the feelings we have. Many people would find it absurd to think that devices or machines were feeling anything. Yet whatever computational model of our minds we create, it is hard to identify the point at which it starts to feel. It is easy to create a virtual character that appears to feel, but most people doubt that it is doing any more than simulating feelings, similar to the inauthentic patterns of behaviour we form when we are acting or lying. One can imagine what life would feel like if we were constantly acting, performing reasoned interactions without sincere emotion; if at heart we are computational, why does all interaction not feel this way?
To me this distinction is what makes consciousness distinct and special. I think it is a fascinating consequence of a certain pattern of interacting systems, implying that conscious feelings occur all over the place; perhaps every feedback system is feeling something.
My justification for this theory is an attempt to provide a simple explanation of the origin of conscious experience, based on a belief that explanations should be simple and lack special cases (I don't find the idea that human beings are fundamentally distinct from other structures particularly elegant).
Replies from: humpolec
↑ comment by humpolec · 2010-10-03T16:48:14.294Z · LW(p) · GW(p)
To me this distinction is what makes consciousness distinct and special. I think it is a fascinating consequence of a certain pattern of interacting systems, implying that conscious feelings occur all over the place; perhaps every feedback system is feeling something.
This sounds like the point Pinker makes in How the Mind Works - that apart from the problem of consciousness, concepts like "thinking" and "knowing" and "talking" are actually very simple:
(...) Ryle and other philosophers argued that mentalistic terms such as "beliefs," "desires," and "images" are meaningless and come from sloppy misunderstandings of language, as if someone heard the expression "for Pete's sake" and went around looking for Pete. Simpatico behaviorist psychologists claimed that these invisible entities were as unscientific as the Tooth Fairy and tried to ban them from psychology.
And then along came computers: fairy-free, fully exorcised hunks of metal that could not be explained without the full lexicon of mentalistic taboo words. "Why isn't my computer printing?" "Because the program doesn't know you replaced your dot-matrix printer with a laser printer. It still thinks it is talking to the dot-matrix and is trying to print the document by asking the printer to acknowledge its message. But the printer doesn't understand the message; it's ignoring it because it expects its input to begin with '%!' The program refuses to give up control while it polls the printer, so you have to get the attention of the monitor so that it can wrest control back from the program. Once the program learns what printer is connected to it, they can communicate." The more complex the system and the more expert the users, the more their technical conversation sounds like the plot of a soap opera.
Behaviorist philosophers would insist that this is all just loose talk. The machines aren't really understanding or trying anything, they would say; the observers are just being careless in their choice of words and are in danger of being seduced into grave conceptual errors. Now, what is wrong with this picture? The philosophers are accusing the computer scientists of fuzzy thinking? A computer is the most legalistic, persnickety, hard-nosed, unforgiving demander of precision and explicitness in the universe. From the accusation you'd think it was the befuddled computer scientists who call a philosopher when their computer stops working rather than the other way around. A better explanation is that computation has finally demystified mentalistic terms. Beliefs are inscriptions in memory, desires are goal inscriptions, thinking is computation, perceptions are inscriptions triggered by sensors, trying is executing operations triggered by a goal.
comment by humpolec · 2010-10-03T14:44:46.572Z · LW(p) · GW(p)
Badly formulated question. I think "consciousness" as subjective experience/the ability to introspect/etc. is a concept we all intuitively know (from one example, but still...) and more or less agree on. Do you believe in the color red?
What's under discussion is whether that intuitive concept can be mapped to a specific property, and on what level. Assuming that is the question, I believe a mathematical structure (algorithm?) could be meaningfully called conscious or not conscious.
However, I wouldn't be surprised if it could be "dissolved" into some more specific, more useful properties, making the original concept appear too simplistic (I believe Dennett said something like this in Consciousness Explained).
Saying that "what we perceive as consciousness" has to exist by itself as a real (epiphenomenal) thing seems just silly to me. But then again I probably should read some Chalmers to understand the zombist side more clearly.
comment by SearbyMason · 2010-10-03T22:59:23.082Z · LW(p) · GW(p)
Gosh... some folks here are really down on your question, n_n. It was as if your question somehow offended them. Or perhaps it was like salt in a wound. Looking at consciousness can be scary; we can all be a bit defensive when we are frightened. The online Oxford Dictionary defines "belief" as 'an acceptance that something exists or is true, especially one without proof'. So belief is not a word I would use regarding my experience of consciousness. That experience is one of a suite of feelings, all connected but fluid and variable. So an 'exact definition' is not easy, but this works for me: 'I am a location of unexpected feelings'. No shareable proof, sorry. After all, I might be a butterfly dreaming I am a human being. But we seem to get by without proof. If anyone asks me, I say you can find proof by looking inside, at your own feelings; be your own lab; replicate what others say they have found by finding it yourself. Good luck.