Comments
A lot of words, but you don't grapple with the hard problem of consciousness. When I look at the sun, how can you know I feel/see the same thing as you? Yes, I'll use words like 'yellow', 'warm', 'bright', etc., because we've been taught those labels apply to what we're experiencing. But that says nothing about whether my experience is the same as yours.
But they're all just words, and they suffer from the same problem. What if an AI says the right words?
But if I said 'oh, that sunset makes me feel so warm and fuzzy', yes, I'm using the same words you might use, but how can you know they point to the same subjective experience? You say 'describing', but describing relies on a set of words, and do those words point to the same thing in everyone?
I'm glad you've read it. It occurred to me yesterday that as modern fields (computer science, rationality studies, etc.) begin to deal with consciousness, they'll reinvent the wheel, when other fields (e.g. philosophy, psychology) have been thinking about this for a while.
Putting bats aside, what could convince you that my subjective experience (I promise you I'm a human) is substantially similar to yours?
Unfortunately, the more we write about consciousness, the further we seem to get from confronting the hard problem of consciousness. Has the author read Nagel's "What Is It Like to Be a Bat?"
I haven't been to a rationality dojo before but would like to attend. Do we need to register or anything?