ike's Shortform
post by ike · 2019-09-01T18:48:35.461Z · LW · GW · 11 comments
Comments sorted by top scores.
comment by ike · 2020-12-31T17:35:39.751Z · LW(p) · GW(p)
The other day a piece fell off one side of my glasses (the part that rests on the nose).
The glasses stay on, but I've noticed a weird feeling of imbalance at times. I could be imagining it; I seem to function normally. But the obvious analogy is to cinematography: directors consciously adjust camera angles and framing to induce particular emotions or reactions to a scene. It's plausible that even a very slight asymmetry in your vision can affect you.
If this is true, might there be other low hanging fruit for adjusting your perception to increase focus?
comment by ike · 2019-09-01T18:48:35.747Z · LW(p) · GW(p)
Does the anti-p-zombie argument imply you can't simulate humans past some level of fidelity without producing qualia/consciousness?
Or is there a coherent position whereby p-zombies are impossible but arbitrarily accurate simulations that aren't conscious are possible?
Replies from: jimrandomh, TAG, Pattern
↑ comment by jimrandomh · 2019-09-10T01:39:50.959Z · LW(p) · GW(p)
Yes, it implies that. The exact level of fidelity required is less straightforward: it's clear that a perfect simulation must have qualia/consciousness, but with small imperfections the argument no longer holds, so to determine whether an imperfect simulation is conscious we'd have to grapple with the even harder problems of neuroscience.
Replies from: ike
↑ comment by ike · 2019-09-11T22:46:17.762Z · LW(p) · GW(p)
How does it imply that?
I have intuitions on both sides. The intuition against is that predicting the outcome of a process can be done without anything isomorphic to the individual steps in that process. It seems plausible (or at the very least, possible and coherent) for humans to be predictable, even perfectly, without there being something isomorphic to a human. But a perfect predictor would count as an arbitrarily accurate simulation.
↑ comment by Pattern · 2019-09-01T21:57:35.938Z · LW(p) · GW(p)
The argument might have been "if qualia exist, then they probably have observable effects - you without qualia would be different from you with qualia".
Replies from: ike
↑ comment by ike · 2019-09-02T00:01:28.410Z · LW(p) · GW(p)
But obviously a simulation of you differs in some respects from you in reality. It's not obvious that the argument carries over.
Replies from: Pattern
↑ comment by Pattern · 2019-09-02T04:52:12.447Z · LW(p) · GW(p)
1) What aspects?
2) You are assuming qualia exist?
Replies from: ike
↑ comment by ike · 2019-09-02T14:39:39.174Z · LW(p) · GW(p)
Causality is different, for one. The real you has a causal structure where future actions are caused by your present state plus some inputs. The simulated you has a causal structure where actions are, to some extent, caused by the simulator.
I'm not really assuming that. My question is whether there's a coherent position on which humans are conscious, p-zombie humans are impossible, but simulations can be high fidelity yet not conscious.
I'm not asking if it's true, just whether the standard argument against p-zombies rules this out as well.
Replies from: aleph_four
↑ comment by aleph_four · 2019-09-07T00:18:57.335Z · LW(p) · GW(p)
Well, if qualia aren't epiphenomenal, then an accurate simulation must include them or deviate into error. Claiming that you could accurately simulate a human but leave out consciousness is just the p-zombie argument in different robes.