Defining causal isomorphism
post by badtheatre · 2013-12-14T18:39:40.101Z · LW · GW · Legacy · 16 comments
I previously posted this question in another discussion, but it didn't get any replies, so now that I have enough karma I've decided to make it my first "article".
This brings up something that has been on my mind for a long time. What are the necessary and sufficient conditions for two computations to be (homeo?)morphic? This could mean a lot of things, but specifically I'd like to capture the notion of being able to contain a consciousness, so what I'm asking is: what would we have to prove in order to say "program A contains a consciousness --> program B contains a consciousness"? "Pointwise" isomorphism, if you're saying what I think, seems too strict. On the other hand, allowing any invertible function to be a _morphism doesn't seem strict enough. For one thing, we can put any reversible computation in 1-1 correspondence with a program that merely stores a copy of the first program's initial state and ticks off the natural numbers. Restricting our functions by, say, resource complexity also seems to lead to both similar and unrelated issues...
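To make that last counterexample concrete, here's a minimal sketch (toy Python of my own, purely illustrative): "program A" actually runs a reversible step function, "program B" just stores the initial state and ticks a counter, and a simple decoding map puts B's states in 1-1 correspondence with A's.

    def step(state):
        # One reversible update step (toy example: cyclically rotate a tuple).
        return state[1:] + state[:1]

    def run_directly(s0, t):
        # "Program A": actually run the reversible computation for t steps.
        s = s0
        for _ in range(t):
            s = step(s)
        return s

    def counter_program(s0, t):
        # "Program B": store a copy of the initial state and tick off numbers.
        # Its state at time t is just the pair (s0, t).
        return (s0, t)

    def decode(b_state):
        # Invertible map from B's states onto A's states: (s0, t) -> step^t(s0).
        s0, t = b_state
        return run_directly(s0, t)

    # The two histories line up step for step under decode, even though
    # B never "does" the computation.
    s0 = (1, 2, 3, 4)
    assert all(decode(counter_program(s0, t)) == run_directly(s0, t) for t in range(8))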
Any takers?
16 comments
comment by kgalias · 2013-12-14T19:35:23.800Z · LW(p) · GW(p)
On the other hand, allowing any invertible function to be a _morphism doesn't seem strict enough. For one thing we can put any reversible computation in 1-1 correspondence with a program that merely stores a copy of the initial state of the first program and ticks off the natural numbers.
I don't understand why this is a counterexample.
↑ comment by badtheatre · 2013-12-14T20:24:12.824Z · LW(p) · GW(p)
Neither do I, but my intuition suggests that a static copy of a brain (or of the software necessary to emulate it), plus a counter, wouldn't cause that brain to experience consciousness, whereas actually running the simulation as a reversible computation would...
↑ comment by kgalias · 2013-12-14T22:38:16.771Z · LW(p) · GW(p)
Can you provide some more background? What is a morphism of computations?
↑ comment by badtheatre · 2013-12-15T00:08:24.158Z · LW(p) · GW(p)
Those are basically the two questions I want answers to. In the thread I originally posted in, Eliezer refers to "pointwise causal isomorphism":
Given an extremely-high-resolution em with verified pointwise causal isomorphism (that is, it has been verified that emulated synaptic compartments are behaving like biological synaptic compartments to the limits of detection) and verified surface correspondence (the person emulated says they can't internally detect any difference) then my probability of consciousness is essentially "top", i.e. I would not bother to think about alternative hypotheses because the probability would be low enough to fall off the radar of things I should think about. Do you spend a lot of time worrying that maybe a brain made out of gold would be conscious even though your biological brain isn't?
We could similarly define a pointwise isomorphism between computations A and B. I think I could come up with a formal definition, but what I want to know is: under what conditions is computation A simulated by computation B, so that if computation A is emulating a brain and we all agree that it contains a consciousness, we can be sure that B does as well.
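One candidate formalisation, offered only as a sketch (toy Python, my own names, and it checks a finite sample of states rather than quantifying over all reachable ones): a pointwise isomorphism from A to B is an injective map phi on A's states that commutes with the two step functions.

    def is_pointwise_isomorphism(phi, states_a, step_a, step_b):
        # phi should be injective on the sampled A-states...
        if len({phi(s) for s in states_a}) != len(set(states_a)):
            return False
        # ...and should commute with the dynamics: phi(step_a(s)) == step_b(phi(s)).
        return all(phi(step_a(s)) == step_b(phi(s)) for s in states_a)

    # Toy example: a mod-4 counter and a renamed copy of it.
    step_a = lambda i: (i + 1) % 4
    step_b = lambda name: "s%d" % ((int(name[1]) + 1) % 4)
    phi = lambda i: "s%d" % i
    assert is_pointwise_isomorphism(phi, range(4), step_a, step_b)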
comment by Shmi (shminux) · 2013-12-15T00:10:42.463Z · LW(p) · GW(p)
(homeo?)morphic?
You probably mean homomorphism, unless you really mean a continuous (in some sense) invertible transformation between the two programs.
Anyway, the definition of homomorphism is a "structure-preserving map", so you need to figure out what "structure of consciousness" even means.
To start small, you might want to define the term "structure" for some simple algorithm. For example, do two different programs outputting the first 10 natural numbers have the same structure? What if one prints them and the other uses TTS? Does it matter what language the numbers are in? What about two programs, one printing the first ten numbers and the other the second ten? Can you come up with more examples?
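A minimal illustration of the kind of pair being asked about (hypothetical Python; the first two produce identical output by different mechanisms, the third outputs the same numbers in another language):

    def program_1():
        # Print the first ten natural numbers, one print call per number.
        for n in range(1, 11):
            print(n)

    def program_2():
        # Build the whole output as a single string first, then emit it.
        print("\n".join(str(n) for n in range(1, 11)))

    def program_3():
        # The same ten numbers, output in French rather than as digits.
        words = ["un", "deux", "trois", "quatre", "cinq",
                 "six", "sept", "huit", "neuf", "dix"]
        for w in words:
            print(w)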
↑ comment by badtheatre · 2013-12-15T02:02:13.063Z · LW(p) · GW(p)
What is TTS?
↑ comment by Shmi (shminux) · 2013-12-15T03:10:51.722Z · LW(p) · GW(p)
Sorry... text to speech.
comment by Douglas_Knight · 2013-12-16T19:01:12.205Z · LW(p) · GW(p)
Here is a post on a related question. As I said there, this paper is relevant.
↑ comment by badtheatre · 2013-12-17T19:47:57.061Z · LW(p) · GW(p)
I don't see the relevance of either of these links.
↑ comment by wuncidunci · 2013-12-17T23:26:25.649Z · LW(p) · GW(p)
Two points of relevance that I see are:
If we care about the nature of morphisms of computations only because of some computations being people, the question is fundamentally what our concept of people refers to, and if it can refer to anything at all.
If we view isomorphic as a kind of extension of our naïve view of equals, we can ask what the appropriate generalisation is when we discover that equals does not correspond to reality and we need a new ontology as in the linked paper.
↑ comment by badtheatre · 2013-12-18T18:48:53.502Z · LW(p) · GW(p)
Actually, I started thinking about computations containing people (in this context) because I was interested in the idea of one computation simulating another, not the other way around. Specifically, I started thinking about this while reading Scott Aaronson's review of Stephen Wolfram's book. In it, he makes a claim to the effect that the Rule 110 cellular automaton hasn't been proved to be Turing complete because the simulation has an exponential slowdown. I'm not sure if his claim was that strong, but others definitely claimed later that Turing completeness hadn't been proved for that reason. I felt this was wrong, and justified my feeling with a thought experiment: suppose we had an intelligence contained in a computer program, and we simulated this program in Rule 110, with the exponential slowdown. Assuming the original program contained a consciousness, would the simulation also? I felt strongly, and still do, that it would.
It was later shown, if I'm remembering right, that there is a simulation with only polynomial slowdown, but I still think it's a useful question to ask, although the notion it captures, if it captures one at all, seems to me to be a slippery one.
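For reference, the automaton itself is trivial to state (a toy Python sketch; the hard part of any Turing-completeness argument is the encoding of a machine into these rows and the resulting slowdown, not the rule):

    RULE = 110  # Wolfram code: bit k gives the new cell value for 3-cell neighbourhood k

    def rule110_step(cells):
        # One synchronous update of a row of 0/1 cells, with zero boundaries.
        padded = [0] + cells + [0]
        return [(RULE >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
                for i in range(1, len(padded) - 1)]

    row = [0] * 20 + [1]
    for _ in range(5):
        row = rule110_step(row)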
comment by MrMind · 2013-12-16T09:43:17.537Z · LW(p) · GW(p)
Let's start small. Since we are talking about algorithms (better yet, about programs for a universal Turing machine), what if two programs map the same inputs to the same outputs?
Would that suffice as a definition of isomorphism, even if they have wildly different resource usages (including time)?
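For instance, under that definition the following two programs would count as "the same", despite wildly different running times and intermediate states (illustrative Python only):

    def fib_slow(n):
        # Exponential-time recursion.
        return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

    def fib_fast(n):
        # Linear-time iteration; identical input/output behaviour.
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    # Extensionally equal on every input we try.
    assert all(fib_slow(n) == fib_fast(n) for n in range(20))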
↑ comment by badtheatre · 2013-12-17T19:48:42.429Z · LW(p) · GW(p)
What if they don't output anything?
↑ comment by Baughn · 2013-12-16T15:26:30.588Z · LW(p) · GW(p)
Sadly, even that is undecidable in general: I believe there's a way to convert a solution for this problem into a solution for the halting problem.
Of course you can still do it for a large fraction of functions you're likely to see in real life.
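A sketch of the reduction being gestured at (hypothetical Python; same_function is the decider we are supposing, for contradiction, that we had):

    def make_q(p, x):
        # Build a program that returns 0 on every input, but only after
        # p(x) halts; if p(x) runs forever, so does the built program.
        def q(_anything):
            p(x)
            return 0
        return q

    def zero(_anything):
        return 0

    # If same_function(f, g) could decide "f and g agree on every input",
    # then same_function(make_q(p, x), zero) would decide whether p halts
    # on x; the halting problem tells us no such decider can exist.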