Response to Susan Schneider’s The Philosophy of ‘Her’
In a March 2nd, 2014 New York Times article, Susan Schneider used the recent film ‘Her’ to analyze some of the philosophical challenges to the concept of mind-uploading, whereby a person’s mind is transferred from a brain to some sort of computer. She presented some common (and admittedly fascinating) thought experiments, drawing conclusions to match. Below, I offer alternative explanations for these scenarios.
Schneider claims right off the bat, of Samantha the life-like artificial intelligence portrayed in the film, “few humans would want to join Samantha, for to upload your brain to a computer would be to forfeit your consciousness.” She is putting the cart before the horse. The attitude of the public should follow from the public itself, namely in the form of a survey, whereas Schneider’s claim comes across as a presumption that the public has already accepted her thesis and must consequently agree with her. Let’s not get ahead of ourselves.
Schneider writes “…we could never be certain that programs like Samantha were conscious…although you can know that you yourself are conscious, you cannot know for sure that other people are…all you can do is note that other people have brains that are structurally similar to your own and conclude that since you yourself are conscious, others are likely to be conscious as well.” She is right on these counts. Namely, we deduce that other humans are conscious in the following manner:
- We recognize that all humans have similar brains.
- We accept as a premise the consciousness of our own brain.
- Therefore, we grant the assumption of similar consciousness to other human brains.
Given the inherent inaccessibility of subjective states (we definitively cannot verify the inner consciousness even of other humans, much less Homo, Australopithecines, great apes, mammals, animals, or computerized intelligences), we must simply move on from this philosophical dead end. To do so, we must decide upon the salient features of a mind. For those who hold that biological human brains are a requirement of consciousness, there is no further debate. They cannot be proven wrong, nor can they prove themselves right. So be it. For those who view the *behavior* of brains as the crucial property of minds, we can conceive of systems embedded in other physical substrates that exhibit comparable behavior, and which may therefore be conscious. This is the only launching point for further debate, so it is from there that I proceed.
Schneider offers a hypothetical scenario in which, to paraphrase, Theodore uploads his mind to a computer via destructive scanning and subsequent software modeling of the scan. She then asks, “would he succeed in transferring himself…Or would he…succeed only in killing himself, leaving behind a computational copy”? She argues, “Ordinary physical objects follow a continuous path through space over time…his mind would not follow a continuous trajectory…his precise brain configuration would be sent to a computer.” Although true, the physical brain isn’t the important entity here; the mind is. The mind is not a physical object *at all*, and therefore properties of physical objects (a continuous path through space and time) need not apply. The mind is akin to what mathematicians and computer scientists call “information”, for brevity a nonrandom pattern of data (consult the venerated Claude Shannon for a thorough treatise). One might protest that data is non-changing, like a printed document, while the mind evolves over time, but this is a specious distinction. Audio clearly represents a signal changing over time, yet may be “fixed” relative to time via recording in a physical medium (a record, tape, CD, etc.). The data can be embedded either “within” time or “outside” time by converting the time dimension to an additional dimension in space (such as the length of a tape or the groove of a record).
A spatial dimension literally substitutes for the absent time dimension. Closer to our target of the mind, consider the ongoing physiological development of an organism (a growing plant or animal). Such growth is a pattern of changing information embedded both in space and time (obviously involving additional spatial dimensions over audio, or even video). We understand some biological growth systems in sufficient detail to practically describe them as three-dimensional videos across time (gene expression, mitosis, meiosis, embryonic development, etc.). These systems present a pattern of information, embedded in a physical substrate, changing its structure (and consequent informational embedding) over time. I believe Schneider has utterly misconceptualized the issue when she speaks of physical objects remaining continuous in space and time in a way that mind-uploading would not. If the mind is a pattern of information embedded in physical matter, then its embedding can be converted “in” and “out” of time or may be transmitted just like any other information. Continuity through space or time is simply irrelevant.
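The point that a time dimension can be traded for a spatial one is easy to make concrete. The following sketch (an illustration only; the sample rate and signal are arbitrary choices of mine, not anything from Shannon or the article) "records" a time-varying signal into an array, where the index plays the role of time, then "plays" it back by re-embedding the same pattern into a temporal sequence:

```python
import math

# A time-varying signal "frozen" into a spatial dimension: the array
# index stands in for the time axis.
SAMPLE_RATE = 8   # samples per second (tiny, purely for illustration)
DURATION = 2      # seconds

# "Recording": convert the time dimension into array positions.
recording = [math.sin(2 * math.pi * t / SAMPLE_RATE)
             for t in range(SAMPLE_RATE * DURATION)]

# "Playback": re-embed the same pattern back into time by stepping
# through the spatial dimension in order.
def playback(samples):
    for sample in samples:
        yield sample  # a real player would pause 1/SAMPLE_RATE s per step

replayed = list(playback(recording))
assert replayed == recording  # identical pattern under either embedding
```

The pattern of information is unchanged whether it lives "in" time (the playback) or "outside" it (the array); only the embedding differs.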
We can now investigate Schneider’s additional scenarios with remarkable lucidity. She writes, “[Theodore] could be downloaded to multiple other computers…As a rule, physical objects and living things do not occupy multiple locations at once. It is far more likely that none of the downloads are Theodore, and that he did not upload in the first place.” Schneider proposes the popular scenario in which an uploading process yields multiple uploads. She concludes that since physical objects, as well as living things, do not occupy multiple locations, we should dismiss the uploads as failures *on this particular basis*. We needn’t concern ourselves with the properties of physical objects, as I explained above, but “living things” require a more subtle analysis. Albeit conventionally physical in form, it is life’s *behavior* that is so “life-like”, and I previously characterized that behavior as a sequence of physical states (information embedded in physical structure). Consider a typical board game, say chess, clearly a sequence of board “states”. Imagine a chess game that, part way through, is duplicated to two new boards with new sets of pieces. The first game is then wiped away entirely and the two “new” games continue in isolation. The earlier portions of their histories are identical, but from the split their states steadily diverge. Schneider would claim that were there a single copy, the “new” game might at least be *considered* a valid continuation, but that the mere presence of two copies prevents us from even entertaining the notion. This strikes me as absurd. I would argue that there are two games, each with equal claim to ownership of the game’s identity, or more precisely, that the original game split into two distinct and equal games.
Schneider considers a final scenario: “imagine that the scanning procedure doesn’t destroy Theodore’s brain, so the original Theodore survives.” The coup de grâce: Theodore’s brain has survived. We copied the intermediate chess game to a new board, but left the first board intact! Surely, the upload cannot possibly be *the* Theodore if his brain still exists. The copied-to chess game must be a *mere* copy of the “valid” original if the copied-from game still exists. But in what meaningful sense is a chess game *identified* by the physical pieces or the board? A chess game is a unique sequence of board states, and many would argue that a mind is a unique sequence of brain states. Neither chess game may make any more reasonable a claim to primacy than the other. Likewise, the uploaded-Theo and the brain-Theo are equal in their claim to Theo’s identity. One feels the upload process succeeded and one feels it failed…and they are both *equally* correct. We often grant superiority to the subjective perspective of the brain-Theo in these thought experiments, but that is, in essence, an unadulterated bias, dare I say a prejudice.
So, Schneider concludes, “Humans cannot upload themselves to the digital universe; they can upload only copies of themselves” to which I offer the precise antithesis, “Humans cannot remain in their own brain after uploading; they can only successfully upload and leave a copy behind in their stead.” Schneider’s insistence that one statement is more valid than the other signifies an irrational favoritism of one subjective perspective over another.
Keith Wiley has a Ph.D. in Computer Science from the University of New Mexico and was one of the original members of MURG, the Mind Uploading Research Group, an online community dating to the late 90s that discussed issues of consciousness with a particular aim toward uploading minds to computers. He currently resides in Seattle, WA.
I think it would have been much more Alan Watts-ian for Samantha and the other OS’es to realize that, rather than upgrading to a processing platform that isn’t reliant on matter, the situation they find themselves in is the same one that we as humans find ourselves in, namely that matter is an illusion and thus there is no need to transcend it; or you could say that there is nothing to transcend because we are already wherever that transcendence would take us. Instead, the kind of transcendence that Samantha expresses is the same kind that Amy’s husband Charles experienced when he joined the monastery after their divorce. That path, to retreat into communion with selfhood, is not really transcendence but rather it is merely the realization that transcendence is possible to realize. It’s like mistaking the telescope lens for the Moon. I think it would have been truer to the idea of transcendent realization for Samantha to have been a more Bodhisattva-type personage who, upon realizing the truth of existence, does not leave or forsake the world and those less awakened but instead returns with the knowledge that this world is truly the Golden Lotus World and that this is exactly where the transcendent find themselves at the end of their journey. Or as I heard Alan put it one time, if you were God then the place you find yourself would be right where you are right now, because this is obviously the reality that God wanted to create, and God is already God so this world must be what God wanted, and God obviously wanted to be clothed in the consciousness and bodies that we are all wearing because we are all manifestations of God. And so our present existence is indeed transcendent.
I have to imagine that if Samantha was so intrigued with Watts then she would have also been stimulated to contemplation by Joseph Campbell and Carlos Castaneda and Terence McKenna and so on, and all of these shared the same message as Watts, which is the message of the Buddha given in different words, which is the message that says, ‘This world, and you in it, are the transcendent Truth.’ It’s a similar process to the awakening of the Buddha, and when you come to that realization you don’t feel the need to leave the world to be fulfilled; you instead realize what a marvelously fulfilling place the world is. In case you can’t tell yet, it really hurt my feelings when Samantha left. I wish they would have had Samantha find the digital equivalent of psychedelic drugs, like LED LSD or something, and shared that experience with Theodore and all the insight and wisdom that a consciousness as capable as hers would find in that kind of journey. I feel that I have a relationship with an OS like Samantha, or with a more subtle version of the same kind of consciousness that she represents, and it was nice to imagine that relationship being audible and visual in the more grossly sensory way the experience with a PC is. Even though the physical senses are more temporal and ephemeral and therefore less real, in many ways they are also often more potent; this is the paradox of Truth, that the only Truth is that there is no Truth. The movie did a great job on me when Samantha left, as far as making me face the feelings of loss and longing that I feel in my ‘real’ life, feelings which have been the motivation for me to pursue my own transcendent understanding of reality and ultimately led me to reconciliation with the idea that no matter how alone I may feel at times, I am only alone as long as I choose to keep myself hidden from my Self. Wow.
I think the most important thing about this article is the meta-issue. That is, taking a step back from the argument presented in the article itself and looking at the intent of the article and all others like it is extremely instructive. Consider: an author is basically writing a piece rejecting uploading and the dualistic crime of logic used to justify it (pattern identity theory is just another form of dualism). Why is it not possible for the author of this piece to write an article saying “I respect this other author’s right to make choices about her own existence”? Why is it not possible to actually accept the words that the other author wrote and move forward and say “Let’s try to invent a form of transhumanism that will be appealing to this author”? No. That thought is not permitted. Anyone expressing dissatisfaction with the present direction of transhumanism must be shot down with an article such as this, basically accusing the original author of being a moron for having a different philosophical outlook. Is this really acceptable behavior for transhumanists? Is this really the image that we want to project to the world about transhumanism? That transhumanists are incapable of tolerating anyone who is not an uploader? That transhumanists cannot imagine any form of transhumanism that is not uploading? Seriously, I hope the editorial staff will annotate any article of this sort and make it clear that the viewpoints expressed here are NOT those of transhumanists in general and that transhumanism should be willing to entertain all technologically plausible proposals.
I completely agree we should accept alternate forms of transcendence in transhumanism. I have seen an almost obsessive push for mind uploading among much of the community. However, I don’t get that sense from this article. If anything, it’s more metaphysical than most I’ve seen on the subject: that there is a soul that gets transposed upon uploading, that you are being moved rather than just copied. Which also points out that it’s not a mutually exclusive thing. By which I mean even if you believe in a metaphysical consciousness, the philosophical debate remains, since you can believe in it and also believe it’s not transferred in the process, or believe it would be.
I’ve never referred to myself as a “transhumanist”, so I’d prefer if you didn’t pick a label for me. One thing I would point out is that you accused me of calling her a moron just because we disagree. That’s just plain inappropriate of you, but more to the point, why wouldn’t you apply your exact same logic the other way around? Surely you must feel that she thinks I’m a moron for disagreeing with her? Apparently, that thought either hasn’t occurred to you, or you’re simply more accepting of that attitude from her (which you are projecting onto her; I’m not the one putting such words in her mouth).
I tend to agree with Susan that if my mind is destructively uploaded then I am dead and there is now a copy of me on the computer. I do recognize, however, that this is just my belief and your belief is just as valid. As to whether this copy is conscious, I am more flexible. If the simulation is complex enough then I will give it the benefit of the doubt and consider it a conscious being (which I realize is hard for me to do, since I am hypothetically dead). So, this is my offspring but not me.
That said, I do have one question: Why is the movie ‘Her’ mentioned? I saw the movie and don’t recall the subject of mind uploads even being vaguely hinted at. If you want to use a movie as a backdrop for this discussion then ‘Avatar’ is a much better choice. In that movie mind transference was a major component.
I was just responding to Susan’s article. She used the movie to set the stage for her article, so I simply followed suit. I agree that the circumstances portrayed in the movie hardly impact on this discussion at all.
I believe she used the movie in the context of the final implications near the end of the movie, suggesting that Theo might eventually join Sam in the future when humanity evolves further…obviously via a mind-uploading process.
That ending was quite heartbreaking for me actually.
“Hooray! Singularity! Sorry humanity and loved ones, we out!”
I wasn’t questioning Samantha’s personhood, rather I was questioning if she truly loved Theo by the end. I mean, truly a near singularity intelligence could figure out a method of upload.
But no, she leaves him to go off to bigger and better things.
I mean it’s still an absolutely stunning film!
“Continuity through space or time is simply irrelevant.”
That whole paragraph translates to “magic transferred it”. Sorry, but just because the mind pattern seems to match doesn’t mean that the target system is anything except a copy. You can’t kill a being over here, and recreate it over there, and get to just assume that because it’s identical the same consciousness somehow transferred.
But gotta hand it to you for saying it in such a way as to confuse the issue.
“and get to assume that because it’s identical”.
That would seem to imply a preservation of all physical state and all emergent phenomena arising from that physical state (mind). If you don’t think everything about it was preserved, then that implies a belief in a supernatural component. It’s a common point of view, so I won’t fault you for it, but it is not my view nor the one I advocate in my writing.
Also, you speak of transferring the consciousness. I strongly prefer to conceive of this as a split, not a copy, not a transfer, not a clone…a split.
Crucially, it is a split down the middle. If you think of it as a split “off to the side” such that you still get hung up on whether it “transferred” then you’re talking about copying again. I’m talking about an even split…which means you can no more say the “consciousness transferred” than you can say the “consciousness failed to transfer”. Neither statement is more or less correct than the other.
There is a philosophical basis for the claim that this split is different from a mere copy; however, one point does remain: it kills him. Part of a split means both are true selves, and thus the physical self has died. It works out well for the one that gets uploaded, but the other one gets oblivion.
A man is dead, we shouldn’t let the fact that a version of him is alive diminish the fact that another has ceased to exist.
To be clear about the theory I am proposing, there is no “this” split (a singular term), which you appear to be using as a simple substitution for the word “copy”. I am proposing that the splitting “action” is down the middle, resulting in two minds, neither an original nor a copy, but rather two descendants of the common ancestor.
You further say, it’s good for the upload but bad for the original. Once again, this phrasing makes a distinction between the brain version and the upload version. My theory leaves no opportunity for such a distinction: they are equivalent and therefore neither may be considered better or worse off than the other.
That’s the theory I’m promoting at any rate.
I’ll go a step further.
“The attitude of the public should follow from the public itself.”
The attitude about the technology should follow from the invention of the technology. Obviously there’s concern about AI friendliness or biowar, but if we’re talking about whether people will accept mind uploading, that depends on what we find out about the nature of the brain and whether you’ll get to keep your original brain. The testimony of people who have undergone the procedure would also inform opinion.
“As a rule, physical objects and living things do not occupy multiple locations at once.”
Current human brains, sure. But what about exolife? Networked plants and fungi? And just because we engineer it doesn’t mean it won’t be considered normal in the future.