[Continuing coverage of the 2010 H+ Summit at Harvard.]

Next up is Patrick Hopkins on “Why Uploading Will Not Work” (bio, slides). A few days ago on this blog, guest-poster Mark Gubrud previewed Hopkins’s presentation at length.

As Gubrud described, Hopkins looks at the language used to describe mind uploading. What are the metaphors we use when speaking about it? The first is location: the mind is “in” or “within” a brain, and can be put “onto” a computer. The second is motion: the mind can be “moved,” “transferred,” “put” into a computer. And the third is substance: the mind is a thing that can be moved from one “receptacle” to another. But, Hopkins asks, do these metaphors really work? Is the mind truly an object that is housed “inside” a brain and can be “moved” to another “receptacle”? According to naturalist theories of mind, no. The positions that do hold this view, Hopkins says, are basically religious, relying on notions of souls, spirits, and ghosts.

Hopkins tries to absolve uploading advocates of blame; he says they have merely inherited this language from religion. I think it’s far more likely that they’re inheriting language and concepts from the discipline that gives rise to the notion of “uploading” in the first place: computer science. Computers are heavily dualistic systems, in that software is treated as separable from the hardware that runs it, and transhumanists think the mind/brain is a computer, so they treat it as dualistic too.
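
To make that analogy concrete, here is a minimal Python sketch (my own illustration, not anything Hopkins presented) of the separation computing takes for granted. Note that even here the dualistic language is misleading: serialized state is copied, never literally moved.

```python
import pickle

# A toy "mental state" as a data structure. In software, state is
# routinely treated as separable from the hardware that runs it.
state = {"memories": ["first day of school"], "beliefs": {"sky": "blue"}}

# Serializing produces a byte-level description that could be written
# to disk or sent to an entirely different machine...
blob = pickle.dumps(state)

# ...but "uploading" it elsewhere constructs a new object from that
# description. Nothing moves; the original persists until it is
# explicitly deleted.
elsewhere = pickle.loads(blob)
```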

Hopkins anticipates the rebuttal that this language is just metaphorical. But, he says, central to the idea of uploading is that personal identity is preserved. So the question is, does copying preserve identity? Is copying the same thing as transferring, as literally moving a mind? He says no: copying creates something that is exactly structurally and behaviorally similar to the original, but that is not the same as identity. The copied mind has a different history, and is made of different matter; we can metaphysically tell the difference (as usual, see SMBC). If you want to believe that the mind is a pattern, he says, then it’s important to know that a pattern is not an object that can be plucked out and moved; it’s a way of organizing matter.
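
The distinction Hopkins is drawing, between exact similarity and identity, is one programmers already make. A minimal Python sketch, offered purely as an analogy:

```python
import copy

original = {"name": "you", "memories": ["H+ Summit, 2010"]}
duplicate = copy.deepcopy(original)

# Exact structural similarity ("qualitative identity")...
assert duplicate == original

# ...is not numerical identity: there are now two objects, each with
# its own location in memory and its own history from here on.
assert duplicate is not original

duplicate["memories"].append("woke up in the other room")
print(original["memories"])  # unchanged; the copy has already diverged
```

Equality of structure is cheap to achieve; being the selfsame object is precisely what copying cannot preserve.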

He describes a familiar scenario from the philosophy of mind: You’re sitting in a room and someone holds a gun to your head and says he’s about to shoot you, but before he does that he’s going to copy your mind into the other room. You’d still be unsettled, but maybe you’d be okay because you’d think that you would just go to sleep in one room and wake up in another. But what if the gunman then said, “Just kidding, I’m not going to shoot you, but I still made the copy”? It couldn’t be you in the other room then, could it? Well, your relationship to the mind in the other room is no different from what it was a moment earlier; the only difference is that the gun is no longer at your temple. Mind uploading, Hopkins concludes, will not work as we like to think it will. (He doesn’t say it explicitly, but what he’s demonstrated, in effect, is that psychological continuity is not all that is required for personal identity.)

Patrick Hopkins provides what is easily the best talk of the conference so far; he manages to convey sophisticated ideas effectively and concisely within a ten-minute slot, something few other speakers have managed. And his message is convincing. Again, I wish the conference had put far more emphasis on talks of this level of thoughtfulness and speakers who were this effective.

I do have a few quibbles, though. First, Hopkins either misrepresents or misunderstands the significance of the argument he presents. To say that “uploading won’t work” makes it sound like he’s presenting a philosophical case for why we couldn’t have machines that are conscious, and whose consciousness very closely resembles that of existing persons. But his argument is based on the premise that we could. His conclusion is just that the results wouldn’t be as clean and transparent as everyone assumes.

So Hopkins’s claim is that a mind cannot be separated from a body and continued. But that is not quite the same as claiming that a mind cannot be copied. What if it could? What if a duplication were possible? Hopkins gives no consideration to the huge moral dilemmas that would arise if such beings were somehow created. If it were technically possible, such duplicate beings might well consider themselves to have a continuous personal identity, complete with memories, thoughts, and feelings; only their memories, thoughts, and feelings about their own history of self would be false. The identity of the original being would be thrown into chaos by the mere fact of its duplicate’s existence. How then would we treat such beings? Could we hold the copy responsible for crimes that it remembers having committed, but did not commit? Could we deny it credit for accomplishments it believes it made but did not? These questions would be impossible to answer, and many of the bases of our legal and social order would be thrown into the same irresolvable chaos.

Maintaining continuous personal identity (and other really fundamental aspects of consciousness and mind) is not simply a philosophical matter of recognizing the necessary components, but a practical matter of maintaining them, socially and as lived lives. The conclusion Hopkins should arrive at by the end of his talk is not “this is why uploading won’t work” but “this is why we shouldn’t do it.”

6 Comments

  1. Ari,

    Your account of Hopkins's talk, and the posted final version of his slides, confirm that his basic argument is the same one I advanced in my paper and talk at the 2003 transhumanist conference: that the language used by uploading proponents is always dualistic and always involves some term for an object which is separable from the body and which carries or constitutes the "true identity" of the person, i.e. is synonymous with "soul."

    Being a professional philosopher, however, Hopkins seems to miss the point that even the term "personal identity," as it is typically used in these discussions, is just another stand-in for "the soul."

    In hindsight, because I framed my paper as a frontal attack on this notion of "identity," people may have missed the central argument about dualism, which Hopkins presents very nicely.

    However, by the same token, I suspect that philosophers will dispute whether Hopkins is correct in his "account of personal identity," which he argues is not a thing that is transferable, but which still appears to be a thing about whose transferability there is some definite fact. By thus implicitly allowing that he might be wrong about this, he invites further argument; and actually, there is a lot of literature out there for him to deal with, such as the arguments of Derek Parfit which commenter "Carl" pointed to.

    Whereas I argue that "identity" resides only in the mind of the identifier, and that arguments for uploading, or "identity transfer" as I put it, are a form of magic. This magic is successful to the extent that it manipulates the subject (who identifies things) so that the subject is led to identify whatever object is supposed to have had an identity transferred to it with whatever object's identity was supposedly transferred. The usual tool of this magical technology is the conscious or unconscious assumption that some "true identity" (or soul, distinct from the body, when referring to persons) resides in the object rather than in the subject, so that the subject may be persuaded, by some sleight-of-hand, that this object-within-the-object (its "identity") has been or would be transferred.

    I know that philosophers think they mean something different by the word "identity." They say "A is identical to B" means "A and B are the same thing, have all the same properties," etc. I am talking about what their minds are really doing when they think and argue about this.

    In other words, a "correct account of personal identity" would be framed in terms of cognition and psychology, i.e. the structure of our minds, rather than in a search for external eternal truths.

    And I think a correct account of uploading is given by its proponents when they describe in physical and technical terms what it is that they propose to do. I am persuaded that uploading is possible, in principle and probably in practice. However, it would almost certainly require killing you, in order to disassemble the brain and characterize it at the molecular level.

    I guess I'm just like the "engineer" in that hilarious cartoon you linked.

  2. The question you ask, what if copying were possible, is of interest not only in the context of uploading, but more immediately, in the context of AI. As you point out, computer programs are portable, and functionally humanoid AI programs should be, too.

    So, if we are able in the relatively near future (relatively near compared with 3D molecular level mapping and modeling of an object the size of the brain) to create human-like "minds" in artificial computational substrates, they would likely be 100% clonable and transportable.

    I say "likely" because in some possible neuromorphic architectures, readout/readin of the "mind" would impose an extra burden, requiring extra circuitry. But anyone who thinks a humanoid AI would be useful is likely to think its clonability would be useful and worth the extra price.

    Since that may well be possible in the next few decades, it's worth further thought. What can we say about minds that are genuinely clonable? What would it be like to be one?

    Would they happily be "migrated" from one "platform" to another, and think nothing of it? Or would they periodically erupt in digital screaming at the sheer existential horror of it all? It might depend on how intelligent they are. I sometimes think the answer to the Fermi paradox may be that civilizations become so intelligent they come to realize they have absolutely no reason to do anything, even go on existing. In any case, I am sure that, from a Darwinian perspective, intelligence is way overrated.

    I think there are stronger reasons for not creating humanoid AI. It's hard enough to make computers trustworthy as they are. If they were made to think and act like people….

    By the way, Ari, it would be interesting to hear about how Hopkins's talk was received. Any objections from the floor, for example?

  3. Gubrud said "Or would they periodically erupt in digital screaming at the sheer existential horror of it all?"

    Why aren't you screaming right now, since existence in our reality is phenomenologically no different?

    Gubrud also said, "I sometimes think the answer to the Fermi paradox may be that civilizations become so intelligent they come to realize they have absolutely no reason to do anything, even go on existing."

    Our civilization has been at this point for a couple hundred years (at least a growing, educated percentage of it has), and I haven't run into anyone who advocates the extinction of our species. Those who are truly depressed end up eliminating only themselves; why would they spend the extra effort otherwise?

    And on Patrick Hopkins's talk, well, I was underwhelmed. He brought nothing new to the discussion. Why can't any anti-uploaders ever address the fact that our 'minds' are being copied all the time, changing matter and structure continuously?

  4. nonzero,

    Perhaps the only reason I'm neither screaming nor suicidal is just that I'm not some kind of pure intellect separated from my bodily desires and emotions.

    Our bodies, including our brains, are continually exchanging matter and changing structure. That's not quite the same as "being copied all the time," since it is a continuous and gradual process. So what? So nothing; I'm just stating the facts.

    @Gubrud said: "Our bodies, including our brains, are continually exchanging matter and changing structure. That's not quite the same as 'being copied all the time,' since it is a continuous and gradual process."

    And what would constitute something that wasn't a 'continuous and gradual process'? I'm not advocating we stop the flow of time; now that would be impossible! Are you saying there is some speed limit the brain obeys which, once violated, ceases to result in consciousness? If I amputate one arm and replace it with a prosthetic, then a week later do the same to the other arm, is there any difference in the resulting human compared with doing both procedures at the same time? Is the brain any different? Yes, you will experience different scenarios depending on which route you took, but you'd still end up as a continuation of your self.

    Call PETA, because I think I'm beating a dead horse here.

  6. nonzero asked:

    "Are you saying there is some speed limit the brain obeys which, once violated, ceases to result in consciousness?"

    Nope. I'm just saying that you should try to describe things as accurately as possible if you want to think as clearly and accurately as possible.

Comments are closed.