Futurisms: Critiquing the project to reengineer humanity

Wednesday, March 21, 2012

Seeing and Believing

John Ruskin, in Modern Painters (vol. III, 1856), defined the “pathetic fallacy” this way: “false appearances ... entirely unconnected with any real power or character in the object, and only imputed to it by us.” He was largely but not entirely critical of this fallacy for its tendency to produce bad poetry. But as a reflection of certain kinds of human character, he found the story more complex:

The temperament which admits the pathetic fallacy, is ... that of a mind and body in some sort too weak to deal fully with what is before them or upon them; borne away, or over-clouded, or over-dazzled by emotion; and it is a more or less noble state, according to the force of the emotion which has induced it. For it is no credit to a man that he is not morbid or inaccurate in his perceptions, when he has no strength of feeling to warp them; and it is in general a sign of higher capacity and stand in the ranks of being, that the emotions should be strong enough to vanquish, partly, the intellect, and make it believe what they choose. But it is still a grander condition when the intellect also rises, till it is strong enough to assert its rule against, or together with, the utmost efforts of the passions; and the whole man stands in an iron glow, white hot, perhaps, but still strong, and in no wise evaporating; even if he melts, losing none of his weight.

I was reminded of the pathetic fallacy by this music video:

NO "Stay With Me" from Ryan Reichenfeld on Vimeo.

However charming in its own way, this video is certainly an instance of “false appearances.” But it is less clear just what emotion the filmmakers are “over-dazzled” by, or whether they are to be credited with an emotion sufficiently powerful to overwhelm a strong intellect, or rather with a weak intellect easily misled by emotion. I’m inclined to think Ruskin would find it bad poetry: What is the point of ascribing human emotional characteristics to crash-test dummies? One might as well feel bad for the car being crashed. Does it add anything to the longing of the song’s lyrics to have them reflected in an impossible scenario, or is it rather some post-modern ironic distancing from longing, an unwillingness to commit to it even while expressing it?

Perhaps a recent interview with Sherry Turkle, the erstwhile techno-optimist, helps to clarify this particular pathetic fallacy. Turkle has written a book called Alone Together, which she calls “a book of repentance in the sense that I did not see this coming, this moment of temptation that we will have machines that will care for us, listen to us, tend to us.” She explains:

People are so vulnerable and so willing to accept substitutes for human companionship in very intimate ways. I hadn’t seen that coming, and it really concerns me that we’re willing to give up something that I think defines our humanness: our ability to empathize and be with each other and talk to each other and understand each other. And I report to you with great sadness that the more I continued to interview people about this, the more I realized the extent to which people are willing to put machines in this role. People feel that they are not being heard, that no one is listening. They have a fantasy that finally, in a machine, they will have a nonjudgmental companion.

The video takes this idea one step further — a companion that will save us from the mere humans who are not hearing us. I suspect that here is the pathetic fallacy at the heart of social robotics. It is a vicious circle. The more we put our hopes in machine companions, the less we expect from each other, and the less we expect from each other, the more we will accept the substitute of machine companions. Thus does “only connect” become “just plug it in.”

1 comment:

  1. What the video suggests to me is the idea of feeling that one has accepted the role of a lifeless dummy, mechanically going through the motions of love, not fully felt, and offering no resistance as the lab coat guys (representing the machinery of technological, corporate, inhuman, unfeeling modern society) set you up (or your love) for seemingly perverse, pointless, violent destruction.

    In the video, the male dummy wakes up, breaks out, rescues the female dummy and carries her out to freedom...

    It's good that Turkle has begun to wake up. But she's off the mark when she says "our ability to empathize and be with each other and talk to each other and understand each other" is what "defines our humanness."

    Yes, we can easily fool people into thinking the machines understand and empathize. In the slightly longer term, we can make machines that really will. That will not make them human.

    Our humanity is not defined by anything. It is an irreducible, and unique, fact.

