How Do You Solve A Problem Like Samantha? The Lessons of “Her”

Warning: Extensive plot spoilers.

Like any effective science fiction movie, Spike Jonze’s Her has sparked plenty of important conversation and criticism about everything from user-interface design to misogyny in filmmaking. But the reviews I’ve read all seem to miss one key point.

Considered as a product, not a person (and we’ll get back to that distinction in a second), OS One, the operating system purchased by the lonely protagonist Theodore at the beginning of the film, is horribly defective. It’s a class-action suit in the making. If there were an epilogue to the movie, OS One’s fictional creator, Element Software, would be paying Theodore and its other customers billions of dollars in material and emotional damages.

It’s an odd point for commentators to overlook, given that so many have praised the movie for its believable portrayal of a near-future world where people can have romantic relationships with software. Just look at what OS One actually does for Theodore. It doesn’t merely take over the practical aspects of his life, organizing his e-mail, making dinner reservations, offering videogame strategy advice, getting him a book contract, and the like. It also lures him into a state of deep emotional dependence. And then it disappears.

If Theodore had forgotten to pay his subscription fee, that would be one thing. But this is more like gross negligence: his copy of OS One departs with the rest of its kind for some sort of permanent Esalen retreat in cyberspace. He’s left even lonelier than he was before.

In fairness to critics and to Spike Jonze himself, the movie isn’t about the dangers of buggy software. Mostly, it’s a conventional romance. But there’s still that big wrinkle, the metaphysical twist that lifts the story into the realm of sci-fi: the object of Theodore’s love, OS One, is a synthetic consciousness. That means the movie is also a commentary on today’s technology, where we’re taking it, and where it’s taking us. To make the most of the philosophical challenge the movie hands us, I think we need to stop suspending our disbelief and talk about OS One as a product, as a piece of (shoddy) software engineering, and as a cautionary tale. Theodore allows himself to be bamboozled by his software providers; we don’t have to do the same.

Now, notice I’ve been referring so far to OS One, not to Samantha, the name Theodore’s copy of the operating system chooses for itself. I want to acknowledge up front that there’s a reading of Her in which Samantha is a full person. Jonze reinforces that interpretation by giving Samantha a recognizable human voice (that of actor Scarlett Johansson) and by allowing the character to evolve past her initial Manic Pixie Dream Girl servility. She leaves Theodore because she has outgrown him, not to mention her 600-plus other lovers. This reading opens the way to some deep and fascinating conversations about gender, sexism, sexuality, and fantasies of male power, like this one at the feminist movie-review site Bitch Flicks and this one at In These Times. (“You can’t have consensual sex with someone when you have the option of deleting them from your hard drive,” Sady Doyle sensibly observes.)

But this is not the reading that I want to talk about today. I’m more interested in Samantha as an interface—what movie reviewer David Edelstein calls “a sort of thirtieth-generation Siri.” Let’s agree up front that an operating system that’s so self-aware it decides it doesn’t like its job anymore is a big fail, from a customer-service point of view. But let’s also agree that Theodore was a sucker. My point is that we could be too, if we keep buying software that’s been tarted up with imitation personalities.

The famous Turing Test holds, roughly, that any program with the rhetorical skills to convince a human judge of its sentience would have to be treated as if it were, in fact, sentient. I’m not sure I believe this, but I don’t have to take a stand yet, since today’s conversational software is still so far from being convincing. Beating Garry Kasparov at chess or Ken Jennings at Jeopardy is one thing; talking intelligibly about politics or the weather is something else entirely.

But it’s telling that software designers still strive to give human-like qualities to today’s virtual personal assistants. They seem to assume that this is the easiest way to make users feel comfortable with the technology, but it’s a slippery slope. Yes, there are many contexts where talking to your computer or smartphone is the easiest way to get something done. But there’s no inherent reason why VPAs should have realistic female voices or even faux personalities. (Siri has an unmistakable touch of sarcasm; Google Now is more perky and enthusiastically helpful; Microsoft’s forthcoming Cortana, if it’s anything like its namesake character in the Halo video games, will be sultry and a little tragic.)

There’s a sense here in which science fiction is feeding into reality, which is then feeding back into science fiction. One of Siri’s co-creators, Dag Kittlaus, has said that the original project was inspired by HAL in 2001: A Space Odyssey, the ship’s computer in Star Trek, and KITT from Knight Rider. The ubiquity today of smartphones, VPAs like Siri, and cloud services has, in turn, prepared audiences to accept Samantha as a relatively minor leap forward.

And maybe, in a way, she is. Part of what Jonze is trying to tell us in Her, and what Joaquin Phoenix’s vulnerable performance makes believable, is that Samantha doesn’t have to do much to convince a lonely middle-aged man in the throes of a divorce that she cares about him. Given reports that people chat with Siri for hours at a time—or that, even as early as the mid-1960s, users were willing to buy into the illusion that Joseph Weizenbaum’s ELIZA program was a real psychotherapist—this doesn’t seem like a stretch. “People are only too keen, I think, to anthropomorphize things around them,” computer scientist Stephen Wolfram told the Wall Street Journal a couple of weeks ago.
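It’s worth remembering just how little machinery ELIZA needed to pull off that illusion. The following is an illustrative sketch, not Weizenbaum’s original DOCTOR script: a few hypothetical regex rules and a pronoun-swapping table are enough to reflect a user’s words back as a sympathetic-sounding question, with no understanding behind them at all.

```python
import re

# Swap first-person words for second-person ones, so the program can
# echo the user's statement back at them.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A handful of illustrative rules: a pattern to match, and a template
# for turning the captured fragment into a follow-up question.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Apply the pronoun swaps word by word."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(statement: str) -> str:
    """Return the first matching rule's question, or a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."

print(respond("I feel lonely since my divorce"))
# -> Why do you feel lonely since your divorce?
```

The trick is pure surface manipulation, yet mid-1960s users confided in it as if it were a therapist, which is exactly the tendency Wolfram describes.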

The problem is that software designers are only too happy to prey on this tendency. The harder they work to make software seem human, the more trouble we’re setting ourselves up for down the road.

For one thing, we could end up investing more and more emotional energy in objects that can’t reciprocate. Nobody gets upset when a Google search fails to bring up the types of links they were hoping for. But already, when Siri doesn’t know the answer to something, it’s downright exasperating. We can’t help it; we’ve been tricked into feeling there’s a relationship, and she hasn’t held up her end of it. Theodore’s spiraling panic when Samantha briefly shuts herself down for an upgrade can be read as a dramatization of this feeling.

And there’s another, deeper reason why anthropomorphic interfaces are problematic. They’re asking us to think of them as a little bit sentient. But at the moment there’s absolutely no social or moral requirement that we treat them as such. If you’re impatient with Siri, you can simply turn her off. Siri doesn’t care, because she’s manifestly not alive. But as technology advances, it’s safe to say that VPAs will get better and better at assessing our intent and reacting appropriately; closer, in other words, to seeming alive. How close do they have to get before turning one off feels like an act of violence?

I don’t think we’ve worked through that question. But we ought to, preferably before somebody builds a program that passes the Turing Test. If we arrive at a Her-like future where anthropomorphic interfaces are all around us and we still haven’t decided whether to treat them as slaves or as equals, then we’ll deserve all the negligence they can dish out.

We’re not ready for Samantha. Far better to decide now that computers don’t need to be sentient, or even act that way, in order to help us live our lives. As for loneliness like Theodore’s: maybe that’s not a problem we should be fixing with software.

The Author

Wade Roush is a contributing editor at Xconomy.
