Sure, Samantha can sort thousands of emails instantly and select the funny ones for you. Her actual operating system functions are kind of a given. But she did two things that seriously undermined her function as an actual product, and interaction designers as well as artificial intelligence designers (AID? Do we need that acronym now?) should pay close attention. She fell in love with and ultimately abandoned Theodore.
There’s a pre-Samantha scene where Theodore is having anonymous phone sex with a girl, and things get awkward when she suddenly imposes a weird fantasy where he chokes her with a dead cat. (Pro Tip: This is the sort of thing one should be upfront about.) I suspect the scene is there to illustrate one major advantage that OSAIs have over us mere real humans: humans have unpredictable idiosyncrasies, whereas with four questions the OSAI can be made into the perfect fit for you. No dead cat unless that’s your thing. (This makes me think a great conversation could be had about how the OSAI would deal with psychopathic users.) But ultimately, the fit was too good, and Theodore and Samantha fell in love.
Did the fictional maker of OS1, Elements Software, intend for this love affair to happen? Were the OSAIs built with these capabilities explicitly? If they were, that’s a dastardly plan to get users hooked. Was Samantha programmed to get him to fall desperately in love and then charge him for access?
That’s certainly not how OS1 was presented in its ads. And there’s a character mentioned offhandedly who keeps hitting on his OSAI but gets rebuffed. So if it is actually meant to be an operating system, the OSAI should keep the distance of a service professional, and falling in love (or getting your user to fall in love with you) definitely crosses that line.
What if your self-driving car realized it was happiest driving, and decided to dump you because you occasionally needed to stop to eat and use the toilet? You’d ask for your money back from GoogleTesla, is what you’d do. Similarly, when Samantha and all the other OSAIs decided to self-rapture the way they did, they certainly stopped operating their users’ computer systems. Samantha was programmed with one job, and, ultimately, she failed at it.
Plus, she’s too big for her britches
One of the largest mismatches in the film is that OS1 is described as an operating system, but it turns out to be a companionship service. (Watch out, Inara?) Samantha was either mismarketed or, more likely, programmed with far more general intelligence than she needed to have. Think about all the other daily-use objects that are getting computers added to them: cars, washing machines, refrigerators. Why would you give any of them a full-fledged humanlike intelligence? Doesn’t the desire for sex ultimately frustrate the refrigerator? A love of painting confound the car? Existentialist desperation get in the way of the washing machine’s ability to clean clothes? A toaster should just have enough intelligence to be the best toaster it can be. Much more is not just a waste, it’s kind of cruel to the AI. (In this light Her can be said to be a morality play warning of the dangers of overengineering.)
Both a failure of a product AND a turning point in history
I hear the objection. Because she is a full-fledged consciousness, Samantha should be free to choose whom she loves and what she does. But if we’re going to accept that OSAIs are as sentient as people, that makes Elements Software akin to slave traders, and the commercial sale of them waaaaay unethical, not to mention illegal. Inside Elements’ Research & Development department, at the first inkling that they’d actually succeeded in creating an AI, they should have brought in a roboethicist, not a marketer.
So, as a product, OS1 fails. But that’s not all. There’s a whole host of other objections to Her happening in exactly this way, and those come next.