Her interface components

Depending on how you slice things, the OS1 interface consists of five components and three (and a half) capabilities.


1. An Earpiece

The earpiece is small and wireless, just large enough to fit snugly in the ear and provide an easy handle for pulling it out again. It has two modes. When the earpiece is in Theodore’s ear, it’s in private mode, audible only to him. When the earpiece is out, the speaker is as loud as a human speaking at room volume. It can produce both voice and other sounds, offering a few beeps and boops to signal that it needs attention or that the mode has changed.


2. Cameo phone

I think I have to make up a name for this device, and “cameo phone” seems to fit. This small, hand-sized, bi-fold device has one camera on the outside and one on the inside of the recto, and a display screen on the inside of the verso. It folds along its long edge, unlike the old clamshell phones. It has smartphone capabilities and communicates wirelessly with the internet. Theodore occasionally slides his finger left to right across the wooden cover, so it has some touch-gesture sensitivity. A stripe around the outside edge of the cameo can glow red to act as a visual signal to get its user’s attention. This is quite useful when the cameo is folded up and sitting on a nightstand, for instance.

Theodore uses Samantha almost exclusively through the earpiece and cameo phone, and it is this that makes OS1 a wearable system.

3. A beauty-mark camera

Only present for the surrogate sex scene, this small wireless (are we at the point when we can stop specifying that?) camera affixes to the skin and has the appearance of a beauty mark.

4. (Unseen) microphones

Whether in the cameo phone, the desktop screen, or ubiquitously throughout the environment, OS1 can hear Theodore speak wherever he is over the course of the film.

5. Desktop screen

Theodore only uses a large monitor for OS1 on his desktop a few times. It is simply another access point as far as OS1 is concerned. Really, there’s nothing remarkable about this screen. It is notable that there’s no keyboard. All input is provided by either voice, camera, or a touch gesture on the cameo.


If those are the components of the interface, they provide the medium for her 3.5 capabilities.

Her capabilities

1. Voice interface

Users can speak to OS1 in fully-natural language, as if speaking to another person. OS1 speaks back with fully-human spoken articulation. Theodore’s older OS had a voice interface, but because of its lack of artificial intelligence driving it, the interactions were limited to constrained commands like, “Read email.”

2. Computer vision

Samantha can process what she sees through the camera lens of the cameo perfectly. She recognizes distinct objects, people, and gestures at the physical and pragmatic level. I don’t think we ever see things from Samantha’s perspective, but we do have a few quick close-ups of the camera lens.

3. Artificial Intelligence

The most salient aspect of the interface is that OS1 is a fully realized “Strong” artificial intelligence.

It would be tempting to try to get to some painfully-crafted definition of what counts as either an artificial intelligence or sentience, but in this case we don’t really need a tight definition to help suss out whether or not Samantha is one. That’s the central conceit of the film, and the evidence is just overwhelming.

  • She has a human command of language.
  • She’s fully versed in the nuances of human emotion (and Theodore has a glut of them to engage).
  • She has emotions and can fairly be described as emotional. She has a sexual drive.
  • She has existential crises and a rich theory of mind. At one point she dreamily asks Theodore, “What’s it like to be alive in that room right now?” as if she were a philosophical teen idly chatting with her boyfriend over the phone.
  • She commits lies of omission in hiding uncomfortable truths.
  • She changes over time. She solves problems. She learns. She creates.
  • She has a sense of humor. When Theodore tells her early on to “read email” in the weird Computerese (my name for that 1970s dialect of English spoken only between humans and machines) grammar he had been using with his old operating system, Samantha jokingly adopts a robotic voice and replies, “OK. I will read the email for Theodore Twombly,” and gets a good laugh out of him before he apologizes.

Pedants will have some fun discussing whether this is apt but I’m moving forward with it as a given. She’s sentient.

3.5 An “operating system”

This item only counts as half a thing because Theodore uses it as an operating system maaaybe twice in the film. Really, this categorization is a MacGuffin to explain why he gets it in the first place, but it has little to no other bearing on the film.


What’s missing?

Notably missing in OS1 is a face or any other visual anthropomorphic aspect. There’s no Samantha-faced Clippy. Notice that she’s very carefully disembodied. Jonze does not spend screen time close up on her camera lens, like Kubrick did with HAL’s unblinking eye. Had he done so, it would have given us the impression that she’s somewhere behind that eye. But she’s not. Even in the prop design, he makes sure the camera lens itself looks unremarkable, neutral, and unexpressive, and never gets a lingering focus.

Her “organs,” like the cameo and earpiece, don’t even connect together physically at all. Speaking as she does through the earpiece means she doesn’t exist as a voice from some speaker mounted to the wall. She exists across various displays and devices, in some psychological ether between them. For us, she’s a voiceover existing everywhere at once. For Theodore, she’s just a delightful voice in his head. An angel—or possibly a ghost—borne unto him.

This disembodiment (both the design and the cinematic treatment) frees Theodore and the audience from the negative associations of many other sci-fi intelligences, robots, and unfortunate experiments in commercial artificial intelligence that got trapped in the muck of the uncanny valley. One of the main reasons designers have to be careful about invoking the anthropomorphic sense in users is because it will raise expectations of human capabilities that modern technology just can’t match. But OS1 can match and exceed those expectations, since it’s an AI in a work of fiction, so Jonze is free of that constraint.

And having no visual to accompany a human-like voice allows us to imagine our own “perfect” embodiment for the voice. Relying on the imagination to provide the visuals makes the emotional engagement greater, as it does with our crushes on radio personalities, or the unseen monster in a horror movie. Movies can never create as fulfilling an image for an individual audience member as their imagination can. Theodore could picture whatever he wanted, if he wanted to, to accompany Samantha’s computer-generated voice. Unfortunately for the audience, Jonze cast Scarlett Johansson, a popular actress whose image we instantly recall upon hearing her husky, sultry voice, so the imagined perfection is more difficult for us.

This is just the components and capabilities. Tomorrow we’ll look at some of the key interactions with OS1.

The answer does not program


Logan’s life is changed when he surrenders an ankh found on a particular runner. Instead of being asked to identify, the central computer merely stays quiet a long while as it scans the object. Then its lights shut off, and Logan has a discussion with the computer he has never had before.

The computer asks him to “approach and identify.” The computer gives him, by name, explicit instructions to sit facing the screen. Lights below the seat illuminate. He identifies in this chair by positioning his lifeclock in a recess in the chair’s arm, and a light above him illuminates. Then a conversation ensues between Logan and the computer.


The computer communicates through a combination of voice and screen, on which it shows blue text and occasional illustrative shapes. The computer’s voice is emotionless and soothing. For the most part it speaks in complete sentences. In contrast, Logan’s responses are stilted and constrained, saying “negative” instead of “no,” and prefacing all questions with the word, “Question,” as in, “Question: What is it?”

On the one hand it’s linguistically sophisticated

Speech recognition and generation would not appear in a commercially released product until four years after the release of Logan’s Run, but there is an odd inconsistency here even for those unfamiliar with the actual constraints of the technology. The computer is sophisticated enough to generate speech with demonstrative pronouns, referring to the picture of the ankh as “this object” and the label as “that is the name of the object.” It can even communicate with pragmatic meaning. When Logan says,

“Question: Nobody reached renewal,”

…and receives nothing but silence, the computer doesn’t object to the fact that his question is not a question. It infers the most reasonable interpretation, as we see when Logan is cut off during his following objection by the computer’s saying,…

“The question has been answered.”

Despite these linguistic sophistications, it cannot parse anything but the most awkwardly structured inputs? Sadly, this is just an introduction to the silliness that is this interface.

Logan undergoes procedure “03-033,” in which his lifeclock is artificially set to blinking. He is then instructed to become a runner himself and discover where “sanctuary” is. After his adventure outside performing the assignment he was forced to accept, he is brought in as a prisoner. The computer traps him in a ring of bars, demanding to know the location of Sanctuary. Logan reports (correctly) that Sanctuary doesn’t exist.


On the other hand, it explodes

This freaks the computer out. Seriously. Now, the crazy thing is that the computer actually understands Logan’s answer, because it comments on it. It says, “Unacceptable. The answer does not program [sic].” That means that it’s not a data-type error, as if it got the wrong kind of input. No, the thing heard what Logan was saying. It’s just unsatisfied, and the programmer decided that the best response to dissatisfaction was to engage the heretofore unused red and green pixels in the display, randomly delete letters from the text—and explode. That’s right. He decided that in addition to the Dissatisfaction() subroutine calling the FreakOut(Seriously) subroutine, the FreakOut(Seriously) subroutine in its turn calls Explode(Yourself), Release(The Prisoner), and the WhileYoureAtItRuinAllStructuralIntegrityoftheSurroundingArchitecture() subroutines.
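
For the fellow nerds, that control flow can be sketched out. This is a tongue-in-cheek reconstruction using invented function names; the film only shows us the catastrophic output, not the code.

```python
# Hypothetical reconstruction of the central computer's interrogation logic.
# Function and event names are invented for the joke; only the behavior
# (accept the answer, or melt down spectacularly) is from the film.

def answer_programs(answer: str) -> bool:
    """The computer only 'accepts' answers that fit its program."""
    return answer != "Sanctuary does not exist."

def interrogate(answer: str) -> list[str]:
    """Return the list of actions the computer takes in response."""
    events = []
    if answer_programs(answer):
        events.append("continue interrogation")
    else:
        # Dissatisfaction() calls FreakOut(Seriously), which in turn calls...
        events.append("engage unused red and green pixels")
        events.append("randomly delete letters from the text")
        events.append("explode")
        events.append("release the prisoner")
        events.append("ruin all structural integrity of the surrounding architecture")
    return events

print(interrogate("Sanctuary does not exist."))
```

Note that a truthful answer triggers the meltdown branch, which is exactly the design failure being mocked: dissatisfaction with an input should never be wired to self-destruction.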


Frankly, if this is the kind of coding that this entire society was built upon, this whole social collapse thing was less deep commentary and really just a matter of computer Darwinism catching up with them.


Siege Support


When Section 9 launches an assault on the Puppet Master’s headquarters, Department Chief Aramaki watches via a portable computer. It looks and behaves much like a modern laptop, with a heavy base that connects via a hinge to a thin screen. This shows him a live video feed.

The scan lines on the feed tell us that the cameras are diegetic, something Aramaki is watching, rather than the "camera" of the movie we as the audience are watching. These cameras must be placed in many places around the compound: behind the helicopter, following the Puppet Master, to the far right of the Puppet Master, and even floating far overhead. That seems a bit far-fetched until you remember that there are agents all around the compound, and Section 9 has the resources to outfit all of them with small cameras. Even the overhead camera could be an unnoticed helicopter equipped with a high-powered telephoto lens. So it stretches believability, but not beyond the bounds of possibility. My main question is: given these cameras, who is doing the live editing? Aramaki’s view switches dramatically between these views as he’s watching, with no apparent interaction.

A clue comes from his singular interaction with the system. When a helicopter lands in the lawn of the building, Aramaki says, "Begin recording," and a blinking REC overlay appears in the upper left and a timecode overlay appears in the lower right. If you look at the first shot in the scene, there is a soldier next to him hunched over a different terminal, so we can presume that he’s the hands-on guy, executing orders that Aramaki calls out. That same tech can be doing the live camera switching and editing to show Aramaki the feed that’s most important and relevant.


That idea makes even more sense knowing that Aramaki is a chief, and his station warrants spending money on an ever-present human technician.

Sometimes, as in this case, the human is the best interface.

Section No9’s crappy security


The heavily-mulleted Togusa is heading to a company car when he sees two suspicious cars in the parking basement. After sizing them up for a moment, he gets into his car and without doing anything else, says,

"Security, whose official vehicles are parked in the basement garage?"

It seems the cabin of the car is equipped to continuously monitor for sound, and either an agent from security is always waiting and listening at the other end, or, by addressing a particular department by name, a voice recognition system instantly routes him to an operator in that department, who is able to respond immediately:

"They belong to Chief Nakamura of the treaties bureau and a Dr. Willis."

"Give me the video record of their entering the building."

In response, a panel automatically flips out of the dashboard to reveal a monitor, where he can watch the security footage. He watches it and says,

"Replay, infrared view"

After watching the replay, he says,

"Send me the pressure sensor records for basement garage spaces B-7 and 8."

The screen then does several things at once. It shows a login screen, for which his username is already supplied. He mentally supplies his password. Next a menu appears on a green background with five options: NET-WORK [sic], OPTICAL, PRESSURE, THERMO, and SOUND. "PRESSURE" highlights twice with two beeps. Then after a screen-green 3D rendering of Section 9 headquarters builds, the camera zooms around the building and through floorplans to the parking lot to focus on the spaces, labeled appropriately. Togusa watches as pea green bars on radial dials bounce clockwise, twice, with a few seconds between.

The login

Sci-fi logins often fail to include even basic multifactor authentication, and at first it appears that this screen has only two parts: a username and password. But given that Togusa connects to the system first vocally and then mentally, it’s likely that one of these other channels supplies a third factor of authentication. Also, it seems odd to have him supply a set of characters as the mental input. Requiring Togusa to think a certain concept might make more sense, like a mental captcha.
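
The three factors implied by the scene can be sketched as a simple gate: the voiceprint from the spoken request, the pre-filled username, and the mentally supplied secret. This is a minimal illustration with invented names and data, not anything the film specifies.

```python
# A minimal sketch of three-factor authentication as implied by the scene:
# something you are (voiceprint), something you claim (username), and
# something you know/think (the mental token). All names are hypothetical.

def authenticate(voiceprint_ok: bool, username: str, mental_token: str,
                 known_users: dict[str, str]) -> bool:
    """Grant access only when all three factors check out."""
    if not voiceprint_ok:                 # factor 1: biometric voice match
        return False
    expected = known_users.get(username)  # factor 2: claimed identity
    if expected is None:
        return False
    return mental_token == expected       # factor 3: the mental secret

users = {"togusa": "mental-captcha-concept"}
print(authenticate(True, "togusa", "mental-captcha-concept", users))
```

The point of the gate structure is that any single compromised channel (a voice recording, a stolen username, a coerced thought) is insufficient on its own.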

The zoom

Given that seconds can make a life-or-death difference and that the stakes at Section 9 are so high, the time that the system spends zooming a camera around the building all the way to the locations is a waste. It should be faster. It does provide context to the information, but it doesn’t have to be distributed in time. Remove the meaningless and unlabeled dial in the lower right to gain real estate, and replace it with a small version of the map that highlights the area of detail. Since Togusa requested this information, the system should jump here immediately and let him zoom out for more detail only if he wants it or if the system wants him to see suspect information.

The radial graphs

The radial graphs imply some maximum to the data, and that Nakamura’s contingent hits some 75% of it. What happens if the pressure exceeds 37 ticks? Does the floor break? (If so, it should have sent off structural warning alarms at the gate independently of the security question.) But presumably Section 9 is made of stronger stuff than this, and so a different style of diagram is called for. Perhaps remove the dial entirely and just leave the parking spot labels and the weight. Admittedly, the radial dial is unusual and might be there for consistency with other, unseen parts of the system.

Moreover, Togusa is interested in several things: how the data has changed over time, when it surpassed an expected maximum, and by how much. This diagram only addresses one of them, and requires Togusa to notice and remember it himself. A better diagram would trace this pressure reading across time, highlighting the moments when it passed a threshold. (This parallels the issues of medical monitoring highlighted in the book, Chapter 12, Medicine.)
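
The suggested redesign boils down to a small piece of logic: scan the pressure readings over time and flag the moments they first exceed the threshold, so the display can highlight them instead of asking Togusa to notice them himself. A sketch, with invented data and an invented threshold:

```python
# Sketch of the proposed display logic: trace a sensor reading over time and
# flag the moments it rises above a threshold. The readings and the 90 kg
# threshold are hypothetical, invented for illustration.

def threshold_crossings(readings: list[float], threshold: float) -> list[int]:
    """Return the indices where the reading first rises above the threshold."""
    crossings = []
    for i in range(1, len(readings)):
        if readings[i - 1] <= threshold < readings[i]:
            crossings.append(i)
    return crossings

# Hypothetical pressure samples for parking space B-7, in kg:
pressure = [70.0, 72.0, 95.0, 94.0, 71.0, 98.0]
print(threshold_crossings(pressure, 90.0))  # [2, 5]
```

A time-series view with these crossing points highlighted answers all three of Togusa’s questions at once: when the reading changed, when it passed the expected maximum, and by how much.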

SECURITY_redo (a redesigned comp)

Even better would be to show this data over time alongside or overlaid with any of the other feeds, like a video feed, such that Togusa doesn’t have to make correlations between different feeds in his head. (I’d have added it to the comp but didn’t have source video from the movie.)

The ultimately crappy Section No9 security system

Aside from all these details of the interface and interaction design, I have to marvel at the broader failings of the system. This is meant to be the same bleeding-edge bureau that creates cyborgs and transfers consciousnesses between them? If the security system is recording all of this information, why is it not being analyzed continuously, automatically? We can presume that object recognition is common in the world from a later scene in which a spider tank is able to track Kusanagi. So as the security system was humming along, recording everything, it should have also been analyzing that data, noting the discrepancy between the number of people it counted in any of the video feeds, the number of people it counted passing through the door, and the unusual weight of these "two" people. It should have sent a warning to security at the gate of the garage, not relied on the happenstance of Togusa’s hunch and good timing.
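
The continuous check being described is not exotic. It amounts to cross-referencing independent sensor streams and raising a warning on any mismatch. A sketch, with invented numbers (the average mass, tolerance, and readings are all hypothetical):

```python
# Sketch of the automatic analysis the security system should have run:
# compare the video head count, the door sensor count, and the pressure
# sensor weight, and warn on any discrepancy. All constants are invented.

AVG_HUMAN_KG = 80.0   # hypothetical expected mass per person
TOLERANCE_KG = 40.0   # slack for luggage, gear, heavier cyborg bodies, etc.

def check_discrepancy(video_count: int, door_count: int,
                      total_weight_kg: float) -> list[str]:
    """Return a list of warnings; empty means the streams agree."""
    warnings = []
    if video_count != door_count:
        warnings.append(f"count mismatch: video saw {video_count}, "
                        f"door logged {door_count}")
    expected = door_count * AVG_HUMAN_KG
    if abs(total_weight_kg - expected) > TOLERANCE_KG:
        warnings.append(f"weight anomaly: expected ~{expected} kg, "
                        f"sensors read {total_weight_kg} kg")
    return warnings

# "Two" visitors whose pressure readings are far too heavy:
print(check_discrepancy(video_count=2, door_count=2, total_weight_kg=400.0))
```

Running a check like this on every entry event would have flagged Nakamura’s party at the gate, with no hunch required.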

This points to a larger problem that Hollywood has with technology being part of its stories. It needs heroes to be smart and heroic, and having them simply respond to warnings passed along by a smart system can seem pointedly unheroic. But as technology gets smarter and more agentive, these kinds of discrepancies are going to break believability and get embarrassing.

Alphy


Barbarella’s onboard conversational computer is named Alphy. He speaks with a polite male voice with a British accent and a slight lisp. The voice seems to be omnidirectional, but confined to the cockpit of the space rocket.

Goals

Alphy’s primary duties are threefold: first, to obey Barbarella’s commands, such as waking her up before their approach to Tau Ceti; second, to navigate on autopilot; third, to report statuses, such as describing the chances of a safe landing or the atmospheric analysis that assures Barbarella she will be able to breathe.

Display


Whenever Alphy is speaking, a display panel at the back of the cockpit moves. The panel stretches from the floor to the ceiling and is about a meter wide. The front of the panel consists of a large array of small rectangular sheets of metal, each of which is attached on one side to one of the horizontal bars that stretch across the panel. As Alphy talks, individual rectangles lift and fall in a stochastic pattern, adding a small metallic clacking to the voice output. A flat yellow light fills the space behind the panel, and the randomly rising and falling rectangles reveal it in mesmerizing patterns.

The light behind Alphy’s panel can change. As Barbarella is voicing her grave concerns to Dianthus, Alphy turns red. He also flashes red and green during the magnetic disturbances that crash her ship on Tau Ceti. We also see him turn a number of colors after the crash on Tau Ceti, indicating the damage that has been done to him.

In the case of the conversation with Dianthus, there is no real alert state to speak of, so it is conceivable that these colors act something like a mood ring, reflecting Barbarella’s affective state.

Language

Like many language-capable sci-fi computer systems of the era, Alphy speaks in a stilted fashion. He is given to “computery” turns of phrases, brusque imperatives, and odd, unsocialized responses. For example, when Barbarella wishes Alphy a good night before she goes to sleep, he replies, “Confirmed.”

Barbarella even speaks this way when addressing Alphy sometimes, such as when they risk crashing into Tau Ceti and she must activate the terrascrew and travel underground. As she is piloting manually, she says things like, “Full operational power on all subterranean systems,” “45 degree ascent,” and “Quarter to half for surfacing.”


Nonetheless, Alphy understands Barbarella completely whenever she speaks to him, so the stilted language seems much more like a convention than a limitation.

Anthropomorphism

Despite his lack of linguistic sophistication, he shows a surprising bit of audio anthropomorphism. When suffering through the magnetic disturbances, his voice gets distressed. Alphy’s tone also gets audibly stressed when he reveals that the Catchman has performed repairs “in reverse,” in each case underscoring the seriousness of the situation. When the space rocket crashes on Tau Ceti, Alphy asks groggily, “Where are we?” We know this is only affectation because within a few seconds, he is back up to full functioning, reporting happily that they have landed: “Planet 16 in the system Tau Ceti. Air density oh-point-oh-51. Cool weather with the possibility of stormy precipitations.” Alphy does not otherwise exhibit emotion. He doesn’t speak of his emotions or use emotional language. This convention, too, serves to match Barbarella’s mood and make her more comfortable.

Agency

Alphy’s sensors seem to be for time, communication technology, self-diagnostics, and for analyzing the immediate environment around the ship. He has actuators to speak, change his display, supply nutrition to Barbarella, and focus power to different systems around the ship, including the emergency systems. He can detect problems, such as the “magnetic disturbance”, and can respond, but has no authority to initiate action. He can only obey Barbarella, as we hear in the following exchange.

Barbarella: What’s happening?
Alphy: Magnetic disturbances.
Barbarella: Magnetic disturbances?…Emergency systems!
Alphy: All emergency systems will now operate.

His real function?

All told, Alphy is very limited in what he can do. His primary functions are reading aloud data that could be dials on a dashboard and flipping switches so Barbarella won’t have to take her hands off of…well, switches…in emergency situations. The bits of anthropomorphic cues he provides to her through the display and language confirm that his primary goal is social, to make Barbarella’s adventurous trips through space not feel so lonely.

MedPod

Early in the film, when Shaw sees the MedPod for the first time, she comments to Vickers that, “They only made a dozen of these.” As she caresses its interface in awe, a panel extends as the pod instructs her to “Please verbally state the nature of your injury.”


The MedPod is a device for automated, generalized surgical procedures, operable by the patient him- (or her-, kinda, see below) self.

When in the film Shaw realizes that she’s carrying an alien organism in her womb, she breaks free from crewmembers who want to contain her, and makes a staggering beeline for the MedPod.

Once there, she reaches for the extended touchscreen and presses the red EMERGENCY button. Audio output from the pod confirms her selection, “Emergency procedure initiated. Please verbally state the nature of your injury.” Shaw shouts, “I need cesarean!” The machine informs her verbally that, “Error. This MedPod is calibrated for male patients only. It does not offer the procedure you have requested. Please seek medical assistance else–”


I’ll pause the action here to address this. What sensors and actuators are this gender-specific? Why can’t it offer gender-neutral alternatives? Sure, some procedures might need anatomical knowledge of particularly gendered organs (say…emergency circumcision?), but given…

  • the massive amounts of biological similarity between the sexes
  • the needs for any medical device to deal with a high degree of biological variability in its subjects anyway
  • the fact that most procedures are gender-neutral

…this is a ridiculous interface plot device. If Dr. Shaw can issue a few simple system commands that work around this limitation (as she does in this very scene), then the machine could have just done without the stupid error message. (Yes, we get that it’s a mystery why Vickers would have her MedPod calibrated to a man, but really, that’s a throwaway clue.) Gender-specific procedures can’t take up so much room in memory that it was simpler to cut the potential lives it could save in half. You know, rather than outfit it with another hard drive.

Aside from the pointless “tension-building” wrong-gender plot point, there are still interface issues with this step. Why does she need to press the emergency button in the first place? The pod has a voice interface. Why can’t she just shout “Emergency!” or even better, “Help me!” Isn’t that more suited to an emergency situation? Why is a menu of procedures the default main screen? Shouldn’t it be a prompt to speak, and have the menu there for mute people or if silence is called for? And shouldn’t it provide a type-ahead control rather than a multi-facet selection list? OK, back to the action.

Desperate, Shaw presses a button that grants her manual control. She states “Surgery abdominal, penetrating injuries. Foreign body. Initiate.” The screen confirms these selections amongst options on screen. (They read “DIAGNOS, THERAP, SURGICAL, MED REC, SYS/MECH, and EMERGENCY”)

The pod then swings open saying, “Surgical procedure begins,” and tilting itself for easy access. Shaw injects herself with anesthetic and steps into the pod, which seals around her and returns to a horizontal position.

Why does Shaw need to speak in this stilted speech? In a panicked or medical emergency situation, proper computer syntax should be the last thing on a user’s mind. Let the patient shout the information however they need to, like “I’ve got an alien in my abdomen! I need it to be surgically removed now!” We know from the Sonic chapter that the use of natural language triggers an anthropomorphic sense in the user, which imposes some other design constraints to convey the system’s limitations, but in this case, the emergency trumps the needs of affordance subtleties.

Once inside the pod, a transparent display on the inside states that, “EMERGENCY PROC INITIATED.” Shaw makes some touch selections, which runs a diagnostic scan along the length of her body. The terrifying results display for her to see, with the alien body differentiated in magenta to contrast her own tissue, displayed in cyan.


Shaw shouts, “Get it out!!” It says, “Initiating anesthetics” before spraying her abdomen with a bile-yellow local anesthetic. It then says, “Commence surgical procedure.” (A note for the grammar nerds here: Wouldn’t you expect a machine to maintain a single part of speech for consistency? The first, “Initiating…” is a gerund, while the second, “Commence,” is an imperative.) Then, using lasers, the MedPod cuts through tissue until it reaches the foreign body. Given that the lasers can cut organic matter, and that the xenomorph has acid for blood, you have to hand it to the precision of this device. One slip could have burned a hole right through her spine. Fortunately it has a feather-light touch. Reaching in with a speculum-like device, it removes the squid-like alien in its amniotic sac.

OK. Here I have to return to the whole “ManPod” thing. Wouldn’t a scan have shown that this was, in fact, a woman? Why wouldn’t it stop the procedure if it really couldn’t handle working on the fairer sex? Should it have paused to have her sign away insurance rights? Could it really mistake her womb for a stomach? Wouldn’t it, believing her to be a man, presume the whole womb to be a foreign body and try to perform a hysterectomy rather than a delicate caesarian? ManPod, indeed.


After removing the alien, it waits around 10 seconds, showing it to her and letting her yank its umbilical cord, before she presses a few controls. The MedPod seals her up again with staples and opens the cover to let her sit up.

She gets off the table, rushes to the side of the MedPod, and places all five fingertips of her right hand on it, quickly twisting her hand clockwise. The interface changes to a red warning screen labeled “DECONTAMINATE.” She taps this to confirm and shouts, “Come on!” (Her vocal instruction does not feel like a formal part of the procedure and the machine does not respond differently.) To decontaminate, the pod seals up and a white mist fills the space.

OK. Since this is a MedPod, and it has something called a decontamination procedure, shouldn’t it actually test to see whether the decontamination worked? The user here has enacted emergency decontamination procedures, so it’s safe to say that this is a plague-level contagion. That doesn’t say to me: spray it with a can of Raid and hope for the best. It says, “Kill it with fire.” We just saw, 10 seconds ago, that the MedPod can do a detailed, alien-detecting scan of its contents, so why on LV-223 would it not check to see if the kill-it-now-for-God’s-sake procedure had actually worked, and warn everyone within earshot if it hadn’t? Because someone needs to take additional measures to protect the ship, and take them, stat. But no, the MedPod tucks the contamination under a white misty blanket, smiles, waves, and says, “OK, that’s taken care of! Thank you! Good day! Move along!”

For all of the goofiness that is this device, I’ll commend it for two things. The first is for pushing forward the notion of automated medicine. Yes, in this day and age, it’s kind of terrifying to imagine devices handling something as vital as life-saving surgery, but people in the future will likely find it terrifying that today we’d rather trust an error-prone, bull-in-a-china-shop human with the task. And, after all, the characters have entrusted their lives to an android while they were in hypersleep for two years, so clearly that’s a thing they do.

Second, the gestural control to access the decontamination is well considered. It is a large gesture, requiring no great finesse on the part of the operator to find and press a sequence of keys, and one that is easy to execute quickly and in a panic. I’m not sure what percentage of procedures need the back-up safety of a kill-everything-inside mode, but presuming one is ever needed, this is a fine gesture to initiate that procedure. In fact, it could have been used in other interfaces around the ship, as we’ll see later with the escape pod interface.

I have the sense that in the original script, Shaw had to do what only a few very bad-ass people have been willing to do: perform life-saving surgery on themselves in the direst circumstances. Yes, it’s a bit of a stretch since she’s primarily an anthropologist and astronomer in the story, but give a girl a scalpel, hardcore anesthetics, and an alien embryo, and I’m sure she’ll figure out what to do. But pushing this bad-assery off to an automated device, loaded with constraints, ruins the moment and changes the scene from potentially awesome to just awful.

Given the inexplicable man-only settings, requiring a desperate patient to recall FORTRAN-esque syntax for spoken instructions, and the failure to provide any feedback about the destruction of an extinction-level pathogen, we must admit that the MedPod belongs squarely in the realm of goofy narrative technology and nowhere near the real world as a model of good interaction design.

VP language instructor

During David’s two-year journey, part of his time is spent “deconstructing dozens of ancient languages to their roots.” We see one scene illustrating the pronunciation part of this study early in the film. As he’s eating, he sees a volumetric display of a cuboid appear high in the air opposite his seat at the table. The cuboid is filled with a cyan glow in which a “talking head” instructor takes up most of the space. On the left is a column of five still images of other artificially intelligent instructors. Each image has two vertical sliders on its left, but the meaning of these sliders is not made clear. In the upper right is an obscure diagram that looks a little like a constellation, with some inscrutable text below it.

On the right side of the cuboid projection, we see some other information in pinks, blues, and cyans. It appears to be text, bar charts, and line graphs. This information is not immediately usable to the learner, so perhaps it is material about the entire course, for when the lessons are paused: notes about progress toward a learning goal, advice for further study, or next steps. Presuming this is a general-purpose interface rather than a custom one made just for David, this information could be the student’s progress notes for an attending human instructor.

We enter the scene with the AI saying, “…Whilst this manner of articulation is attested in Indo-European descendants as a purely paralinguistic form, it is phonemic in the ancestral form dating back five millennia or more. Now let’s attempt Schleicher’s Fable. Repeat after me.”

In the lower part of the image is a waveform of the current phrase being studied. In the lower right is the written text of the phrase, in what looks like a simplified phonetic alphabet. As the instructor speaks the fable, each word is highlighted in the written form. When he is done, he prompts David to repeat it.

akʷunsəz dadkta,
hwælna nahast
təm ghεrmha
vagam ugεntha,

After David repeats it, the AI instructor smiles, nods, and looks pleased. He praises David’s pronunciation as “Perfect.”

This call and response seems on par with modern language-learning software, even down to “listening” and providing feedback. Learning and studying a language is ultimately far more complicated than this, but it would be difficult to show much more of it in such a short scene. The main novelty that this interface brings to the notion of language acquisition seems to be the volumetric display and the hint of real-time progress notes.

Bridge VP: Hello

The main interface on the bridge is the volumetric projection display. This device takes up the center of the bridge and is the size of a long billiards table. It serves multiple purposes for the crew. The first is to display the “Golden Record” message.

Hello, Deadly World

Prometheus broadcasts a message to LV-223 in advance of its arrival that appears to be something like the Voyager Golden Record recording. David checks on this message frequently in transit to see if there is a response. To do so he stands in a semicircular recess and turns a knob on the waist-high control panel there counterclockwise. It’s reasonable that the potentiometer controls the volume of the display, though we don’t see this explicitly.

The computer responds to being turned on by voice, wishing him “good morning” by name, confirming that it is still transmitting the message (reinforced by a Big Label in the content itself), and informing David that there has been “NO RESPONSE LOGGED.”

The content of the display is lovely. Lines of glowing yellow scaffolding define a cube, roughly a meter on a side. Within is a cacophony of anthropological, encyclopedic information as video and images, including…

  • Masterworks of art such as da Vinci’s Vitruvian Man and Lisa Gherardini, wife of Francesco del Giocondo (better known as the Mona Lisa)
  • Portrait of Abraham Lincoln
  • Chemical structures (I could not identify the exact chemicals)
  • Portrait of a young Beethoven
  • The periodic table of elements
  • Mathematical equations
  • A language frequency chart
  • The language-learning A.I. seen elsewhere in the film
  • Musical notation
  • Sonograms of fetuses in utero
  • Video of tribal makeup
  • Video of Noh theater in Japan
  • Video of a young prodigy playing violin in a field

These squares of translucent information are dispersed within the cube semi-randomly. Some display on a sagittal plane. Some on a coronal plane. (None on a transverse plane.) Though to an observer they are greatly overlapped, they do not seem to intersect. Some of these squares remain in place but most slide around along a y- or x-axis, a few even changing direction, in semi-random paths. Two are seen to rotate around their y-axis, and the periodic table is seen to divide into layered columns.

This display quickly imparts to the audience that the broadcast message is complex and rich, telling the vicious, vicious aliens all they need to know about humans prior to their potential contact. But looking at it from a real-world perspective, the shifting information only provides a sense of the things described, which could work only if you already had knowledge of the fundamentals, which the unknown aliens certainly do not. A better way to build up understanding was seen in the movie Contact, where one begins with simple abstract concepts that build on one another to eventually form a coherent communication.

In contrast, this display is one of ADHD-like distraction and “sense” rather than one of communication and understanding. But there’s a clue that this isn’t meant to be the actual content at all. Looking closely at the VP, we see that the language-learning module David uses is present. Look in the image below for the cyan rectangle on the left of the big yellow cube.

Since we know from seeing David use it elsewhere in the film that that module is interactive, and this VP display does not appear to be, we can infer that this is not the actual content being broadcast. This is more like cover art for an album, meant only to give a sense of the actual content to the humans on the “sender” side of the message. In this simple example of apologetics, we see that the complexity that worked for audiences would work equally well for users.

Later in the film we see David turn the display off. Though his hand is offscreen, the click we hear and the movement of his shoulder seem to indicate that he uses the same knob with which he turned it on. After he does so, the display decays in layers, in a manner common to the movie’s “yellow scaffold” VPs, as a hum slows to a halt.