Otto’s Manual Control

WallE-Otto02

WallE-Otto06

When it refuses to give up authority, the Captain wrests control of the Axiom from the artificial intelligence autopilot, Otto. Otto’s body is the helm wheel of the ship, and it fights back against the Captain: Otto wants to fulfill BNL’s orders to keep the ship in space. As they fight, the Captain dislodges a cover panel for Otto’s off-switch. When the Captain sees the switch, he immediately realizes that he can regain control of the ship by deactivating Otto. After the Captain fights his way to the switch and flips it, Otto deactivates and reverts to a manual control interface for the ship.

Next to the on/off switch, a panel of buttons shows Otto’s current status; half of its lights go dark when the Captain switches over to manual. The dimmed icons indicate which systems are now offline. The Captain then effortlessly returns the ship to its proper flight path with a quick turn of the controls.

One interesting note is the similarity between Otto’s stalk control keypad and the keypad on the Eve Pod. Both have a circular button in the middle, with blue buttons in a semi-radial pattern around it. Given the Eve Pod’s interface, this should also be a series of start-up buttons or option commands. The main difference here is that these are all lit, whereas the Eve Pod’s buttons were dim until pressed. Since every other interface on the Axiom glows when in use, it looks like all of Otto’s commands and autopilot options are active when the Captain deactivates him.

A hint of practicality…

The panel is in a place that is accessible and would be easily located by service crew or trained operators. Given that the Axiom is a spaceship, the systems on board are probably heavily regulated and redundant. However, the panel isn’t easily visible thanks to specific decisions by BNL. This system makes sense for a company that doesn’t think people need or want to deal with this kind of thing on their own.

Once the panel is open, the operator has a clear view of which systems are on, and which are off. The major downside to this keypad (like the Eve Pod) is that the coding of the information is obscure. These cryptic buttons would only be understandable for a highly trained operator/programmer/setup technician for the system. Given the current state of the Axiom, unless the crew were to check the autopilot manual, it is likely that no one on board the ship knows what those buttons mean anymore.

WallE-Otto03

Thankfully, the most important button is in clear English. We know English is important to BNL because it is the language of the ship and the language seen being taught to the new children on board. Anyone who had an issue with the autopilot system and could locate the button would know which button press would turn Otto off (as we then see the Captain immediately do).

Considering that Buy-N-Large’s mission is to create robots to fill humans’ every need, saving them from every tedious or unenjoyable job (garbage collecting, long-distance transportation, complex integrated systems, sports), it was both interesting and reassuring to see that there are manual overrides on their mission-critical equipment.

…But hidden

The opposite situation could get a little tricky, though. If the ship were in manual mode, with the door closed and no qualified or trained personnel on the bridge, it would be incredibly difficult for anyone present to figure out how to physically switch the ship back to autopilot. A hidden emergency control is useless in an emergency.

Hopefully, considering the heavy use of voice recognition on the ship, there is a way for the ship to recognize an emergency situation and quickly take control. We know this is possible because we see the ship completely take over and run through a Code Green procedure to analyze whether Eve had actually returned a plant from Earth. In that instance, the ship only required a short, confused grunt from the Captain to initiate a very complex procedure.

Security isn’t an issue here because we already know that the Axiom screens visitors to the bridge (the Gatekeeper). By tracking who is entering the bridge using the Axiom’s current systems, the ship would know who is and isn’t allowed to activate certain commands. The Gatekeeper would either already have this information coded in, or be able to activate it when he allowed people into the bridge.

For very critical emergencies, a system that could recognize a spoken ‘off’ command from senior staff or trained technicians on the Axiom would be ideal.
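As a sketch of what that spoken ‘off’ command might look like in practice: a minimal, assumption-laden model in which the command is honored only when the speaker is on an authorized roster. The crew IDs, roles, and matching logic below are all hypothetical; a real system would use voice biometrics rather than a plain lookup.

```python
# Hypothetical set of roles allowed to disengage the autopilot.
AUTHORIZED_ROLES = {"captain", "first_officer", "systems_technician"}

# Hypothetical roster mapping speaker IDs to roles.
CREW_ROLES = {
    "mccrea": "captain",
    "forthright": "first_officer",
}

def handle_voice_command(speaker_id: str, utterance: str) -> str:
    """Disengage the autopilot only for authorized senior staff."""
    # Tokenize loosely so "Otto, off!" still matches "off".
    tokens = {t.strip(",.!?") for t in utterance.lower().split()}
    if "off" in tokens or "emergency" in tokens:
        if CREW_ROLES.get(speaker_id) in AUTHORIZED_ROLES:
            return "autopilot_disengaged"
        return "denied: speaker not authorized for override"
    return "ignored"
```

Under these assumptions, “Otto, off!” from the Captain disengages the autopilot, while the same phrase from an unlisted passenger is refused.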

Anti-interaction as Standard Operating Procedure

WallE-Otto05

The hidden door and the obscure, hard-wired off button continue the mission of Buy-N-Large: to encourage citizens to give up control for comfort, and to make it difficult to undo that decision. Seeing as the citizens are more than happy to give up that control at first, it looks like a profitable assumption for Buy-N-Large, at least in the short term. In the long term we can take comfort that the human spirit, aided by an adorable little robot, will prevail.

So for BNL’s goals, this interface is fairly well designed. But for the real world, you would want some sort of graceful degradation that would enable qualified people to easily take control in an emergency. Even the most highly trained technicians appreciate clearly labeled controls and overrides so that they can deal directly with the problem at hand rather than fighting with the interface.

The answer does not program

LogansRun224

Logan’s life is changed when he surrenders an ankh found on a particular runner. Instead of being asked to identify, the central computer merely stays quiet for a long while as it scans the object. Then its lights shut off, and Logan has a conversation with the computer unlike any he has had before.

The computer asks him to “approach and identify.” The computer gives him, by name, explicit instructions to sit facing the screen. Lights below the seat illuminate. He identifies in this chair by positioning his lifeclock in a recess in the chair’s arm, and a light above him illuminates. Then a conversation ensues between Logan and the computer.

LogansRun113

The computer communicates through a combination of voice and screen, on which it shows blue text and occasional illustrative shapes. The computer’s voice is emotionless and soothing. For the most part it speaks in complete sentences. In contrast, Logan’s responses are stilted and constrained, saying “negative” instead of “no,” and prefacing all questions with the word, “Question,” as in, “Question: What is it?”

On the one hand it’s linguistically sophisticated

Speech recognition and generation would not see a commercially released product until four years after the release of Logan’s Run, but there is an odd inconsistency here even for those unfamiliar with the actual constraints of the technology. The computer is sophisticated enough to generate speech with demonstrative pronouns, referring to the picture of the ankh as “this object” and the label with “that is the name of the object.” It can even communicate with pragmatic meaning. When Logan says,

“Question: Nobody reached renewal,”

…and receives nothing but silence, the computer doesn’t object to the fact that his question is not a question. It infers the most reasonable interpretation, as we see when Logan is cut off during his following objection by the computer’s saying,…

“The question has been answered.”

Despite these linguistic sophistications, it cannot parse anything but the most awkwardly structured inputs? Sadly, this is just an introduction to the silliness that is this interface.

Logan undergoes procedure “03-033,” in which his lifeclock is artificially set to blinking. He is then instructed to become a runner himself and discover where “sanctuary” is. After his adventure outside performing the assignment he was forced to accept, he is brought in as a prisoner. The computer traps him in a ring of bars, demanding to know the location of sanctuary. Logan reports (correctly) that Sanctuary doesn’t exist.

LogansRun206

LogansRun205

hologram

On the other hand, it explodes

This freaks the computer out. Seriously. Now, the crazy thing is that the computer actually understands Logan’s answer, because it comments on it. It says, “Unacceptable. The answer does not program [sic].” That means it’s not a data-type error, as if it got the wrong kind of input. No, the thing heard what Logan was saying. It’s just unsatisfied, and the programmer decided that the best response to dissatisfaction was to engage the heretofore unused red and green pixels in the display, randomly delete letters from the text, and explode. That’s right. He decided that in addition to the Dissatisfaction() subroutine calling the FreakOut(Seriously) subroutine, the FreakOut(Seriously) subroutine in its turn calls the Explode(Yourself), Release(ThePrisoner), and WhileYoureAtItRuinAllStructuralIntegrityOfTheSurroundingArchitecture() subroutines.

LogansRun221

Frankly, if this is the kind of coding that this entire society was built upon, the whole social collapse thing was less deep commentary and more just a matter of computer Darwinism catching up with them.

LogansRun223

LogansRun225

LogansRun226

LogansRun227

Gravity (?) Scan

FifthE-UFT001

The first bit of human technology we see belongs to the Federation of Territories, as a spaceship engages the planet-sized object that is the Ultimate Evil. The interfaces are the screen-based systems that bridge crew use to scan the object and report back to General Staedert so he can make tactical decisions.

FifthE-UFT006

We see very few input mechanisms and very little interaction with the system. The screen includes a large image on the right-hand side of the display and smaller, detailed bits of information on the left. Inputs include:

  • Rows of backlit modal pushbuttons adjacent to red LEDs
  • A few red 7-segment displays
  • An underlit trackball
  • A keyboard
  • An analog, underlit, grease-pencil plotting board.
    (Nine Inch Nails fans may be pleased to find that initialism written near the top.)

The operator of the first of these screens touches one of the pushbuttons to no results. He then scrolls the trackball downward, which scrolls the green text in the middle-left part of the screen as the graphics in the main section resolve from wireframes to photographic renderings of three stars, three planets, and the evil planet in the foreground, in blue.

FifthE-UFT008 FifthE-UFT014 FifthE-UFT010

The main challenge with the system is what the heck is being visualized? Professor Pacoli says in the beginning of the film that, “When the three planets are in eclipse, the black hole, like a door, is open.” This must refer to an unusual, trinary star system. But if that’s the case, the perspective is all wrong on screen.

Plus, the main sphere in the foreground is the evil planet, but it resolves to a blue-tinted circle before the evil planet actually appears. So is it a measure of the gravity and event horizon of the “black hole”? Then why are the others photo-real?

Where is the big red gas giant planet that the ship is currently orbiting? And where is the ship? As we know from racing game interfaces and first-person shooters, having an avatar representation of yourself is useful for orientation, and that’s missing.

And finally, why does the operator need to memorize what “Code 487” is? That places a burden on his memory that would be better spent on more valuable things. This is something of a throwaway interface, meant only to show the high-tech nature of the Federated Territories and to give the movie’s editor an alternate view to cut to, but even still it presents a lot of problems.

Alien Astrometrics

Prometheus-222

When David is exploring the ancient alien navigation interfaces, he surveys a panel and presses three buttons whose bulbous tops have the appearance of soft-boiled eggs. As he presses them in order, electronic clucks echo in the cavern. After a beat, one of the eggs flickers and glows from an internal light. He presses this one, and a seat glides out for a user to sit in. He does so, and a glowing-pollen volumetric projection of several aliens appears. The one before David takes a seat in the chair, which repositions itself in the semicircular indentation of the large circular table.

Prometheus-204

The material selection of the egg buttons could not be a better example of affordance. The part that’s meant to be touched looks soft and pliable, smooth and cool to the touch. The part that’s not meant to be touched looks rough, like immovable stone. At a glance, it’s clear what is interactive and what isn’t. Among the egg buttons there are some variations in orientation, size, and even surface texture. It is the bumpy-surfaced one that draws David’s attention to touch first that ultimately activates the seat.

The VP alien picks up and blows a few notes on a simple flute, which brings that seat’s interface fully to life. The eggs glow green and emit green glowing plasma arcs between certain of them. David is able to place his hand in the path of one of the arcs and change its shape as the plasma steers around him, but it does not appear to affect the display. The arcs themselves appear to be a status display, but not a control.

After the alien manipulates these controls for a bit, a massive, cyan volumetric projection appears and fills the chamber. It depicts a fluid node network mapped to the outside of a sphere. Other node network clouds appear floating everywhere in the room, along with objects that look like old Bohr models of atoms, but with galaxies at their centers. Within the sphere, three-dimensional astronomical charts appear. Additionally, huge rings appear and surround the main sphere, rotating slowly. After a few inputs from the VP alien at the interface, the whole display reconfigures, putting one of the small orbiting Bohr models at the center, illuminating emerald green lines that point to it and a faint sphere of emerald green lines that surround it. The total effect of this display is beautiful and spectacular, even for David, who is an unfeeling android.

Prometheus-226

At the center of the display, David observes that the green-highlighted sphere is the planet Earth. He reaches out towards it, and it falls to his hand. When it is within reach, he plucks it from its orbit, at which point the green highlights disappear with an electronic glitch sound. He marvels at it for a bit, turning it in his hands, looking at Africa. Then after he opens his hands, the VP Earth gently returns to its rightful position in the display, where it is once again highlighted with emerald, volumetric graphics.

Prometheus-229

Finally, in a blinding flash, the display suddenly quits, leaving David back in the darkness of the abandoned room, with the exception of the small Earth display, which is floating over a small pyramid-shaped protrusion before flickering away.

After the Earth fades, David notices the stasis chambers around the outside of the room. He realizes that what he has just seen (and interacted with) is a memory from one of the aliens still present.

Prometheus-238

Prometheus-239

Hilarious and insightful YouTube poster CinemaSins asks in the video “Everything Wrong with Prometheus in 4 Minutes or Less,” “How the f*ck is he holding the memory of a hologram?” A fair question, but not an unanswerable one. The critique only stands if you presume that the display must be passive and must play uninterrupted like a television show or movie. But it certainly doesn’t have to be that way.

Imagine if this is less like a YouTube video and more like playback through a game engine, a sort of holodeck StarCraft. Of course it’s entirely possible to pause the action in the middle of playback and investigate parts of the display, before pressing play again and letting it resume its course. But that playback is a live system. It would be possible to run it afresh from the paused point with changed parameters as well. This sort of interrupt-and-play model would be a fantastic learning tool for sensemaking of 4D information. Want to pause playback of the signing of the Magna Carta and pick up the document to read it? That’s a “learning moment” and one that a system should take advantage of. I’d be surprised if—once such a display were possible—it wouldn’t be the norm.

Starmetheus

The only thing I see that’s missing in the scene is a clear signal about the different state of the playback:

  1. As it happened
  2. Paused for investigation
  3. Playing with new parameters (if it was actually available)

David moves from 1 to 2, but the only change of state is the appearance and disappearance of the green highlight VP graphics around the Earth. This is a signal that could easily be missed, and one that wasn’t present at the start of the display. Better would be some global change, like a shift in the overall color palette, to indicate the different state. A separate signal might compare As It Happened with the results of Playing with New Parameters, but that’s a speculative requirement of a speculative technology. Best to put it down for now and return to what this interface is: one of the most rich, lovely, and promising examples of sensemaking interactions seen on screen. (See what I did there?)
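That global state signal can be modeled as a tiny state machine in which every transition re-tints the entire display, so the change of mode cannot be missed. The state names and tint values below are invented for illustration, not taken from the film.

```python
from enum import Enum

class PlaybackState(Enum):
    AS_IT_HAPPENED = "cyan"             # the original palette
    PAUSED_FOR_INVESTIGATION = "amber"  # unmistakable global shift
    NEW_PARAMETERS = "green"            # if re-running with changes

class VolumetricPlayback:
    def __init__(self) -> None:
        self.state = PlaybackState.AS_IT_HAPPENED
        self.global_tint = self.state.value

    def transition(self, new_state: PlaybackState) -> None:
        self.state = new_state
        # The design point: the *whole display* changes tint, not just
        # one highlighted object, so the state change is unambiguous.
        self.global_tint = new_state.value

playback = VolumetricPlayback()
playback.transition(PlaybackState.PAUSED_FOR_INVESTIGATION)  # David plucks Earth
```

The choice of a global tint over a local highlight is the whole argument: a signal attached to one object disappears when that object moves, but a mode change applied to everything is legible from anywhere in the room.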

For more about how VP might be more than a passive playback, see the lesson in Chapter 4 of Make It So, page 84, VP Systems Should Interpret, Not Just Report.

Gene Sequence Comparison

Genetic tester

Prometheus-178

Shaw sits at this device, speaking instructions aloud as she peers through a microscope. We do not see whether the instructions are being manually handled by Ford or whether the system is responding to her voice input. When Ford issues the command “compare it to the gene sample,” the nearby screen displays DNA gel electrophoresis results for the exploding alien sample and a human sample. When Ford says, “overlay,” the results slide on top of each other. A few beats after screen text and a computerized voice inform them (twice) that the system is PROCESSING, it confirms a DNA MATCH with more screen text read by the same computerized voice.


Prometheus-181

Playback box

When Halloway visits Shaw in her quarters, she uses a small, translucent glass cuboid to show him the comparison. To activate it, she drags a finger quickly across the long, wide surface. That surface illuminates with the data from the genetic tester, including the animation. The emerald green colors of the original have been replaced by cyan, the red has been replaced by magenta, and some of the contextualizing GUI has been omitted, but it is otherwise the same graphic. Beyond this activation gesture, no other interactivity is seen with this device.

Prometheus-197

There’s a bit of a mismatch between the gesture she uses for input and the output on the screen. She swipes, but the information fades up. It would be a tighter mapping for Shaw if a swipe on its surface resulted in the information’s sliding in at the same speed, or at least fading up as if she were operating a brightness control. If the fade-up was the best transition narratively, another gesture such as a tap might be a better fit for input. Still, the iOS standard for unlocking is to swipe right, so this decision might have been made on the basis of the audience’s familiarity with that interaction.

Alien head sterilizer

Prometheus-158

In the lab, Shaw and Ford investigate the alien head from the complex. They first seek to sterilize it. Though we don’t see how the process is initiated, after it is, a “dumb waiter” raises the head from some storage space to a glass-walled chamber where it is sprayed with some white mist. A screen displays an animation of waves passing along the surface of the head.

When the mist clears, a screen reads “SAMPLE STERILE. NO CONTAGION PRESENT,” which Ford dutifully repeats even though Shaw has a screen that says the exact same thing. Obscure metrics and graphs fill the edges of the screen.

Prometheus-156

It might have been tempting for the designers to simply supply the analysis, i.e., “no contagion,” but by providing the data from which the analysis derives, the scientists can check and verify the data for themselves, so the combination is well considered.

There are several problems with this sterilization system.

The text of the analysis reads well and unambiguously, but the graphics would be more informative if they indicated their values within clear ranges. As they are, they push the burden of understanding the context of the values onto the scientists’ memories. If this were a very commonplace activity, that might not be much of an issue.

More important are the problems with the industrial design. First, this device seems surprisingly head-sized. Wouldn’t a crewmember be the most likely thing they’d have to sterilize? Shouldn’t it be bigger? But moreover, this device is in the wrong place on the ship. If the head were infected with an alien pathogen, sterilizing it here is already too late. The pathogen has already spread everywhere between the airlock, the storage space, and the hands of whoever had to move it between them. It would be better if possibly unsterile material could be loaded into a decontamination system outside the ship, and only once sterilized pass through to the interior.

MedPod

Early in the film, when Shaw sees the MedPod for the first time, she comments to Vickers that, “They only made a dozen of these.” As she caresses its interface in awe, a panel extends as the pod instructs her to “Please verbally state the nature of your injury.”

Prometheus-087

The MedPod is a device for automated, generalized surgical procedures, operable by the patient him- (or her-, kinda, see below) self.

When in the film Shaw realizes that she’s carrying an alien organism in her womb, she breaks free from crewmembers who want to contain her, and makes a staggering beeline for the MedPod.

Once there, she reaches for the extended touchscreen and presses the red EMERGENCY button. Audio output from the pod confirms her selection, “Emergency procedure initiated. Please verbally state the nature of your injury.” Shaw shouts, “I need cesarean!” The machine informs her verbally that, “Error. This MedPod is calibrated for male patients only. It does not offer the procedure you have requested. Please seek medical assistance else–”

Prometheus-237

I’ll pause the action here to address this. What sensors and actuators are this gender-specific? Why can’t it offer gender-neutral alternatives? Sure, some procedures might need anatomical knowledge of particularly gendered organs (say…emergency circumcision?), but given…

  • the massive amounts of biological similarity between the sexes
  • the needs for any medical device to deal with a high degree of biological variability in its subjects anyway
  • the fact that most procedures are gender-neutral

…this is a ridiculous interface plot device. If Dr. Shaw can issue a few simple system commands that work around this limitation (as she does in this very scene), then the machine could have just done without the stupid error message. (Yes, we get that it’s a mystery why Vickers would have her MedPod calibrated to a man, but really, that’s a throwaway clue.) Gender-specific procedures can’t take up so much room in memory that it was simpler to cut the potential lives it could save in half. You know, rather than outfit it with another hard drive.

Aside from the pointless “tension-building” wrong-gender plot point, there are still interface issues with this step. Why does she need to press the emergency button in the first place? The pod has a voice interface. Why can’t she just shout “Emergency!” or even better, “Help me!” Isn’t that more suited to an emergency situation? Why is a menu of procedures the default main screen? Shouldn’t it be a prompt to speak, and have the menu there for mute people or if silence is called for? And shouldn’t it provide a type-ahead control rather than a multi-facet selection list? OK, back to the action.

Desperate, Shaw presses a button that grants her manual control. She states, “Surgery abdominal, penetrating injuries. Foreign body. Initiate.” The screen confirms these selections amongst the options listed. (They read “DIAGNOS, THERAP, SURGICAL, MED REC, SYS/MECH, and EMERGENCY.”)

The pod then swings open saying, “Surgical procedure begins,” and tilting itself for easy access. Shaw injects herself with anesthetic and steps into the pod, which seals around her and returns to a horizontal position.

Why does Shaw need to speak in this stilted speech? In a panicked or medical emergency situation, proper computer syntax should be the last thing on a user’s mind. Let the patient shout the information however they need to, like “I’ve got an alien in my abdomen! I need it to be surgically removed now!” We know from the Sonic chapter that the use of natural language triggers an anthropomorphic sense in the user, which imposes some other design constraints to convey the system’s limitations, but in this case, the emergency trumps the needs of affordance subtleties.
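The voice-first alternative argued for here can be sketched as keyword matching over free-form speech rather than rigid command syntax. The procedure names and keyword sets below are invented for illustration; a real system would use proper natural-language understanding, but even this crude overlap scoring handles a panicked shout better than FORTRAN-esque syntax does.

```python
from typing import Optional

# Hypothetical procedures, each tagged with keywords a patient might shout.
PROCEDURES = {
    "surgery_foreign_body_removal": {"alien", "foreign", "remove", "out", "surgery"},
    "wound_treatment": {"bleeding", "cut", "wound", "laceration"},
    "cardiac_emergency": {"heart", "chest", "cardiac"},
}

def match_procedure(utterance: str) -> Optional[str]:
    """Pick the procedure whose keywords best overlap free-form speech."""
    tokens = {t.strip(",.!?'’") for t in utterance.lower().split()}
    best, best_score = None, 0
    for name, keywords in PROCEDURES.items():
        score = len(tokens & keywords)
        if score > best_score:
            best, best_score = name, score
    return best  # None if nothing matched at all

match_procedure("I've got an alien in my abdomen! Get it out!")
```

Under these assumptions, Shaw’s natural outburst would land on the foreign-body surgery without her ever having to say “Initiate.”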

Once inside the pod, a transparent display on the inside states that, “EMERGENCY PROC INITIATED.” Shaw makes some touch selections, which runs a diagnostic scan along the length of her body. The terrifying results display for her to see, with the alien body differentiated in magenta to contrast her own tissue, displayed in cyan.

Prometheus-254

Prometheus-260

Shaw shouts, “Get it out!!” It says, “Initiating anesthetics” before spraying her abdomen with a bile-yellow local anesthetic. It then says, “Commence surgical procedure.” (A note for the grammar nerds here: Wouldn’t you expect a machine to maintain a single grammatical form for consistency? The first, “Initiating…,” is a present participle, while the second, “Commence,” is an imperative.) Then, using lasers, the MedPod cuts through tissue until it reaches the foreign body. Given that the lasers can cut organic matter, and that the xenomorph has acid for blood, you have to hand it to the precision of this device. One slip could have burned a hole right through her spine. Fortunately it has a feather-light touch. Reaching in with a speculum-like device, it removes the squid-like alien in its amniotic sac.

OK. Here I have to return to the whole “ManPod” thing. Wouldn’t a scan have shown that this was, in fact, a woman? Why wouldn’t it stop the procedure if it really couldn’t handle working on the fairer sex? Should it have paused to have her sign away insurance rights? Could it really mistake her womb for a stomach? Wouldn’t it, believing her to be a man, presume the whole womb to be a foreign body and try to perform a hysterectomy rather than a delicate caesarean? ManPod, indeed.

Prometheus-265

After removing the alien, it waits around 10 seconds, showing it to her and letting her yank its umbilical cord, before she presses a few controls. The MedPod seals her up again with staples and opens the cover to let her sit up.

She gets off the table, rushes to the side of the MedPod, and places all five fingertips of her right hand on it, quickly twisting her hand clockwise. The interface changes to a red warning screen labeled “DECONTAMINATE.” She taps this to confirm and shouts, “Come on!” (Her vocal instruction does not feel like a formal part of the procedure and the machine does not respond differently.) To decontaminate, the pod seals up and a white mist fills the space.

OK. Since this is a MedPod, and it has something called a decontamination procedure, shouldn’t it actually test to see whether the decontamination worked? The user here has enacted emergency decontamination procedures, so it’s safe to say that this is a plague-level contagion. That doesn’t say to me: Spray it with a can of Raid and hope for the best. It says, “Kill it with fire.” We just saw, 10 seconds ago, that the MedPod can do a detailed, alien-detecting scan of its contents, so why on LV-223 would it not check to see if the kill-it-now-for-God’s-sake procedure had actually worked, and warn everyone within earshot that it hadn’t? Because someone needs to take additional measures to protect the ship, and take them, stat. But no, MedPod tucks the contamination under a white misty blanket, smiles, waves, and says, “OK, that’s taken care of! Thank you! Good day! Move along!”
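The missing step is a simple loop: decontaminate, re-run the scan the MedPod already demonstrated it can perform, and escalate if contagion persists. Everything below is a stand-in sketch under invented assumptions, not the film’s system.

```python
def decontaminate(chamber: dict) -> None:
    # Stand-in: each cycle reduces, but may not eliminate, the contagion.
    chamber["contagion_level"] = max(0, chamber["contagion_level"] - 1)

def scan_clear(chamber: dict) -> bool:
    # Stands in for re-running the detailed internal scan the MedPod
    # demonstrated seconds earlier.
    return chamber["contagion_level"] == 0

def decontaminate_and_verify(chamber: dict, max_cycles: int = 3) -> str:
    """Repeat decontamination until a scan verifies it worked, or escalate."""
    for _ in range(max_cycles):
        decontaminate(chamber)
        if scan_clear(chamber):
            return "STERILE: VERIFIED BY SCAN"
    # Don't tuck it under a misty blanket: tell the crew it failed.
    return "ALERT: CONTAGION PERSISTS. NOTIFY CREW."
```

The point is not the fake numbers but the structure: verification closes the loop between the action and its claimed result, which is exactly what the film’s MedPod skips.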

For all of the goofiness that is this device, I’ll commend it for two things. The first is for pushing forward the notion of automated medicine. Yes, in this day and age, it’s kind of terrifying to imagine devices handling something as vital as life-saving surgery, but people in the future will likely find it terrifying that today we’d rather entrust an error-prone, bull-in-a-china-shop human with the task. And, after all, the characters have entrusted their lives to an android while they were in hypersleep for two years, so clearly that’s a thing they do.

Second, the gestural control to access the decontamination is well considered. It is a large gesture that requires no great finesse and no hunting for a sequence of keys, and one that is easy to execute quickly and in a panic. I’m not sure what percentage of procedures need the back-up safety of a kill-everything-inside mode, but presuming one is ever needed, this is a fine gesture to initiate that procedure. In fact, it could have been used in other interfaces around the ship, as we’ll see later with the escape pod interface.

I have the sense that in the original script, Shaw had to do what only a few very bad-ass people have been willing to do: perform life-saving surgery on themselves in the direst circumstances. Yes, it’s a bit of a stretch since she’s primarily an anthropologist and astronomer in the story, but give a girl a scalpel, hardcore anesthetics, and an alien embryo, and I’m sure she’ll figure out what to do. But pushing this bad-assery off to an automated device, loaded with constraints, ruins the moment and changes the scene from potentially awesome to just awful.

Given the inexplicable man-only settings, requiring a desperate patient to recall FORTRAN-esque syntax for spoken instructions, and the failure to provide any feedback about the destruction of an extinction-level pathogen, we must admit that the MedPod belongs squarely in the realm of goofy narrative technology and nowhere near the real world as a model of good interaction design.

Remote Monitoring

The Prometheus spacesuits feature an outward-facing camera on the chest, which broadcasts its feed back to the ship, where the video is overlaid with the current wearer’s name and inscrutable iconographic and numerical data along the periphery. The suit also has biometric sensors, continuously sending its wearer’s vital signs back to the ship. On the monitoring screen, a waveform in the lower left appears similar to an EKG, but is far too smooth and regular to be an actual one. It is more like an EKG icon. We only see it change shape or position within its bounding box once, turning to a flat line to register that Weyland has died. This supports its being iconic rather than literal.

Prometheus-109

In addition to the iconic EKG, a red selection rectangle regularly moves across a list in the upper left-hand corner of the monitor screens. One of three cyan numbers near the top occasionally changes. Otherwise the peripheral data on these monitoring screens does not change throughout the movie, making it difficult to evaluate its suitability.

The monitoring panel on Prometheus features five of the monitoring feeds gathered on a single translucent screen. One of these feeds has the main focus, being placed in the center and scaled to double the size of the other monitors. How the monitoring crewperson selects which feed to act as the main focus is not apparent.

Prometheus-110

Vickers has a large, curved, wall-sized display on which she’s able to view David’s feed at one point, so these video feeds can be piped to anyone with authority.

Prometheus-203

David is able to turn off the suit camera at one point, which Vickers back on the Prometheus is unable to override. This does not make sense for a standard-issue suit supplied by Weyland, but it is conceivable that David has a special suit or has modified the one provided to him during transit to LV-223.

VP language instructor

During David’s two-year journey, part of his time is spent “deconstructing dozens of ancient languages to their roots.” We see one scene early in the film illustrating the pronunciation part of this study. As he’s eating, he sees a volumetric display of a cuboid appear high in the air opposite his seat at the table. The cuboid is filled with a cyan glow in which a “talking head” instructor takes up most of the space. On the left is a column of five still images of other artificially intelligent instructors. Each image has two vertical sliders on the left, but the meaning of these sliders is not made clear. In the upper right is an obscure diagram that looks a little like a constellation, with some inscrutable text below it.

On the right side of the cuboid projection, we see some other information in pinks, blues, and cyans. This information appears to be text, bar charts, and line graphs. It is not immediately usable to the learner, so perhaps it is material about the entire course, for when the lessons are paused: notes about progress towards a learning goal, advice for further study, or next steps. Presuming this is a general-purpose interface rather than a custom one made just for David, this information could be the student’s progress notes for an attending human instructor.

We enter the scene with the AI saying, “…Whilst this manner of articulation is attested in Indo-European descendants as a purely paralinguistic form, it is phonemic in the ancestral form dating back five millennia or more. Now let’s attempt Schleicher’s Fable. Repeat after me.”

In the lower part of the image is a waveform of the current phrase being studied. In the lower right is the written text of the phrase, in what looks like a simplified phonetic alphabet. As the instructor speaks the fable, each word is highlighted in the written form. When he is done, he prompts David to repeat it.

akʷunsəz dadkta,
hwælna nahast
təm ghεrmha
vagam ugεntha,

After David repeats it, the AI instructor smiles, nods, and looks pleased. He praises David’s pronunciation as “Perfect.”

This call and response seems par for modern methods of language learning software, even down to “listening” and providing feedback. Learning and studying a language is ultimately far more complicated than this, but it would be difficult to show much more of it in such a short scene. The main novelty that this interface brings to the notion of language acquisition seems to be the volumetric display and the hint of real-time progress notes.

HYPSP>S020

In addition to the biometric readout at the foot of the hypersleep chamber, David can also check on the humans in hypersleep via direct visual contact. Though the walls of these chambers are ordinarily opaque, by placing his hand on a corner, David can cause the walls to become translucent and the top to become transparent. This allows him to directly view the body to check on a hypersleeper’s appearance, while maintaining some privacy for the hypersleeper from other prying eyes in the area.

The topmost surface of the chamber also has a translucent interface displaying such information as the sleeper’s name, labeled CRYO SLEEP; something called DREAM STATE, which is numerical (Shaw’s is 0560-09797?); and a few other pieces of inscrutable data.

There are a number of problems with this interface. The translucent interface might be a good idea, because it would reduce the time spent shifting between looking at abstractions of data and looking at the subject. But the data shown on the interface is not clearly biometric. (A solid argument can be made that it would be better to swap the data found on this screen with the data seen on the HYP.SL screen at the head of the bed.) Since the data is not biometric, the overlay might actually occlude important outward signs, and is therefore a bit misplaced.

Additionally, since the chambers are situated with their feet towards the wall, the orientation of the typography has poor usability as well. For optimal reading in this portrait orientation, a viewer would have to go as far out of his way as possible in the space. This information should have been laid out along the landscape orientation of the pane, and moved to the edge so that an unhindered visual scan of the sleeper is possible.

I was at first confused about a feature of the chambers seen later in the movie, when David is surprised to find evidence that Meredith has exited her chamber and walked to her quarters, long ahead of the others. Why was he surprised? If you were designing a system for a caretaker, wouldn’t you want him to know when something as major as that occurs? Then I realized that Meredith outranks and is resentful of David, so it’s entirely likely that if she could, she would enjoy configuring her personal program to wake her ahead of the others and disable any notification that David would ordinarily receive.