Cyberspace: Navigation

Cyberspace is usually considered to be a 3D spatial representation of the Internet, an expansion of the successful 2D desktop metaphor. The representation of cyberspace used in books such as Neuromancer and Snow Crash, and by the film Hackers, released the same year as Johnny Mnemonic (1995), is an abstract cityscape in which buildings represent organisations or individual computers, and this is what we see in Johnny Mnemonic. How does Johnny navigate through this virtual city?

Gestures and words for flying

Once everything is connected up, Johnny starts his journey with an unfolding gesture. He then points both fingers forward. From his point of view, he is flying through cyberspace. He then holds up both hands to stop.

jm-31-navigation-animated

Both these gestures were commonly used in the prototype VR systems of 1995. They do however conflict with the more common gestures for manipulating objects in volumetric projections that are described in Make It So chapter 5. It will be interesting to see which set of gestures is eventually adopted, or whether they can co-exist.

Later we will see Johnny turn and bank by moving his hands independently.

jm-31-navigation-f

We also see him using voice commands, saying “hold it” to stop forward motion immediately. Later we see him stretch one arm out and bring it back, apparently reversing a recent move.

jm-31-navigation-e

In cyberpunk and related fiction, users fly everywhere in cyberspace, a literal interpretation of the spatial metaphor. This is also how users start out in our real-world MUD and MOO cyberspaces. After a while, travelling through all the intermediate locations between your start point and your destination gets tedious. MUDs and MOOs therefore allow teleporting, a direct jump to the desired location, and the cyberspace in Johnny Mnemonic has a similar capability.

Gestures for teleporting

Mid-sequence, Johnny wants to jump to the Beijing hotel where the upload took place. To do this, he uses a blue geometric shape at the lower left of his view that looks like a high-tech, floating tetrahedron. Johnny slowly spins this virtual object using repeated flicking gestures with his left hand, his ring and middle fingers held together.

jm-31-navigation-2-animated

It looks very similar to the gesture used on a current-day smartphone to flick through a photo album or set of application icon screens. And in this case, it causes a blue globe to float into view (see below).

Johnny grabs this globe and unfolds it into a fullscreen window, using the standard Hollywood two handed “spread” gesture described in Chapter 5 of Make It So.

jm-32-beijing-a-animated

The final world map fills the entire screen. Johnny uses his left hand to enter a number on a HUD style overlay keypad, then taps on the map to indicate China.

jm-32-beijing-c

jm-32-beijing-d

I interpret this as Johnny using the hotel phone number to specify his destination. It would not be unusual for there to be multiple hotels with the same name in a city such as Beijing, but the phone number should be unique. Since Johnny is currently in North America, he must also specify the international dialing code, or its 2021 equivalent, which he can do just by pointing. This is a well-designed user interface: it accepts multimodal input in any order, rather than forcing the user to enter the country code first.

Keyboards and similar physical devices often don’t translate well into virtual reality, because tactile feedback is non-existent. Even touch typists need the feel of a physical keyboard, in particular the slight concavity of the key tops and the orientation bumps on the F and J keys, to keep their fingers aligned. Here, though, there is just a small grid of virtual numbers that doesn’t require extended typing, so the problem is minimal. Overall this is a good design, allowing Johnny to type a precise number and simply point at a larger target.
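To make that order-independence concrete, here is a minimal sketch, assuming a hypothetical DestinationPicker (the name, methods, and digits are all invented for illustration, not anything from the film), of an input handler that accepts the keypad entry and the map tap in either order and only resolves the destination once it has both:

```python
# Minimal sketch of order-independent multimodal input. All names and digits
# are hypothetical; this is an illustration, not the film's system.

class DestinationPicker:
    def __init__(self):
        self.country_code = None   # supplied by pointing at the map
        self.phone_number = None   # supplied via the keypad overlay

    def point_at_map(self, country_code):
        self.country_code = country_code
        return self._try_resolve()

    def key_in_number(self, digits):
        self.phone_number = digits
        return self._try_resolve()

    def _try_resolve(self):
        # Act only once both modalities have contributed their part.
        if self.country_code and self.phone_number:
            return f"teleport:+{self.country_code}-{self.phone_number}"
        return None  # keep waiting for the other input


picker = DestinationPicker()
picker.key_in_number("5551234")        # number first (placeholder digits)...
print(picker.point_at_map("86"))       # ...then the country, or vice versa
```

The same pattern generalizes to any multimodal form: each modality fills in its slot, and the system acts only once the combined state is sufficient.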

Next

After he taps a location, the zoomrects indicate a transition into a new cyberspace, in this case, Beijing.

6-Screen TV

BttF_109

When Marty Jr. gets home, he approaches the large video display in the living room, which is displaying a cropped image of “The Gold of Their Bodies (Et l’or de Leur Corps)” by Paul Gauguin. He speaks to the screen, saying “Art off.” After a bit of static, the screen goes black. He then says, “OK, I want channels 18, 24, 63, 109, 87, and the Weather Channel.” As he says each, a sixth of the screen displays the live feed. The number for the channel appears in the upper left corner for a short while before fading. Marty Jr. then sits down to watch the six channels simultaneously.

Voice control. Perfect recognition. No modality. Spot on. A more capable system might dynamically resize the panes if he only wanted to watch two or three channels, but perhaps this is a cheaper model, appropriate to the McFly household.
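As a rough illustration of that suggestion, here is a small sketch, with invented function and parameter names, of how a display might tile itself for however many channels are requested rather than assuming a fixed six-pane grid:

```python
# Hypothetical sketch of a screen that tiles itself for N requested channels.

import math

def tile_screen(channels, width=1920, height=1080):
    """Return a pane rectangle (x, y, w, h) for each requested channel."""
    n = len(channels)
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    pane_w, pane_h = width // cols, height // rows
    panes = {}
    for i, channel in enumerate(channels):
        row, col = divmod(i, cols)
        panes[channel] = (col * pane_w, row * pane_h, pane_w, pane_h)
    return panes

# Marty Jr.'s request fills a 3x2 grid; asking for only two channels
# would split the screen in half instead.
print(tile_screen(["18", "24", "63", "109", "87", "Weather"]))
print(tile_screen(["18", "Weather"]))
```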

Iron Man HUD: A Breakdown

So this is going to take a few posts. You see, the next interface that appears in The Avengers is a video conference between Tony Stark in his Iron Man supersuit and his partner in romance and business, Pepper Potts, about switching Stark Tower from the electrical grid to their independent power source. Here’s what a still from the scene looks like.

Avengers-Iron-Man-Videoconferencing01

So on the surface of this scene, it’s a communications interface.

But that chat exists inside of an interface with a conceptual and interaction framework that has been laid down since the original Iron Man movie in 2008, and built upon with each sequel, one in 2010 and one in 2013. (With rumors aplenty for a fourth one…sometime.)

So to review the video chat, I first have to talk about the whole interface, and that has about 6 hours of prologue occurring across 4 years of cinema informing it. So let’s start, as I do with almost every interface, simply by describing it and its components.

Alphy

Barbarella-014

Barbarella’s onboard conversational computer is named Alphy. He speaks with a polite male voice with a British accent and a slight lisp. The voice seems to be omnidirectional, but confined to the cockpit of the space rocket.

Goals

Alphy’s primary duties are threefold. First, to obey Barbarella’s commands, such as waking her up before their approach to Tau Ceti. Second, to handle autopilot navigation. Third, to report statuses, such as describing the chances of a safe landing or the atmospheric analysis that assures Barbarella she will be able to breathe.

Display

Alphy

Whenever Alphy is speaking, a display panel at the back of the cockpit moves. The panel stretches from the floor to the ceiling and is about a meter wide. The front of the panel consists of a large array of small rectangular sheets of metal, each of which is attached on one side to one of the horizontal bars that stretch across the panel. As Alphy talks, individual rectangles lift and fall in a stochastic pattern, adding a small metallic clacking to the voice output. A flat yellow light fills the space behind the panel, and the randomly rising and falling rectangles reveal it in mesmerizing patterns.

The light behind Alphy’s panel can change. As Barbarella is voicing her grave concerns to Dianthus, Alphy turns red. He also flashes red and green during the magnetic disturbances that crash her ship on Tau Ceti. We also see him turn a number of colors after the crash on Tau Ceti, indicating the damage that has been done to him.

In the case of the conversation with Dianthus, there is no real alert state to speak of, so it is conceivable that these colors act something like a mood ring, reflecting Barbarella’’s affective state.

Language

Like many language-capable sci-fi computer systems of the era, Alphy speaks in a stilted fashion. He is given to “computery” turns of phrases, brusque imperatives, and odd, unsocialized responses. For example, when Barbarella wishes Alphy a good night before she goes to sleep, he replies, “Confirmed.”

Barbarella even speaks this way when addressing Alphy sometimes, such as when they risk crashing into Tau Ceti and she must activate the terrascrew and travel underground. As she is piloting manually, she says things like, “Full operational power on all subterranean systems,” “45 degree ascent,” and “Quarter to half for surfacing.”

Barbarella-Alphy-color-04

Nonetheless, Alphy understands Barbarella completely whenever she speaks to him, so the stilted language seems much more like a convention than a limitation.

Anthropomorphism

Despite his lack of linguistic sophistication, he shows a surprising bit of audio anthropomorphism. When suffering through the magnetic disturbances, his voice gets distressed. Alphy’s tone also gets audibly stressed when he reveals that the Catchman has performed repairs “in reverse,” in each case underscoring the seriousness of the situation. When the space rocket crashes on Tau Ceti, Alphy asks groggily, “Where are we?” We know this is only affectation because within a few seconds he is back up to full functioning, reporting happily that they have landed: “Planet 16 in the system Tau Ceti. Air density oh-point-oh-51. Cool weather with the possibility of stormy precipitations.” Alphy does not otherwise exhibit emotion. He doesn’t speak of his emotions or use emotional language. This convention, too, seems designed to match Barbarella’s mood and make her more comfortable.

Agency

Alphy’s sensors seem to be for time, communication technology, self-diagnostics, and for analyzing the immediate environment around the ship. He has actuators to speak, change his display, supply nutrition to Barbarella, and focus power to different systems around the ship, including the emergency systems. He can detect problems, such as the “magnetic disturbance”, and can respond, but has no authority to initiate action. He can only obey Barbarella, as we hear in the following exchange.

Barbarella: What’s happening?
Alphy: Magnetic disturbances.
Barbarella: Magnetic disturbances?…Emergency systems!
Alphy: All emergency systems will now operate.

His real function?

All told, Alphy is very limited in what he can do. His primary functions are reading aloud data that could be dials on a dashboard and flipping switches so Barbarella won’t have to take her hands off of…well, switches…in emergency situations. The bits of anthropomorphic cues he provides to her through the display and language confirm that his primary goal is social, to make Barbarella’s adventurous trips through space not feel so lonely.

Gene Sequence Comparison

Genetic tester

Prometheus-178

Shaw sits at this device speaking instructions aloud as she peers through a microscope. We do not see whether the instructions are being manually handled by Ford, or whether the system is responding to her voice input. When Ford issues the command “compare it to the gene sample,” the nearby screen displays DNA gel electrophoresis results for the exploding alien sample and a human sample. When Ford says, “overlay,” the results slide on top of each other. A few beats after screen text and a computerized voice inform them (twice) that the system is PROCESSING, it confirms a DNA MATCH, again with on-screen text read aloud by the same computerized voice.
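The film never shows how the DNA MATCH verdict is reached. As a toy illustration of the general idea only, here is a naive per-base identity comparison between two aligned sequences, with made-up sequences and an arbitrary threshold standing in for whatever the fictional system uses:

```python
# Toy illustration only: naive per-base identity between two aligned
# sequences, with made-up data and an arbitrary match threshold.

def percent_identity(seq_a, seq_b):
    """Fraction of aligned positions at which the two sequences agree."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return matches / len(seq_a)

human_sample = "ATGGCCATTGTAATGGGCCGC"   # placeholder bases
alien_sample = "ATGGCCATTGTAATGGGCCGC"

identity = percent_identity(human_sample, alien_sample)
print("DNA MATCH" if identity > 0.99 else f"NO MATCH ({identity:.1%})")
```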


Prometheus-181

Playback box

When Halloway visits Shaw in her quarters, she uses a small, translucent glass cuboid to show him the comparison. To activate it, she drags a finger quickly across the long, wide surface. That surface illuminates with the data from the genetic tester, including the animation. The emerald green colors of the original have been replaced by cyan, the red has been replaced by magenta, and some of the contextualizing GUI has been omitted, but it is otherwise the same graphic. Other than this activation gesture, no other interactivity is seen with this device.

Prometheus-197

There’s a bit of a mismatch between the gesture she uses for input and the output on the screen. She swipes, but the information fades up. It would be a tighter mapping for Shaw if a swipe on its surface resulted in the information sliding in at the same speed, or at least fading up as if she were operating a brightness control. If the fade-up was the best transition narratively, another gesture such as a tap might be a better fit for the input. Still, the iOS standard for unlocking is to swipe right, so this decision might have been made on the basis of the audience’s familiarity with that interaction.
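Here is a quick sketch of that tighter mapping, with invented names and dimensions: the content’s position is driven directly by the finger’s progress across the surface, so the output tracks the input instead of playing a canned fade:

```python
# Hypothetical direct-manipulation mapping: the content slides in exactly as
# far as the finger has travelled across the surface.

def content_offset(finger_x, surface_width, content_width):
    """Horizontal offset of the incoming content, given swipe progress."""
    progress = max(0.0, min(1.0, finger_x / surface_width))
    return (1.0 - progress) * content_width   # off-screen at 0, in place at 1

for x in (0, 60, 120):                        # finger positions mid-swipe
    print(content_offset(x, surface_width=120, content_width=300))
```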

MedPod

Early in the film, when Shaw sees the MedPod for the first time, she comments to Vickers that, “They only made a dozen of these.” As she caresses its interface in awe, a panel extends as the pod instructs her to “Please verbally state the nature of your injury.”

Prometheus-087

The MedPod is a device for automated, generalized surgical procedures, operable by the patient him- (or her-, kinda, see below) self.

When in the film Shaw realizes that she’s carrying an alien organism in her womb, she breaks free from crewmembers who want to contain her, and makes a staggering beeline for the MedPod.

Once there, she reaches for the extended touchscreen and presses the red EMERGENCY button. Audio output from the pod confirms her selection, “Emergency procedure initiated. Please verbally state the nature of your injury.” Shaw shouts, “I need cesarean!” The machine informs her verbally that, “Error. This MedPod is calibrated for male patients only. It does not offer the procedure you have requested. Please seek medical assistance else–”

Prometheus-237

I’ll pause the action here to address this. What sensors and actuators are this gender-specific? Why can’t it offer gender-neutral alternatives? Sure, some procedures might need anatomical knowledge of particularly gendered organs (say…emergency circumcision?), but given…

  • the massive amount of biological similarity between the sexes
  • the need for any medical device to deal with a high degree of biological variability in its subjects anyway
  • the fact that most procedures are gender-neutral

…this is a ridiculous interface plot device. If Dr. Shaw can issue a few simple system commands that work around this limitation (as she does in this very scene), then the machine could have just done without the stupid error message. (Yes, we get that it’s a mystery why Vickers would have her MedPod calibrated to a man, but really, that’s a throwaway clue.) Gender-specific procedures can’t take up so much room in memory that it was simpler to cut the potential lives it could save in half. You know, rather than outfit it with another hard drive.

Aside from the pointless “tension-building” wrong-gender plot point, there are still interface issues with this step. Why does she need to press the emergency button in the first place? The pod has a voice interface. Why can’t she just shout “Emergency!” or even better, “Help me!” Isn’t that more suited to an emergency situation? Why is a menu of procedures the default main screen? Shouldn’t it be a prompt to speak, and have the menu there for mute people or if silence is called for? And shouldn’t it provide a type-ahead control rather than a multi-facet selection list? OK, back to the action.

Desperate, Shaw presses a button that grants her manual control. She states, “Surgery abdominal, penetrating injuries. Foreign body. Initiate.” The screen confirms these selections among the on-screen options. (They read “DIAGNOS, THERAP, SURGICAL, MED REC, SYS/MECH, and EMERGENCY.”)

The pod then swings open saying, “Surgical procedure begins,” and tilting itself for easy access. Shaw injects herself with anesthetic and steps into the pod, which seals around her and returns to a horizontal position.

Why does Shaw need to speak in this stilted manner? In a panicked or medical emergency situation, proper computer syntax should be the last thing on a user’s mind. Let the patient shout the information however they need to, like “I’ve got an alien in my abdomen! I need it surgically removed now!” We know from the Sonic chapter that the use of natural language triggers an anthropomorphic sense in the user, which imposes some other design constraints to convey the system’s limitations, but in this case the emergency trumps the need for affordance subtleties.
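As a sketch of that alternative, the snippet below spots keywords in a free-form utterance and maps them onto the pod’s on-screen procedure categories; the keyword lists and function names are illustrative assumptions, not anything shown in the film:

```python
# Sketch of keyword spotting for free-form emergency speech. The procedure
# names come from the on-screen menu; the keyword lists are assumptions.

PROCEDURE_KEYWORDS = {
    "SURGICAL": ["surgery", "cut", "remove", "get it out", "cesarean", "caesarean"],
    "THERAP":   ["pain", "medicine", "drug", "anesthetic"],
    "DIAGNOS":  ["scan", "what's wrong", "check"],
}

def interpret_utterance(utterance):
    """Return the best-matching procedure for whatever the patient shouts."""
    text = utterance.lower()
    scores = {
        procedure: sum(keyword in text for keyword in keywords)
        for procedure, keywords in PROCEDURE_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "EMERGENCY"   # safe default

print(interpret_utterance("I've got an alien in my abdomen! Get it out! I need surgery!"))
# -> SURGICAL
```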

Once inside the pod, a transparent display on the inside states, “EMERGENCY PROC INITIATED.” Shaw makes some touch selections, which run a diagnostic scan along the length of her body. The terrifying results are displayed for her to see, with the alien body differentiated in magenta to contrast with her own tissue, displayed in cyan.

Prometheus-254

Prometheus-260

Shaw shouts, “Get it out!!” It says, “Initiating anesthetics” before spraying her abdomen with a bile-yellow local anesthetic. It then says, “Commence surgical procedure.” (A note for the grammar nerds here: wouldn’t you expect a machine to maintain a single grammatical form for consistency? The first, “Initiating…,” is a present participle, while the second, “Commence,” is an imperative.) Then, using lasers, the MedPod cuts through tissue until it reaches the foreign body. Given that the lasers can cut organic matter, and that the xenomorph has acid for blood, you have to hand it to the precision of this device. One slip could have burned a hole right through her spine. Fortunately it has a feather-light touch. Reaching in with a speculum-like device, it removes the squid-like alien in its amniotic sac.

OK. Here I have to return to the whole “ManPod” thing. Wouldn’t a scan have shown that this was, in fact, a woman? Why wouldn’t it stop the procedure if it really couldn’t handle working on the fairer sex? Should it have paused to have her sign away insurance rights? Could it really mistake her womb for a stomach? Wouldn’t it, believing her to be a man, presume the whole womb to be a foreign body and try to perform a hysterectomy rather than a delicate caesarian? ManPod, indeed.

Prometheus-265

After removing the alien, the pod waits around 10 seconds, showing the creature to her and letting her yank its umbilical cord, before she presses a few controls. The MedPod seals her up again with staples and opens the cover to let her sit up.

She gets off the table, rushes to the side of the MedPod, and places all five fingertips of her right hand on it, quickly twisting her hand clockwise. The interface changes to a red warning screen labeled “DECONTAMINATE.” She taps this to confirm and shouts, “Come on!” (Her vocal instruction does not feel like a formal part of the procedure and the machine does not respond differently.) To decontaminate, the pod seals up and a white mist fills the space.

OK. Since this is a MedPod, and it has something called a decontamination procedure, shouldn’t it actually test to see whether the decontamination worked? The user here has enacted emergency decontamination procedures, so it’s safe to say that this is a plague-level contagion. That doesn’t say to me: spray it with a can of Raid and hope for the best. It says, “Kill it with fire.” We just saw, 10 seconds ago, that the MedPod can do a detailed, alien-detecting scan of its contents, so why on LV-223 would it not check to see whether the kill-it-now-for-God’s-sake procedure had actually worked, and warn everyone within earshot if it hadn’t? Because someone needs to take additional measures to protect the ship, and take them, stat. But no, the MedPod tucks the contamination under a white misty blanket, smiles, waves, and says, “OK, that’s taken care of! Thank you! Good day! Move along!”

For all of the goofiness that is this device, I’ll commend it for two things. The first is for pushing the notion of automated medicine forward. Yes, in this day and age, it’s kind of terrifying to imagine devices handling something as vital as life-saving surgery, but people in the future will likely find it terrifying that today we’d rather trust an error-prone, bull-in-a-china-shop human with the task. And, after all, the characters have entrusted their lives to an android while they were in hypersleep for two years, so clearly that’s a thing they do.

Second, the gestural control to access decontamination is well considered. It is a large gesture that requires no great finesse from the operator (no hunting for and pressing a sequence of keys), and one that is easy to execute quickly and in a panic. I’m not at all sure what percentage of procedures need the back-up safety of a kill-everything-inside mode, but presuming one is ever needed, this is a fine gesture to initiate that procedure. In fact, it could have been used in other interfaces around the ship, as we’ll see later with the escape pod interface.
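For the curious, here is a minimal sketch, under assumed thresholds and coordinates, of how such a five-finger twist might be recognized: measure how far each fingertip rotates about the common centroid, and only fire the decontamination command if all five agree on a large clockwise turn:

```python
# Sketch of recognizing a five-finger clockwise twist, with assumed thresholds
# and standard math coordinates (y up), where clockwise turns are negative.

import math

def _angle_about(center, point):
    return math.atan2(point[1] - center[1], point[0] - center[0])

def is_five_finger_twist(starts, ends, min_degrees=45):
    """True if all five touch points rotate clockwise past the threshold."""
    if len(starts) != 5 or len(ends) != 5:
        return False
    center = (sum(x for x, _ in starts) / 5, sum(y for _, y in starts) / 5)
    for s, e in zip(starts, ends):
        delta = math.degrees(_angle_about(center, e) - _angle_about(center, s))
        delta = (delta + 180) % 360 - 180     # wrap into [-180, 180)
        if delta > -min_degrees:              # this finger did not turn far enough clockwise
            return False
    return True

# Five fingertips in a ring, each ending about 60 degrees clockwise of its start.
starts = [(0, 100), (95, 31), (59, -81), (-59, -81), (-95, 31)]
ends   = [(87, 50), (74, -67), (-41, -92), (-100, 11), (-21, 98)]
print(is_five_finger_twist(starts, ends))     # True
```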

I have the sense that in the original script, Shaw had to do what only a few very bad-ass people have been willing to do: perform life-saving surgery on themselves in the direst circumstances. Yes, it’s a bit of a stretch since she’s primarily an archaeologist in the story, but give a girl a scalpel, hardcore anesthetics, and an alien embryo, and I’m sure she’ll figure out what to do. But pushing this bad-assery off to an automated device, loaded with constraints, ruins the moment and changes the scene from potentially awesome to just awful.

Given the inexplicable man-only settings, requiring a desperate patient to recall FORTRAN-esque syntax for spoken instructions, and the failure to provide any feedback about the destruction of an extinction-level pathogen, we must admit that the MedPod belongs squarely in the realm of goofy narrative technology and nowhere near the real world as a model of good interaction design.