The interfaces aboard the Rodger Young in combat are hard to take seriously. The captain’s interface, for instance, features arrays of wireframe spheres that zoom from the bottom of the screen across horizontal lines to become blinking green squares. The shapes bear only the vaguest resemblance to the plasma bolts, matching neither what we see out the viewscreen nor the bolts’ actual behavior. But the ridiculousness doesn’t end there.
Holy dreck, this fuigetry.
After letting Captain Deladier know what’s up with the giant asteroid looming, spinning ever closer, Barcalow’s attention is grabbed by a screen immediately before him. It’s the collision alarm.
Prepare your eyes.
This is the interface equivalent of running around screaming in an Ed Wynn voice while flailing your arms over your head. Sure, it’s clear something’s wrong, but other than that, it’s not helping.
Sure, there’s the buzzing and the giant yellow, all-caps text that blinks COLLISION ALARM. There’s a pea green bar that seems to be indicating steadily decreasing distance or time or something that is running out. Those two are helpful. The rest of this information is a pile of nonsense.
- Blinking? If the pilot has seconds to act, isn’t there a risk that when he glances at the screen for a split second, he’ll miss something?
- What’s with the blue waves rippling out from the representation of the ship? If it’s a collision, wouldn’t you expect something to be represented as coming toward the ship, and maybe a line describing its path, and a point illustrating point of impact?
- Why do all of the NV need to be labeled as such? Why do they need to blink randomly? How is that useful information?
- How do those numbers link to those labels? Isn’t that asking the navigator to do a lot of visual work in a crisis?
- What does it mean for the ESTIMATED MASS to be changing to zero and suddenly jump again? Because that would better fit a Cthulhu alarm, as the physics of the Old World no longer applied. Stell’bsna n’ghft. Y’hah.
- What does it mean for the APPROXIMATE SPEED to start so low, rise to nearly 1000, and fall again? What outside force is acting on this mass? (Or is it a function of the mass changing? Anyone care to do the speculative math?)
- The DISTANCE TO OBJECT does in fact decrease like you might expect it to at the beginning. But then it drops to zero. Shouldn’t they be dying instead? Oh, but then it jumps again.
- Why is time contained in a single number? Does the Federation use some Metric version of time?
- How can OBJECT TRAJECTORY be a single digit? It’s a multivariate concept.
- Why are there no units? As in, anywhere?
- How could OBJECT BEARING change to zero and then jump back up again just like ESTIMATED…
- …Are you kidding me?
And that’s when I went, frame by frame, and captured the data points. Here they are, visualized as a graph over time. Notice anything?
OK, let me just line those up for you.
I know sci-fi interfaces are often made under time pressure, but it really lets me down when they just copy and paste numbers. Like we won’t, many years later, analyze it frame-by-frame for a blog. Sheesh.
Urgency requires focus
Of course this is a narrative interface, meant to communicate to an audience in about a second that things are very very bad for the Rodger Young. Of course it’s rooted in a mechanical metaphor where dumb, fixed sensors with thresholds would go all calamity when things went pear-shaped.
We don’t live in that world anymore. Urgency requires focus, and when circumstances are dire, yes, the pilot needs to know the direness of the problem, but then they also need to focus on fixing that problem. Urgency interfaces should…
- Get attention, and once they have it, ease up on the attention-getting, since it becomes a distraction.
- Illustrate the problem, including the time until the anticipated event.
- Illustrate what the computer is doing about it (if it’s agentive, which the Rodger Young is clearly not.)
- Illustrate the best options available and provide a means to enact those.
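Those four principles amount to a simple lifecycle: be loud until acknowledged, then get out of the way and show the problem, the countdown, and the options. Here’s a minimal sketch of that lifecycle; the class name, states, and option strings are all invented for illustration, not anything from the film:

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Phase(Enum):
    ATTENTION = auto()  # loud, blinking, hard to ignore
    FOCUS = auto()      # quiet; show the problem and the options


@dataclass
class CollisionAlert:
    """Hypothetical urgency alert following the four principles above."""
    seconds_to_impact: float
    options: list = field(default_factory=lambda: ["evasive burn", "brace"])
    phase: Phase = Phase.ATTENTION

    def acknowledge(self) -> None:
        # Principle 1: once the alert has the pilot's attention,
        # ease up so the alarm itself stops being the distraction.
        self.phase = Phase.FOCUS

    def render(self) -> str:
        if self.phase is Phase.ATTENTION:
            return "COLLISION ALARM"  # one thing, loudly
        # Principles 2-4: the problem, the countdown, and the best options.
        return (f"Impact in {self.seconds_to_impact:.0f}s | "
                f"options: {', '.join(self.options)}")
```

The key design choice is that acknowledgment is what de-escalates the display: the pilot, not a timer, decides when the alarm has done its attention-getting job.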
Note that the COLLISION ALARM does two and a half of those. It gets Barcalow’s attention, shows the problem with a label, and a green bar shows time remaining. That’s maybe a tenth of the screen. Then it tries its very, very best to distract the user from that useful information with blinking, semi-random nonsense. Was this thing designed by the bugs?
As David is walking through a ship’s hallway, a great clanging sounds from deep in the ship, as the colored lights high in the walls change suddenly from a purple to a flashing red, and a slight but urgent beeping begins. He glances at a billiards table in an adjacent room, sees the balls and cue sliding, and understands that it wasn’t just him: gravity has definitely changed.
There are questions about what’s going on with the ship such that gravity changed so fast, but our interest must be in the interfaces.
Why did David not expect this? If they’re heading to a planet and the route is known, David should know well in advance. The ship should have told him, especially if the event is going to be one that could potentially topple him. Presuming the ship has sensors to monitor all of this, it should not have come as a surprise.
The warning itself seems mostly well designed, using multiple modes of signal and clear warning signs:
- A change from a soft color to an intense one (they even look like eyes squinting and concentrating in the thumbnails)
- A shift to red, commonly used for warning or crisis
- Blinking red is a hugely attention-getting visual signal
- Beeping is an auditory signal that is also a common warning signal, and hard to ignore
After David sees these signals, he walks to a wall panel and presses a few offscreen buttons which beep back at him and silence the beeping, replacing it with overhead pulses of light that race up and down the hallway. Over the sound system a male voice announces “Attention. Destination threshold.”
Why should David have to go find out what the crisis is at the wall interface? If he had been unable to get to the wall interface, how would he know what happened? Or if it required split-second action, why require him to waste time getting there and pressing buttons? In a crisis, the system should let you know what the crisis is quickly and, if it’s a dire crisis in need of remedy, intrusively. The audio announcement should have happened automatically.
The overhead lights are almost a nice replacement for beeping. They still say “alert” without the grating annoyance that audio can sometimes be. (There’s still a soft “click” with each shifting light, just not as bad.) But if he’s able to silence the audio at this wall panel, why wasn’t he able to silence the race lights as well?

And why do they “race” up and down the hallway rather than just blink? The racing provides an inappropriate sense of motion. Given that this signal is for when the crew is in an unusual and potentially dangerous situation, it would be better to avoid the unhelpful motion cue by simply blinking, or to use the sense of direction the lights provide to signal to David where he ought to be. A simple option would be to have the hallway lights race continuously in the direction of the bridge, leading the crew to where they would be most effective.

Even better: if the ship has locational awareness of individual crew members, it can cut all overhead illumination by 20% and pulse a light a few feet away in the desired direction between 80 and 100 percent, while darkening the hallway in the opposite direction. Then, as David walks toward the blinking light, the ship can lead him, even around corners, to get him where he needs to be. In a real crisis, this would be an easy and intuitive way to lead people where they need to be. It would of course need simple overrides in case the crew knew something about the situation that the ship did not.
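That wayfinding scheme is concrete enough to sketch. The function below is speculative, with made-up names and a 1 Hz pulse rate I’ve assumed: baseline illumination drops to 80%, a light a few feet ahead of the crew member (in the direction the ship wants them to go) pulses between 80 and 100 percent, and lights behind them fade toward dark:

```python
import math


def hallway_brightness(light_pos: float, crew_pos: float,
                       goal_dir: int, t: float) -> float:
    """Brightness in [0, 1] for an overhead light at light_pos (feet),
    given a crew member at crew_pos being led toward goal_dir
    (+1 or -1) at time t (seconds). All parameters are hypothetical.
    """
    # Signed distance from the crew member, measured along the goal direction.
    offset = (light_pos - crew_pos) * goal_dir
    if 0 < offset <= 3.0:  # "a few feet" ahead, in the right direction
        pulse = 0.5 * (1 + math.sin(2 * math.pi * t))  # 1 Hz pulse, 0..1
        return 0.8 + 0.2 * pulse  # pulses between 80% and 100%
    if offset < 0:  # behind the crew member: darken to discourage backtracking
        return max(0.0, 0.8 + 0.2 * offset)  # fades to black ~4 ft back
    return 0.8  # everywhere else: baseline, dimmed 20%
```

Run over every fixture in the hallway each frame, this yields exactly the gradient described: a bright pulsing patch just ahead, dimmed baseline around, and darkness behind.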
After walking through the racing-light hallways, he turns just past the door and into the bridge, where we can see the legend “DESTINATION THRESHOLD” across the pilot’s HUD. He turns on a light, licks a finger, and presses another button to activate all of the interfaces on the bridge. He walks to the pilot’s panel, presses a button to open the forward viewscreen, observing LV223 with wide-eyed wonder.
This entire sequence seems strange from an interface perspective. We’ll presume that licking his fingers was just a character tic and not required by the system. But beyond the fact, raised above, that David seems somewhat surprised by it all, it seems pointless that he should have to open doors and manually turn on lights and interfaces during a crisis. Either it’s not a crisis and these signals should diminish, or it is a crisis and more of this technology should be automated.
Joh is the civil and capital leader of Metropolis, and his large office reflects it in the amount of technology it has. To the left of the door is Josaphat’s work interface (see Middle Class Oppression for more detail). To the right are two other pieces of technology: a large video screen hangs high, and a video phone rests on the wall below.
Joh Frederson paces in his office.
His desk also features some impressive technology. He has a bell jar ticker machine for receiving information. A large output panel on the right side of his desk allows people to request his attention. It features a huge array of thin bulbs labeled with particular codes. In one scene, Joh hears a sound and lifts his head to see a blinking light next to one of the labels. In response he touches a button on a control panel on the left of his desk to close the curtains, and then another to open the door to his office and receive Josaphat.
Joh notices that Josaphat wishes to speak to him.
Joh closes the curtains from his desk.
Later he uses another button on this same panel to summon his agent, called the Thin Man.
These interfaces are particular to Joh, conveniences only available to one in a position of wealth and power.
Laborers of the Upper City have their own machines to worry about.
Josaphat feels stress while monitoring figures.
One of Joh’s assistants, Josaphat, has a similarly difficult task. He stands at a tall panel where lit symbols fall quickly and randomly down one of six tall, thin screens. He has to transcribe them (and possibly perform calculations with them) in one of three different books.
Josaphat turns his station off.
The only sensible part of this setup is the mechanism for shutting it down. Given the time pressure its operator is under, it could be disastrous if a single switch were accidentally touched. Instead, to turn it off Josaphat must stretch his arms to reach two distant buttons. Touching them both shuts down the station. This seems especially prescient considering that similar constraints informed the design of the familiar CTRL-ALT-DELETE key sequence on the IBM PC, deliberately chosen so it could not be pressed by accident with one hand.
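The interlock is easy to express in code. This is a speculative sketch, not anything from the film; the button names and the half-second window are invented. The point is that a destructive command requires two deliberate, near-simultaneous inputs:

```python
def chord_shutdown(events, window=0.5):
    """Return True only if both distant buttons (hypothetical names
    "left_far" and "right_far") are pressed within `window` seconds
    of each other, so no single accidental touch can kill the station.

    events: iterable of (timestamp_seconds, button_name) pairs.
    """
    last = {}  # most recent press time per button
    for t, button in events:
        last[button] = t
        if ("left_far" in last and "right_far" in last
                and abs(last["left_far"] - last["right_far"]) <= window):
            return True
    return False
```

The same logic underlies any safety chord, from two-hand machine-press controls to three-finger key sequences: the cost of invoking the command is raised just enough that it can’t happen by mistake.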