As Vika is looking at the radar and verifying visuals on the dispatched drones with Jack, the symbols for drones 166 and 172 begin flashing red. An alert begins sounding, indicating that the two drones are down.
Vika wants to send Jack to drone 166 first. To do this, she sends Jack the drone’s coordinates: pressing and holding the drone symbol for 166 displays its coordinate data. She then drags the coordinate data with one finger to the Bubbleship symbol and releases. The coordinates immediately display on Jack’s HUD as a target area showing the direction he needs to go.
Overall, the sequence of interactions for this type of situation is pretty simple and well thought out. Sending coordinates is as simple as:
- Tap and hold on the symbol of the target (in this case the drone) using one finger
- A summary of coordinates data is displayed around the touchpoint (drone symbol)
- Drag the data over to the symbol of the receiver (in this case the Bubbleship) and release
Then on Jack’s side, the position of the coordinates target on his HUD adjusts as he flies toward the drone. Can’t really get much simpler than that.
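The gesture sequence above can be sketched as a tiny state model. This is purely speculative, of course; the class and method names (`TacticalMap`, `tap_and_hold`, `drag_to`) are hypothetical, invented here to illustrate the interaction, not anything shown in the film.

```python
from dataclasses import dataclass

@dataclass
class Coordinates:
    lat: float
    lon: float

class TacticalMap:
    """Hypothetical model of the TETVision send-coordinates gesture."""

    def __init__(self):
        self.positions = {}   # symbol id -> Coordinates on the map
        self.receivers = {}   # symbol id -> Coordinates delivered to that receiver
        self._held = None     # coordinate data surfaced by tap-and-hold

    def tap_and_hold(self, symbol_id):
        # Steps 1-2: holding a symbol surfaces its coordinate data around the touchpoint
        self._held = self.positions[symbol_id]
        return self._held

    def drag_to(self, receiver_id):
        # Step 3: releasing over a receiver transfers the held coordinates to it
        self.receivers[receiver_id] = self._held
        self._held = None

# Usage: Vika sends drone 166's position to the Bubbleship
# (the coordinate values here are made up for the example)
m = TacticalMap()
m.positions["drone-166"] = Coordinates(41.2, -73.9)
m.tap_and_hold("drone-166")
m.drag_to("bubbleship")
```

The nice property this models is that the whole transfer is a single continuous gesture: pick up, drag, release, done.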
When Vika initially powers up the desktop, the drone status feed already shows drones 166 and 172 down. This is fine, except the alert sound and blinking icons on the TETVision don’t occur until Jack has already reached the hydro-rigs. That is a significant time lag between the drone status feed and the TETVision feed. A slight delay in the alert sound upon startup would be understandable; an immediate alert would likely mean there is something wrong with the TETVision system itself. That said, the TETVision drone icons should at the very least already be blinking red on load.
Monitoring drone 166
As Jack is repairing drone 166, Vika watches the drone status feed on her desktop. The drone status feed is a dedicated screen to the right of the TETVision feed.
It is divided into two main sections, the drone diagnostic matrix to the left and the drone deployment table to the right.
The drone deployment table lists all drones currently working the security perimeter, with an overview of information including drone ID, a diagram, and operational status. The drone diagnostic matrix shows data such as fuel status and drone positioning along the perimeter, as well as a larger, detailed diagram of the selected drone.
By looking at the live diagnostics diagram, Vika is able to immediately tell Jack that the central core is out of alignment. As soon as Jack finishes repairing the central core, the diagram updates to show that it is back in alignment, and an alert sound pings.
How does the feed know which drone to focus on?
Since there is no direct interaction with this monitor shown in the film, it is assumed to be an informational display. So, how does the feed know which drone to focus on for diagnostics?
One possibility is that Jack transmits data from the ground through his mobile drone programmer handset, which is covered in another post. But there is a great opportunity for agentive tech here: when Vika sends the drone coordinates to the Bubbleship, the drone status feed could automatically focus on that drone for diagnostics.
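That agentive behavior is easy to sketch: the send-coordinates action emits an event, and the status feed listens for it and shifts its own focus, with no extra input from Vika. All names here (`DroneStatusFeed`, `on_coordinates_sent`) are hypothetical illustrations, not anything established by the film.

```python
class DroneStatusFeed:
    """Hypothetical agentive behavior: the diagnostic feed follows the last dispatch."""

    def __init__(self):
        self.focused = None  # drone id currently shown in the diagnostic matrix

    def on_coordinates_sent(self, drone_id, receiver_id):
        # Agentive step: sending a drone's coordinates to the Bubbleship
        # is treated as an implicit signal of Vika's current focus
        if receiver_id == "bubbleship":
            self.focused = drone_id

# Vika sends drone 166 to the Bubbleship; the feed refocuses on its own
feed = DroneStatusFeed()
feed.on_coordinates_sent("drone-166", "bubbleship")
print(feed.focused)  # drone-166
```

The user’s existing action does double duty, which is exactly what makes agentive tech feel helpful rather than intrusive.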
Clear messaging in real time… almost
Overall, the messaging for the drone status feed is clear and simple. As seen in the drone deployment table, the dataset for operational drones includes the drone ID number and a rotating view of the drone schematic. If the drone is down, the ID number fades and the drone schematic is replaced with a flashing red message stating that the drone is offline. Then, when the drone is repaired, the display immediately updates to show that everything is operational again.
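The display rule described above is a simple function of operational state. A minimal sketch, with an invented `render_row` helper standing in for however the table actually draws each entry:

```python
def render_row(drone_id, operational):
    """Hypothetical rendering rule for one drone deployment table row."""
    if operational:
        # Healthy drone: full-strength ID plus the rotating schematic view
        return {"id": str(drone_id), "body": "rotating schematic"}
    # Down drone: ID fades, schematic replaced by a flashing offline message
    return {"id": f"{drone_id} (faded)", "body": "DRONE OFFLINE (flashing red)"}

# Drone 166 before and after Jack's repair
down = render_row(166, operational=False)
repaired = render_row(166, operational=True)
```

Two states, two unambiguous presentations, and the transition happens the moment the state changes; that immediacy is what makes the feed trustworthy.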
This is one of the basic fundamentals of good user interface design. Don’t let the UI get in the way and distract the user.
Keep it simple.