In the next decade or so, much more sophisticated controls — what the Air Force calls “man-machine interfaces” — could replace the desktops, Patzek tells Danger Room. In addition to the Siri-style two-way voice exchange, Patzek says the next-gen controls could include smarter, easier-to-interpret computer displays and tactile feedback from the drone to the operator, much the way an Xbox controller vibrates to alert a player that he’s taking damage in a game.
Imagine an Air Force drone operator sitting in front of a single, large computer screen elegantly displaying select data from the distant robot in an intuitive graphical format — say, bits of information laid over a hyper-realistic three-dimensional moving picture stitched together from multiple visual and infrared sensors. The operator simply sits and watches until the robot literally asks for advice, perhaps on which suspicious objects — as determined by its sensors and algorithms — to check out more closely.
At that point the human ‘bot-wrangler states his recommendation and the drone swoops down to do its master’s bidding. If the robot detects incoming enemy gunfire, it alerts its boss by causing his chair to shake. The operator can call out, “Evasive action!” and the drone banks sharply.