Image: Marcelo Cáceres

APEX Insight: The advancements in self-driving car technology on display at this week’s Consumer Electronics Show in Las Vegas provide a glimpse into the future of in-flight entertainment and aircraft cabin technology.

“How do you turn a functional relationship with your car into something that’s fun and emotional?” asked Bob Carter, Toyota’s senior vice-president of Automotive Operations, during a press conference at the Consumer Electronics Show (CES) yesterday. Carter then unveiled the Toyota Concept-i, along with the company’s artificial intelligence (AI) agent, Yui. The goal, Carter explained, is to forge “a functional relationship with artificial intelligence.” In time, he said, our cars will grow to truly understand us, just as we understand our cars.


Bob Carter, Toyota’s senior vice-president of Automotive Operations, unveiled the company’s Concept-i self-driving car during his keynote address at the Consumer Electronics Show (CES).

The interior of the Concept-i biometrically measures driver intention and emotion, while the exterior connects to the outside world. Dr. Gil Pratt, CEO of the Toyota Research Institute, said that his team’s goal is “creating a car that’s incapable of causing a crash, regardless of the ability of the driver.”

BMW is also focusing on the user interface, using CES 2017 to unveil its HoloActive Touch concept, a gesture-based control system that lets drivers manipulate the car’s controls in mid-air while ultrasound provides haptic feedback.


BMW will unveil its HoloActive Touch concept, a gesture-based control system, at CES 2017. Image via BMW Blog

Honda is also expected to debut its own concept car, the NeuV, at CES 2017. The autonomous electric vehicle boasts an “emotion engine” that reacts to what the driver says, does and even feels.


NeuV design sketch. Image via Honda

A Friendlier Plane?

From automatically ordering a glass of water to queuing an IFE playlist, gesture control and predictive AI assistance present numerous potential benefits in the aircraft cabin. When combined with advances in data-powered IFE content curation, a multimedia passenger experience that’s both personalized and reactive becomes easy to envision – as does an experience that follows the passenger not only over the course of a single journey, but also throughout their entire relationship with an airline.

Advancements in emotional technology may one day transform the aircraft from mere transportation tool to cabin crewmember. AI, biometric monitoring and haptic gesture control can help airlines reduce both passengers’ cognitive strain and aircraft weight: A lever that can be mimed doesn’t have to be built, and customized gestures can be performed without hunting for specific buttons or switches.

Airplane autopilot systems may have been around since the early 20th century, but the complex interface between pilot and aircraft could also be streamlined by the introduction of emotional technology. A cockpit that understands its pilot’s health and disposition can serve as an active safety mechanism. Redundancy and certification standards must of course be considered, but the same is true of autonomous cars, and much progress has already been made in that arena.

Jordan juggles deadlines across various time zones as he writes about travel, culture, entertainment, and technology.