Making Waves: Diehl’s Quest to Automate the Aircraft Cabin


C³, a new project from Diehl, uses sensors to control cabin functions. Image via Diehl Aviation

Travelers already wave their hands to activate the dryer in the airport washroom. Why not do the same to lower the window shade in the cabin?

Sensors have enabled the automation of numerous everyday actions on the ground, but in the air, passengers and crew still do a lot of things manually, whether it’s opening and closing window shades, switching off lights, or checking that the galley is secure for landing.

Until now, it hasn’t really been viable to implement sensors in the aircraft cabin. Lars Fucke, senior specialist, Systems R&T at Diehl Aviation, explains that existing solutions require a sensor to be placed locally at each function, which makes them both expensive and time-consuming to install and maintain.

However, Diehl is hoping to change this with its Context Controlled Cabin (C³). It’s a motion-triggered environment that uses optical sensors to control multiple functions at multiple seats within view, such as dimming a window at one seat while switching on a reading light at another. And in a cabin where “several sensors observing the same area can provide redundancy, updates to functionality can easily be integrated through software,” Fucke adds. The concept has been entered for a Crystal Cabin Award under the Visionary Concepts category.
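One rough way to picture that software-defined side is a lookup table that maps marked cabin zones to functions, with overlapping sensor views cross-checking each other before anything actuates. The sketch below is a minimal illustration; the seat numbers, functions, and two-sensor voting rule are assumptions for illustration, not details of Diehl’s implementation.

```python
# Minimal illustrative sketch (hypothetical names, not Diehl's implementation):
# a software-defined table maps marked cabin zones to functions, so adding or
# changing functionality is a configuration update rather than new wiring, and
# overlapping sensor views can cross-check each other before anything actuates.

from dataclasses import dataclass

@dataclass(frozen=True)
class Zone:
    """A marked area a sensor can observe, e.g. the window blind at one seat."""
    seat: str       # e.g. "12A"
    feature: str    # e.g. "window" or "reading_light"

# Zone -> action registry; updating this table is a software change only.
CABIN_FUNCTIONS = {
    Zone("12A", "window"): lambda level: print(f"Dimming 12A window to {level}%"),
    Zone("12C", "reading_light"): lambda on: print(f"12C reading light {'on' if on else 'off'}"),
}

def dispatch(observations, zone, *args):
    """Actuate a zone's function only if at least two sensors report the same trigger."""
    votes = sum(1 for obs in observations if obs == zone)
    if votes >= 2 and zone in CABIN_FUNCTIONS:
        CABIN_FUNCTIONS[zone](*args)

# Two overlapping sensors both report a gesture at the 12A window; the shade dims.
dispatch([Zone("12A", "window"), Zone("12A", "window")], Zone("12A", "window"), 80)
```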

“The idea is to deduce the intent of passengers by tracking their posture and gestures in relation to certain objects or marked areas in the cabin.” – Lars Fucke, Diehl Aviation

“This is the application of smart-home concepts to the aircraft cabin,” Fucke says. Using remote optical or RFID (radio-frequency identification) sensors, he explains, “The idea is to algorithmically deduce the intent of passengers by tracking their posture and gestures in relation to certain objects or marked areas in the cabin.” In the galley, it would be possible to determine whether doors and latches are closed based on the presence and position of certain elements in the observed area.
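As a rough sketch of what such intent deduction might look like, the snippet below relates a tracked hand position and a recognized gesture to the nearest marked area in a sensor’s view; the coordinates, gesture label, and distance threshold are illustrative assumptions rather than the C³ algorithm.

```python
# Illustrative sketch only (assumed coordinates, gesture label, and threshold;
# not the C³ algorithm): infer intent by relating a tracked hand position and a
# recognized gesture to the nearest marked area in the sensor's field of view.

from math import dist

# Marked areas the sensor knows about, as (x, y) centers in the camera image.
MARKED_AREAS = {
    ("12A", "window"): (120, 80),
    ("12A", "reading_light"): (140, 20),
}

def infer_intent(hand_xy, gesture, max_distance=30):
    """Return the (seat, feature) a gesture most plausibly refers to, or None."""
    if gesture != "wave":          # only act on a deliberate, recognizable gesture
        return None
    area, center = min(MARKED_AREAS.items(), key=lambda kv: dist(hand_xy, kv[1]))
    return area if dist(hand_xy, center) <= max_distance else None

# A wave close to the window marker maps to the window function at seat 12A.
print(infer_intent((118, 84), "wave"))   # -> ('12A', 'window')
```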

Activating certain functions would be a simple matter of making an intuitive gesture, “similar to operating an automatic faucet or soap dispenser,” Fucke says. In other cases, where it’s less obvious that a function is gesture-controlled, markings on tables or other surfaces could guide passengers.

Fucke confirms the optical sensors will likely be wired, and notes they could also gather user data. “The primary purpose of the sensors is to operate the cabin functions, but we are developing platforms to provide the collected information to potential airlines,” he says, adding that Diehl will begin demonstrating the capabilities of C³ within a year.

“Making Waves” was originally published in the 9.2 April/May issue of APEX Experience magazine.