APEX TECH Panel Preview: “AI-Assisted” Smart Metadata in the Content Delivery Supply Chain


Image: ThisisEngineering via Pexels

APEX TECH, which is taking place at the Sheraton Gateway Los Angeles Hotel on 28-29 February, will feature numerous panel discussions on the latest technologies in in-flight entertainment and connectivity (IFEC).

On Thursday 29 February, from 09:00 until 10:15, delegates can attend a session on “‘AI-Assisted’ Smart Metadata in the Content Delivery Supply Chain,” moderated by FabricData CEO Rob Delf. To set the scene ahead of the discussion, some of the panelists have outlined how they currently use AI as part of their products and services, and how they hope to do so in the future as the technology becomes more prevalent.

Delf provides an overview of how smart metadata generated using artificial intelligence (AI) can streamline the workload for the parties involved in the content supply chain:

“Any presentation devices of film and television content (screens, apps, etc.) have different requirements for the title information displayed. Examples of this are different imagery sizes, different lengths of descriptive text, different genre categories, and many others. While there are repositories of information about film and television data that can be licensed, acquired or otherwise scraped together, there is a transformation process that poses some challenges.

“AI provides the opportunity to perform these transformations based upon expected output requirements – in other words, you define the structure that the data needs to conform to, and AI will perform the rest: different languages, different lengths of descriptions, different tones, all kinds of seemingly magical things,” he continues. 
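
To make Delf’s point concrete, the sketch below shows one way such a schema-driven transformation could be framed. It is purely illustrative: the field names, the 140-character synopsis limit, the genre vocabulary and the idea of building a prompt for a generic large language model are assumptions for the example, not a description of FabricData’s tooling.

```python
import json

# Hypothetical target schema for one presentation device. The field names,
# length limit and genre vocabulary are illustrative assumptions only.
TARGET_SCHEMA = {
    "title": "string",
    "synopsis": "string, maximum 140 characters",
    "genre": "one of: Action, Comedy, Drama, Family, Documentary",
    "language": "BCP 47 tag, e.g. 'es-MX'",
}

def build_transform_prompt(source_record: dict, schema: dict, language: str) -> str:
    """Build a prompt asking a language model to emit metadata that conforms
    to the target schema: the structure is defined up front, and the model is
    asked to do the shortening, re-categorizing and translating."""
    return (
        "Transform the following film metadata so that it conforms exactly to "
        "this JSON schema, respecting every length and vocabulary constraint:\n\n"
        f"{json.dumps(schema, indent=2)}\n\n"
        f"Target language: {language}\n\n"
        "Source metadata:\n"
        f"{json.dumps(source_record, indent=2)}\n\n"
        "Return only the JSON object."
    )

if __name__ == "__main__":
    source = {
        "title": "Example Feature",
        "long_synopsis": "A several-hundred-word studio synopsis would go here...",
        "genres": ["Dramedy", "Road Movie"],
    }
    # In production the prompt would be sent to whichever model the vendor
    # uses; here it is simply printed.
    print(build_transform_prompt(source, TARGET_SCHEMA, "es-MX"))
```

The design point mirrors the quote: the output structure is fixed first, and the model is asked to transform whatever source data arrives so that it fits.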

Andre Valera, VP Business Development at Conetic (the umbrella company that owns Touch Inflight Solutions and ABOVE) – which recently announced a new IFE partnership with Aeromexico – says his companies currently use AI to “analyze individual passenger preferences alongside their viewing history, demographic data and more” to generate smart metadata that enables content personalization. Valera adds that AI also offers multilingual support, “enhancing metadata by recognizing and adapting to multiple languages.”
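
As a rough illustration of the kind of personalization Valera describes – scoring titles against a passenger’s stated preferences, viewing history and languages – the following toy sketch ranks a catalog by metadata-tag overlap. The profile fields, weights and tags are invented for the example and do not reflect how Touch Inflight Solutions or ABOVE actually implement this.

```python
from dataclasses import dataclass, field

@dataclass
class Title:
    name: str
    tags: set        # smart-metadata tags: genre, mood, themes
    languages: set   # audio/subtitle languages available

@dataclass
class Passenger:
    preferred_tags: set = field(default_factory=set)  # stated preferences
    watched_tags: set = field(default_factory=set)    # derived from viewing history
    languages: set = field(default_factory=set)

def score(title: Title, pax: Passenger) -> float:
    """Very rough content-based score: tag overlap with preferences and
    history, plus a small bonus when a title is available in one of the
    passenger's languages. The weights are arbitrary illustrations."""
    pref_overlap = len(title.tags & pax.preferred_tags)
    history_overlap = len(title.tags & pax.watched_tags)
    language_bonus = 0.5 if title.languages & pax.languages else 0.0
    return pref_overlap + 0.5 * history_overlap + language_bonus

if __name__ == "__main__":
    catalog = [
        Title("Road Trip Comedy", {"comedy", "road-movie"}, {"en", "es"}),
        Title("Slow-Burn Thriller", {"thriller", "crime"}, {"en"}),
    ]
    pax = Passenger(preferred_tags={"comedy"}, watched_tags={"road-movie"}, languages={"es"})
    for t in sorted(catalog, key=lambda t: score(t, pax), reverse=True):
        print(f"{score(t, pax):.1f}  {t.name}")
```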

However, in the medium- to long-term, Valera says AI offers several more exciting opportunities, including “the capability of recognizing passengers’ emotional states through facial expressions or voice analysis, leading to more intuitive content recommendations embedded into the metadata.” He goes on to say that these recommendations could adapt in real-time based on passenger interactions with an IFE platform. 

Finally, Valera posits the notion that with such interactivity – and connectivity – unlocked, “AI-driven metadata that identifies products/experiences within the content that could be linked to e-commerce and programmatic ads could be the holy grail to unlock ancillary revenues.” 

Like Valera, panelist Richard Kroon, Director of Technical Operations at the Entertainment Identifier Registry (EIDR) – which provides universal identifiers for audiovisual content so parties can transact on what they’ve produced – thinks AI is promising in terms of content recommendations.

“As in-flight catalogs grow so that passengers are sure to find content to their liking, it becomes increasingly hard for viewers to find that content. In-flight viewing habits are different from at-home, so content recommendation needs a tremendous amount of metadata and teaching cases to improve the in-flight viewing experience and maximize customer satisfaction,” he explains. 

While Stellar Entertainment’s Managing Director Sam Allen echoes that “personalisation is obviously the focus from an airline passenger experience and ancillary revenue perspective,” he also feels “there is the possibility of creating tailored ‘smart’ metadata that aligns with airline KPIs. 

“By leveraging AI, it’s possible to manipulate metadata to reflect an airline’s brand and key messaging, utilizing AI to analyze their content set and consistently include their brand key words in their metadata to subtly promote their messaging, all of which can also positively impact Net Promoter Scores.”

Stellar Entertainment already leverages AI to help with quality control, dynamic updates, tagging/categorization and creating OEM system-specific metadata. “We have an IFE-specific cloud-based content enterprise resource management application and workflow solution: pixL. pixL is designed to drive all of our workflow and digital supply chain processes, including metadata,” states Allen. “pixL has API functionality, which is essential for the services that it provides and connects to, including IMDb for movie and TV metadata, box office information and IMDb scores. We also leverage this API functionality to connect to AI tools, which we use to enrich the metadata that we have coming in from IMDb, studios and distributors.”
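
The enrichment Allen describes – pulling a base record from IMDb, studios or distributors and filling the gaps with AI-generated fields – could be sketched along the following lines. The field names, the provenance tagging and the generate_field callback are hypothetical; pixL’s actual schema and APIs are not detailed in his comments.

```python
from typing import Callable

# Hypothetical required fields; Stellar's actual pixL schema is not
# described in the article.
REQUIRED_FIELDS = ("short_synopsis", "keywords", "oem_genre")

def enrich_record(base: dict, generate_field: Callable[[dict, str], str]) -> dict:
    """Fill any required field missing from an incoming record.

    `base` stands in for metadata pulled from an upstream source (IMDb,
    studio, distributor); `generate_field` stands in for a call to an AI
    tool that derives the missing value from what is already known."""
    enriched = dict(base)
    for name in REQUIRED_FIELDS:
        if not enriched.get(name):
            enriched[name] = generate_field(enriched, name)
            # Record which values were machine-generated rather than sourced.
            enriched.setdefault("provenance", {})[name] = "ai-generated"
    return enriched

if __name__ == "__main__":
    incoming = {"title": "Example Feature", "keywords": "space, survival"}
    # A trivial stand-in generator; in practice this would call an AI tool.
    filled = enrich_record(incoming, lambda rec, f: f"<{f} derived from '{rec['title']}'>")
    print(filled)
```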

IdeaNova Technologies Co-founder Juraj Siska confirms that his company has already used AI to develop a scene recognition solution for its media players. “It’s an automated solution for detecting movie scenes to allow content-based scene selection,” he explains. “This is currently used to provide more accurate sampling of scenes when scanning through a movie or TV show. Until now, the sample scenes have been time-based, which makes for either inaccurate or repetitive selection of scenes.

“We also used this algorithm to identify the time boundaries that separate the movie from its beginning and end credits. This allows us to accurately place the controls passengers can use to skip the credits. Only some content providers supply these time marks today. Using AI, we are able to accurately place these controls without being dependent on such metadata from the content providers.”

Excitingly, Siska confirms that the scene detection algorithm has other potential uses, such as “detecting scenes deemed inappropriate by the airlines or detecting the transition of scenes allowing for better placement of mid-roll ads.”
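
IdeaNova has not described its algorithm beyond the quotes above, but a generic illustration of the underlying idea – detecting hard cuts by comparing consecutive frames, then treating cuts clustered near the end of the runtime as candidate credit boundaries – might look like the OpenCV-based sketch below. The histogram method, the similarity threshold and the file name are assumptions for the example, not IdeaNova’s implementation.

```python
import cv2  # pip install opencv-python

def detect_scene_cuts(video_path: str, threshold: float = 0.5):
    """Yield timestamps (seconds) of likely hard cuts.

    Generic illustration only: compares HSV colour histograms of
    consecutive frames; a large drop in correlation is treated as a cut."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    prev_hist, frame_idx = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
        cv2.normalize(hist, hist)
        if prev_hist is not None:
            similarity = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if similarity < threshold:
                yield frame_idx / fps
        prev_hist, frame_idx = hist, frame_idx + 1
    cap.release()

if __name__ == "__main__":
    # Cuts clustered near the end of the runtime are candidates for the
    # start of end credits, where a "skip credits" control could appear.
    for t in detect_scene_cuts("feature.mp4"):
        print(f"cut at {t:7.2f}s")
```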

To delve further into this topic and others, register to attend APEX TECH now.