How Prime Video Is Capitalizing on AI to Drive Engagement | Exclusive


“Everything we do around content, discovery and personalization is AI-driven and that’s the core of our service,” the streamer’s product VP Adam Gray tells TheWrap

Prime Video is using AI to pinpoint movies and shows that have similar plot points and character arcs to your favorite picks, making suggested content more relevant to your specific tastes (Photo courtesy of Prime Video)

As Amazon looks to capitalize on the rapid evolution of artificial intelligence, the tech giant has infused the technology into several Prime Video features designed to make the platform more efficient, immersive and easier to navigate for its 200 million monthly active users globally, including 115 million in the U.S.

This includes AI-generated summaries of episodes or even entire seasons, produced by analyzing video segments and subtitles. There’s also an AI-powered pop-up feature for Thursday Night Football that provides insights about players and teams during NFL games. And then, of course, there’s personalization, which recommends what viewers should watch next. While these recommendations were previously powered by traditional AI, generative AI has supercharged the effort.

“We’ve been working in AI for 20+ years. So everything we do around content, discovery and personalization is AI-driven and that’s the core of our service,” Prime Video’s vice president of product Adam Gray told TheWrap in an exclusive interview. “When you’re looking at being the first stop entertainment destination for customers where they bring all their subscriptions in one place, really what you’re offering is a way to easily find the best movies and TV shows and live sports for you across all that.”

According to Gray, the AI-focused improvements are driving engagement on the platform. For example, Prime Video’s new X-Ray Recap feature is the most-used feature to date on the service’s app on Fire TV, the company said.

“We’ve been very happy with engagement across the features and our focus is on if there’s a practical problem we can solve,” Gray added. “So as we launch features, we’ll look and see if they’re being used in that way and then how can we build on it.”

Personalization, Dialogue Boost and Audio Descriptions

One of the major generative AI upgrades to Prime Video is in the form of personalization, with the technology being used to make more relevant content recommendations by helping users find movies and shows with similar plot points and character arcs to their favorite picks. It also groups titles tailored to a user’s interests in “Made for You” collections.

“One of the fundamental things that we’re always trying to solve when it comes to recommendations is trying to understand customers’ habits and needs, and that is very, very complicated. We have streaming history that we get from customers and we derive their taste and habits through that,” Prime Video vice president of technology Girish Bajaj explained.

“There’s also a lot of creative intent that goes into creating content and a lot of different dimensions to it. So understanding the content itself is another area of expertise that we have to build,” he continued. “When you understand your customers really well and you understand the content really well, then you can build relationships and start to recommend content that actually caters to your customer.”


The feature is powered by Amazon Bedrock, a fully managed service from Amazon Web Services (AWS) for building and scaling generative AI applications with foundation models.

“We’ve been using traditional AI machine learning models for recommendations for many, many years, and generative AI has given us a new door to make it even more impactful and more meaningful to customers,” Bajaj added. “That’s helped us go deeper on understanding the semantic nature and the attributes of content itself.”
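Amazon hasn’t published the underlying models or code, but the broad idea can be sketched in a few lines of Python. The snippet below is an illustrative assumption, not Prime Video’s implementation: it embeds plot synopses with a Bedrock embedding model (here, Amazon Titan Text Embeddings), averages a viewer’s watch history into a “taste” vector and ranks candidate titles by cosine similarity. The model ID, region and helper names are placeholders.

```python
# Illustrative sketch only: semantic recommendation via Bedrock embeddings.
import json
import boto3
import numpy as np

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> np.ndarray:
    """Return a semantic embedding for a plot synopsis."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",  # assumed model choice
        body=json.dumps({"inputText": text}),
    )
    return np.array(json.loads(resp["body"].read())["embedding"])

def recommend(history_synopses, candidates, top_k=5):
    """Rank candidate titles by how closely their plots match the viewer's history."""
    taste = np.mean([embed(s) for s in history_synopses], axis=0)
    scored = []
    for title, synopsis in candidates.items():
        vec = embed(synopsis)
        score = float(np.dot(taste, vec) / (np.linalg.norm(taste) * np.linalg.norm(vec)))
        scored.append((score, title))
    return [t for _, t in sorted(scored, reverse=True)[:top_k]]
```

A production recommender would blend many more signals than plot similarity, but the embed-and-compare pattern is the core of semantic, content-aware recommendation.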

Dialogue Boost uses AI to analyze a movie or series’ audio, identify points where dialogue may be hard to hear, isolate speech patterns and enhance the audio to help viewers catch every word (Photo courtesy of Prime Video).

Additionally, Prime Video has a dialogue boost feature, which uses AI to analyze a movie or series’ audio, identify points where dialogue may be hard to hear, isolate speech patterns and enhance the audio to help viewers catch every word in languages including English, Spanish, French, Italian, German, Portuguese and Hindi.
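Amazon hasn’t detailed the algorithm, but the effect can be approximated with a toy sketch: given a speech track and a background track, raise the dialogue level only in frames where it falls too far below the background. The frame size, threshold and gain below are assumed values, and the real feature isolates speech with AI rather than starting from pre-separated tracks.

```python
# Toy sketch of the Dialogue Boost idea (not Amazon's algorithm).
import numpy as np

def boost_dialogue(dialogue, background, sr=48000, frame_ms=50, margin_db=6.0, boost_db=4.0):
    """Return a dialogue track (float array) boosted where it sits too far below the background."""
    frame = int(sr * frame_ms / 1000)
    out = dialogue.copy()
    for start in range(0, len(dialogue) - frame, frame):
        d = dialogue[start:start + frame]
        b = background[start:start + frame]
        d_db = 20 * np.log10(np.sqrt(np.mean(d**2)) + 1e-9)   # dialogue level in this frame
        b_db = 20 * np.log10(np.sqrt(np.mean(b**2)) + 1e-9)   # background level in this frame
        if d_db < b_db + margin_db:                           # dialogue is hard to hear here
            out[start:start + frame] *= 10 ** (boost_db / 20)
    return out
```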

It also offers Audio Descriptions, an AI-powered narration tool that describes key visual elements of a video to make content more accessible for blind and visually impaired users. The technology identifies gaps in dialogue to help Prime Video’s production teams build Audio Description scripts faster.
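The gap-finding step is straightforward to illustrate. This sketch is an assumption about the general approach rather than Prime Video’s tooling: it scans subtitle cue timestamps for silences long enough to hold a narrated description.

```python
# Illustrative sketch only: find gaps between subtitle cues long enough for an
# audio-description line. Cues are (start, end) pairs in seconds; the 2.0-second
# minimum is an assumed threshold.
def dialogue_gaps(cues, min_gap=2.0):
    """Yield (gap_start, gap_end) windows with no spoken dialogue."""
    cues = sorted(cues)
    for (_, prev_end), (next_start, _) in zip(cues, cues[1:]):
        if next_start - prev_end >= min_gap:
            yield (prev_end, next_start)

# Example: three cues leave one usable 3.5-second gap for narration.
cues = [(0.0, 4.2), (4.8, 9.1), (12.6, 15.0)]
print(list(dialogue_gaps(cues)))  # [(9.1, 12.6)]
```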

X-Ray Recaps

Bedrock’s models, as well as AI models trained on Amazon SageMaker, are also being used in Prime Video’s recently launched X-Ray Recaps, which analyzes video segments, combined with subtitles or dialogue, to generate detailed descriptions of key events, places, times, and conversations for specific episodes or even entire seasons, across all titles.

“If you were to do this using traditional approaches, the development cost of it would be exponentially high. The generative AI approach of it has dramatically reduced the cost of development and deployment at scale,” Bajaj explained. “Amazon Bedrock really gave us a way to fast track this development and get it out to customers. And this is something that we know customers want; they use it.”

“To date, when people do these types of recaps, it’s only for the most premium content, because it’s very expensive to do it manually,” Gray noted. “So the idea of doing this across every title in a catalog of our scale is a solution that only happens with the power of AI.”

Utilizing a combination of Amazon Bedrock models and custom AI models trained on Amazon SageMaker, X-Ray Recaps analyzes various video segments, combined with subtitles or dialogue, to generate detailed descriptions of key events, places, times, and conversations. (Photo courtesy of Prime Video)

Guardrails are also applied to ensure the generation of spoiler-free and concise summaries, which are overseen by Prime Video’s production team.
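As a rough illustration of how a recap might be generated with Bedrock, the sketch below sends an episode’s subtitle text to a model through the Converse API, with a system prompt doubling as a simple spoiler guardrail. The model choice, prompt wording and parameters are assumptions; Prime Video’s actual prompts and guardrails aren’t public.

```python
# Hedged sketch of episode-recap generation with Amazon Bedrock's Converse API.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def recap_episode(subtitle_text: str, episode_label: str) -> str:
    """Generate a short, spoiler-free recap from an episode's dialogue."""
    resp = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model choice
        system=[{"text": (
            "You summarize TV episodes for viewers catching up. Be concise "
            "(under 120 words) and never reveal events from later episodes."
        )}],
        messages=[{
            "role": "user",
            "content": [{"text": f"Recap {episode_label} from these subtitles:\n{subtitle_text}"}],
        }],
        inferenceConfig={"maxTokens": 300, "temperature": 0.2},
    )
    return resp["output"]["message"]["content"][0]["text"]
```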

“As we build these features initially, there’s a lot of manual work to ensure the quality is there. As we improve it and as we get much better at the guardrails around the prompts that we use, we need less and less over time,” Bajaj said. “You continue to have manual folks involved for audits to make sure the quality continues, but it’s a higher lift initially and then scale comes in as you improve.”

Prime Insights for “Thursday Night Football”

Prime Video’s “Thursday Night Football” is also integrating AI in its presentation of live sports, developed in collaboration with “TNF” producers, engineers, former NFL players, and Prime Video Sports’ AI and Computer Vision team. Prime Insights highlights key players, illuminates hidden aspects of the game and predicts pivotal moments before they happen.

“If you look at the NFL, it was over 25 years ago when they introduced the yellow line for first downs, which really made it so much more accessible. And there’s been very little in-play innovation since, until we’ve looked at how we could use AI to change that. So we’re taking a look at how we can make the game more accessible to folks and easier to understand,” Gray explained. “The basic idea is for someone watching, we want them to be able to have a much more immersive experience, because there’s so much rich detail around what’s happening in the game.”

Defensive Alerts tracks defensive players before the snap and identifies, in real time, “players of interest” who are likely to rush the quarterback.

“It looks across thousands of data points, the positioning of players and their movements to highlight those for the play that are likely to pressure the quarterback. We put a red orb underneath them,” Gray added. “It’s a great example of starting to be able to watch the game like a quarterback would or an offensive coordinator.”
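Prime Video hasn’t disclosed its models, but the concept behind Defensive Alerts can be boiled down to a toy example: compute a handful of pre-snap features for each defender and score the likelihood of a rush with a logistic function. The features and weights below are made up for illustration; the production system reportedly draws on far richer tracking and computer-vision data.

```python
# Purely illustrative sketch of pre-snap rush prediction; weights are placeholders
# a real model would learn from labeled tracking data.
import math

WEIGHTS = {"dist_to_los": -0.9, "closing_speed": 1.4, "in_box": 0.8, "bias": -0.5}

def rush_probability(defender: dict) -> float:
    """Return an estimated probability that this defender rushes the passer."""
    z = (
        WEIGHTS["bias"]
        + WEIGHTS["dist_to_los"] * defender["dist_to_los"]      # yards off the ball
        + WEIGHTS["closing_speed"] * defender["closing_speed"]  # yards/sec toward the QB pre-snap
        + WEIGHTS["in_box"] * defender["in_box"]                 # 1 if inside the tackle box
    )
    return 1.0 / (1.0 + math.exp(-z))

defense = [
    {"id": "LB52", "dist_to_los": 1.5, "closing_speed": 1.2, "in_box": 1},
    {"id": "CB24", "dist_to_los": 7.0, "closing_speed": 0.1, "in_box": 0},
]
# Highlight (e.g., with the on-screen red orb) defenders above a threshold.
alerts = [d["id"] for d in defense if rush_probability(d) > 0.5]
print(alerts)  # ['LB52']
```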

Prime Video is also rolling out an expansion called Pressure Alerts in the coming weeks, which tracks defenders attacking the offensive backfield during live action and highlights those in position to disrupt the play. It will be joined by Coverage Identification, which uses an AI model combined with live player-tracking data to identify the defensive scheme, such as man-to-man or zone, for fans in real time before the snap.

Defensive Alerts enhances the viewing experience by tracking defensive players before the snap and identifying, in real time, “players of interest” who are likely to rush the quarterback. (Photo courtesy of Prime Video)

Though the pair declined to reveal what other generative AI-powered features may be ahead, they emphasized that the streaming service is only just getting started.

“The only way we’re able to do things at scale, around discovering content and around a more immersive experience when customers watch content, is going to be through AI. So you’re going to see this accelerating,” Gray said. “The teams are really learning how they can move faster and innovate and use it. So it’s really a change in the entire structure of how you build.”

“Literally every team in Prime Video is leaning into AI and generative AI, every single team,” Bajaj added. “So there’s a lot more to come.”
