The Future of Media and Entertainment is … Big Data?

Can AI, IoT and big data analytics turn moviemaking into a science?

When a scene in a box office comedy prompts a deep laugh, when dialogue in a drama makes your eyes well up, when a stunt in a thriller has you gripping both sides of your chair, that’s good art, no doubt.

But is it science, too? Or is it somehow both?

Science and art are often seen as stuck in their own spheres, with science focused on explaining how the world works, and art exploring what gives it meaning. But there’s a growing movement in the entertainment industry to link the two – and get better box office returns – by using technologies like the Internet of Things (IoT), big data analytics, artificial intelligence (AI) and machine learning.

In fact, researchers are currently using these technologies to analyze the historical catalogue of movies in minute detail, genre by genre, and build out a genome of the most successful films, so it’s easier to re-create their success. With hundreds of millions invested in a single film (e.g. $320 million on the latest Mission Impossible) or billions spent to acquire a franchise like Star Wars ($4.05 billion), there’s a compelling reason to find out what works and make more of it. This new science, researchers say, is the future of the media and entertainment (M&E) industry.

Of course, it’s an open question as to whether it’s truly possible to make good art from big data analytics. But the mysteries of what makes a good story may not be as mysterious as some think. As this new blend of art and science progresses, it is going to need more and more interconnection to be successful.

What does green mean?

The media and entertainment industry is investing in this type of science now because it’s never before been possible. In a digital age, volumes of data from decades’ worth of film can be collected, tagged and sorted, and AI and machine learning can be applied to find patterns in pacing, in dialogue, in tone, in colors, in music, in narrative structure – in any number of variables that can help explain why a given film succeeds, beyond a conventional explanation like, “It starred ‘The Rock.’”
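As a toy illustration of the kind of pattern-finding described above – not any studio’s or researcher’s actual method – the sketch below tags a handful of invented films with stylistic attributes and ranks each attribute by the average gross of the films that carry it. Every title, tag and dollar figure here is made up for the example.

```python
# A minimal sketch of attribute-level pattern mining over a tagged
# film catalogue. All titles, attribute tags, and grosses below are
# illustrative placeholders, not real industry data.
from collections import defaultdict

films = [
    {"title": "Film A", "attributes": {"fast_cuts", "orchestral_score"}, "gross_musd": 320},
    {"title": "Film B", "attributes": {"fast_cuts", "cool_palette"}, "gross_musd": 210},
    {"title": "Film C", "attributes": {"slow_pacing", "orchestral_score"}, "gross_musd": 45},
    {"title": "Film D", "attributes": {"slow_pacing", "cool_palette"}, "gross_musd": 60},
]

def average_gross_by_attribute(catalogue):
    """Average gross (in $M) of the films carrying each attribute tag."""
    totals, counts = defaultdict(float), defaultdict(int)
    for film in catalogue:
        for attr in film["attributes"]:
            totals[attr] += film["gross_musd"]
            counts[attr] += 1
    return {attr: totals[attr] / counts[attr] for attr in totals}

# Rank attributes from most to least lucrative on this toy catalogue.
ranked = sorted(average_gross_by_attribute(films).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked[0][0])  # attribute with the highest average gross
```

A real system would of course work from thousands of machine-extracted attributes rather than four hand-written tags, but the shape of the question – which measurable properties co-occur with success – is the same.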

It’s not just the film itself that can supply clues about what works and what doesn’t. The audience is also being tapped for information. Social media reaction in online communities such as Reddit or Facebook can be analyzed to learn why some scenes soar and others flop. For real-time feedback, IoT sensors in movie theaters, or in people’s homes – right down to the camera on your smartphone – can be accessed to measure reactions to certain moments. Not only can that data be used to guide future projects, but plot lines and outcomes could conceivably be adjusted mid-stream to personalize a show for the audience.
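To give a flavor of how scene-level audience reaction might be scored from social media comments – purely as a hedged sketch, with an invented word lexicon and invented comments standing in for a real sentiment model and real data – consider:

```python
# A minimal sketch of lexicon-based sentiment scoring per scene.
# The word lists and comments are illustrative placeholders; a real
# pipeline would use a trained sentiment model on collected posts.
POSITIVE = {"loved", "hilarious", "great", "gripping"}
NEGATIVE = {"boring", "flat", "slow", "confusing"}

def scene_score(comments):
    """Net sentiment: +1 per positive word, -1 per negative word,
    averaged over the number of comments."""
    score = 0
    for comment in comments:
        for word in comment.lower().split():
            if word in POSITIVE:
                score += 1
            elif word in NEGATIVE:
                score -= 1
    return score / max(len(comments), 1)

scenes = {
    "opening_chase": ["Loved the opening chase", "gripping start"],
    "diner_scene": ["That diner scene was slow", "felt flat and boring"],
}

# Order scenes from best- to worst-received.
ranked_scenes = sorted(scenes, key=lambda s: scene_score(scenes[s]), reverse=True)
print(ranked_scenes)
```

Even this crude scoring shows how per-scene feedback could be aggregated into the kind of signal that guides future projects.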

These types of applications are moving quickly past the realm of theory. Yves Bergquist, the director of the AI & Neuroscience in Media Project at USC, is developing a “knowledge engine” called Corto, which is built on a massive database that promises users in the entertainment world previously unknowable insights. Bergquist says his dream is to measure cognitively “all aspects of film.” He means everything – color, white balance, edit pace, composition, music, emotional tonality – so he can answer questions about the movies that no one ever asked.

“I want to measure audiences’ cognitive reactions to every single one of these attributes,” Bergquist said during a talk earlier this year. “I want to know what green means. What does it mean? I want to know what purple means to people in specific emotional or cinematic contexts.”

Amid a data-driven deep dive into what makes a successful movie, it’s fair to ask if these efforts drain the creativity and color out of it along the way. Calling a movie or TV show “formulaic” is an epithet in film criticism, and the attempt to analyze a great film like, say, “The Godfather” down to the backlighting in minute 57 may seem like the ingredients for a rampaging monster of a formula that takes movies to new levels of predictability and unwatchability.

Remember, though, that it’s widely agreed that a good story has universal elements that resonate across time and cultures. In fact, Bergquist notes that some of his early analysis shows a uniformity to popular narrative structures that spans geographic regions. Why not use technology to learn with more precision what the elements of a good story are and how they lead to better movies?

It’s also worth noting that this science doesn’t just apply to would-be blockbusters. Independent filmmakers could use the data collected in their genre to learn what kinds of lighting and colors are common in the most compelling scenes, what equipment was used, and what character arcs get the best response. If that information can inspire adjustments that improve a film and lead to, say, 15% more people paying to see it, it can make the difference between a project succeeding and failing.

Interconnection fuels new science

This emerging media and entertainment industry science is going to function best with, and sometimes completely depend on, superior interconnection – the private data exchange between businesses.

For instance, to offer efficient and substantive insight from the huge stores of data being collected about successful movies, AI needs direct and secure interconnection between data sources and machines, as well as access to cloud applications for various functions (data analytics, storage, etc.). IT architectures can’t be bottled up in corporate data centers, far from users and counterparties. Media and entertainment companies relying on AI need to get close to where data is being exchanged at the digital edge.

It’s the same story for IoT-based functions that would collect data and read and respond to audiences in real time. Movie theaters everywhere could become network endpoints in this world, as could a web of individual smartphones in living rooms offering viewer data. Again, distance between companies and counterparties can lead to inferior performance and slower results, and interconnection at the digital edge eliminates that weakness.

Equinix’s global interconnection platform, Platform Equinix™, covers 52 markets on five continents. That reach, combined with Interconnection Oriented Architecture™ (IOA™) best practices for deploying IT at the digital edge, can put this developing science for the media and entertainment industry on firm footing. Hopefully, that leads to art that has us all laughing, crying and gripping the sides of our chairs a little more often.

To learn more about how interconnection can help M&E companies transform in a digital age, check out the Content and Digital Media Playbook.
