How AI is used in filmmaking

2022-05-14 | By Ms. SCD Cassie

In 1975, George Lucas founded the visual effects company Industrial Light & Magic (ILM). Two years later, he used a computer-controlled camera system to shoot the space opera Star Wars: A New Hope. ILM built the system and coupled it with custom-built processors to replicate camera movements. Lucas said the system had “a very powerful impact on the storytelling” and gave him “creative freedom”. However, compared to today’s sophisticated tech, Lucas’ system was pretty basic. Marvel villain Thanos’ origin story is a good case in point. The tech team put tracking dots on Josh Brolin’s face to capture his expressions. The data was then used to train an AI model, and voila! Thanos was born.

Today, AI is used in all stages of film production. Let’s zoom in.

In 2016, an AI wrote the script for the 10-minute short film Sunspring. The AI, called Benjamin, used a recurrent neural network trained on film scripts from the 1980s and 1990s to generate the screenplay. The movie, starring Thomas Middleditch of Silicon Valley fame, made the top ten at the Sci-Fi-London film festival. Since then, the same AI model has created two more films.
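Benjamin’s approach boils down to learning, from a corpus of scripts, which character is likely to come next, then sampling one character at a time. As an illustrative stand-in for the RNN (which would need a training run far too heavy for a snippet), the sketch below uses a simple character-level Markov chain: the lookup table replaces the network, but the autoregressive sampling loop is the same idea. The corpus and seed here are made up for the example.

```python
import random
from collections import defaultdict

def train_char_model(corpus, order=3):
    # Map each `order`-character context to the characters that follow it
    # in the corpus -- the counting analogue of training a language model.
    model = defaultdict(list)
    for i in range(len(corpus) - order):
        model[corpus[i:i + order]].append(corpus[i + order])
    return model

def generate(model, seed, length=60, rng=None):
    # Autoregressive decoding loop: predict the next character from the
    # current context, append it, slide the window forward. An RNN
    # decoder runs this same loop with a network instead of a table.
    rng = rng or random.Random(42)
    order = len(seed)
    text = seed
    for _ in range(length):
        followers = model.get(text[-order:])
        if not followers:
            break  # unseen context: stop generating
        text += rng.choice(followers)
    return text

corpus = ("INT. SPACESHIP - NIGHT\n"
          "He looks at the stars. She looks at the console.\n"
          "He takes the map. She takes the helm.\n")
model = train_char_model(corpus, order=3)
print(generate(model, seed="He "))
```

The output reads like scrambled screenplay fragments, which is roughly the level Sunspring’s dialogue operates at: locally plausible, globally incoherent.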

In 2019, comedian and writer Keaton Patti used an AI bot to generate a Batman movie script. He posted the first page of the hilarious script, and it went viral.

I forced a bot to watch over 1,000 hours of Batman movies and then asked it to write a Batman movie of its own. Here is the first page. pic.twitter.com/xrgvgAyv1L

In 2020, Calamity AI used Shortly Read, an AI tool built on GPT-3, to write the screenplay for a three-and-a-half-minute short film.

The rapid advances in NLP and large language models have helped filmmakers make headway on scriptwriting. However, we are yet to see a full-length feature written entirely by AI.

Filmmakers today use CGI to bring dead actors back to life on screen. For example, two beloved Star Wars characters, Carrie Fisher (Princess Leia) and Peter Cushing (Grand Moff Tarkin), were recreated in Rogue One: the makers used CGI to make the actors look exactly as they did in the 1977 Star Wars: A New Hope. Carrie Fisher died before completing her scenes for Episode IX: The Rise of Skywalker, and CGI was used to complete her story. In the Fast and Furious franchise, the late Paul Walker was virtually recreated to finish his scenes.

Cinelytic is an AI-based startup that helps studios and independent film companies make faster and smarter decisions across the film’s value chain. Warner Bros has partnered with Cinelytic to implement an AI-based project management system. The platform provides analytics, scheduling, and financial modelling services.

Belgium-based ScriptBook offers an AI-based script analysis and financial forecasting tool that analyses screenplays and recommends whether or not to produce them. ScriptBook has built an AI algorithm to standardise and automate the process; however, the company says the algorithm is not a replacement for a human decision-maker. ScriptBook gives a detailed breakdown of a script, including the genre, age restrictions, MPAA (Motion Picture Association of America) rating, and related films. The platform also provides scene analysis, character attractiveness scores, the emotions evoked by each scene, and a gender-equality measurement based on the Bechdel test.
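ScriptBook does not publish its internals, but one of the checks it reports, the Bechdel test, is simple enough to sketch: a script passes if at least two named women share a conversation about something other than a man. The scene format, word list, and character names below are hypothetical, and a real system would need coreference resolution rather than keyword matching.

```python
# Crude lexicon for "talking about a man" -- a real analyser would use NLP.
MALE_WORDS = {"he", "him", "his", "man", "men", "boyfriend", "husband"}

def passes_bechdel(scenes, female_characters):
    """Heuristic Bechdel check: two named women share an exchange
    whose dialogue never references a man.

    `scenes` is a list of (speakers, dialogue) pairs."""
    for speakers, dialogue in scenes:
        women = [s for s in speakers if s in female_characters]
        words = {w.strip(".,!?").lower() for w in dialogue.split()}
        if len(women) >= 2 and not (words & MALE_WORDS):
            return True
    return False

scenes = [
    ({"Ann", "Beth"}, "Did you finish the engine repairs?"),
    ({"Ann", "Carl"}, "He left an hour ago."),
]
print(passes_bechdel(scenes, {"Ann", "Beth"}))  # → True
```

The first scene satisfies all three conditions, so the script passes; drop that scene and the check fails, since the remaining exchange involves only one woman.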

Vault’s RealDemand AI platform analyses thousands of key elements of a film’s story, outline, script, casting, and trailer, factoring in release date, country, audience age and more, to help maximise ROI up to 18 months before the film’s release.

“Once you overcome the 1-inch tall barrier of subtitles, you will be introduced to so many more amazing films,” said Korean auteur Bong Joon Ho at the 2020 Golden Globes. The film industry is leveraging the advancements in universal translation to increase the reach of movies.

For example, Star Wars has been translated into more than 50 languages to date. However, you still need humans in the loop to ensure the subtitles are accurate. 

Researchers from the University of Edinburgh created an AI model based on a pair of neural networks that can generate engaging trailers. The team used the system to generate more than 40 trailers for existing films, and Amazon Mechanical Turk workers preferred the AI-generated trailers over the official ones.

“To create trailers automatically, we need to perform low-level tasks such as person identification, action recognition, and sentiment prediction, as well as higher-level tasks such as understanding connections between events and their causality, as well as drawing inferences about the characters and their actions,” according to the paper.

The researchers combined two neural networks. The first examines the film’s video and audio to identify scenes of interest. And the second is essentially the judge of what is interesting. It watches a textualised version of the film, similar to a screenplay, and uses natural language processing to identify significant and emotional moments. Based on how the neural networks process the input data, the completed model generates novel trailers using “movie understanding.”
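The final step the paragraph describes, turning two per-scene signals into a trailer, can be sketched as a ranking problem. The snippet below is a simplified illustration, not the Edinburgh system: it assumes the two networks have already produced an audiovisual-interest score and a sentiment score per scene (the field names and numbers are invented), combines them, and keeps the top-scoring shots in story order.

```python
def select_trailer_shots(scenes, k=3):
    """Pick trailer shots from scored scenes.

    Each scene dict carries `t` (position in the film), `visual`
    (audiovisual-interest score from the first network) and `sentiment`
    (emotional-significance score from the second)."""
    # Rank by the combined score of the two networks' outputs...
    ranked = sorted(scenes,
                    key=lambda s: s["visual"] + s["sentiment"],
                    reverse=True)
    # ...then restore chronological order so the trailer stays coherent.
    return sorted(ranked[:k], key=lambda s: s["t"])

scenes = [
    {"t": 0, "visual": 0.2, "sentiment": 0.1},
    {"t": 1, "visual": 0.9, "sentiment": 0.7},
    {"t": 2, "visual": 0.4, "sentiment": 0.9},
    {"t": 3, "visual": 0.8, "sentiment": 0.2},
]
print([s["t"] for s in select_trailer_shots(scenes, k=2)])  # → [1, 2]
```

A simple additive score stands in for whatever fusion the real model learns; the point is that shot selection and shot ordering are separate decisions.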



© Analytics India Magazine Pvt Ltd 2022