The Hitmakers: How AI in Hollywood Is Picking the Next Big Movie
StoryFit and Slated are using data and technology to help producers determine what movies will be a big hit at the box office. Let’s dive in to find out more about how AI in Hollywood is making a big difference in the movie scene.
“NOBODY KNOWS ANYTHING.”
Those three words, from Academy Award-winning screenwriter William Goldman, tell you a lot about the entertainment business. Or at least they used to. “Not one person in the entire motion picture field knows for a certainty what’s going to work,” said the writer of All the President’s Men and Butch Cassidy and the Sundance Kid. “Every time out, it’s a guess, and, if you’re lucky, an educated one.”
His words apply to just about all forms of entertainment. The formula for ensuring a work of art achieves commercial success has remained a mystery.
It’s a mystery, however, that could finally be solved by artificial intelligence. Perhaps an algorithm will help produce the next movie you see at the theater or publish the next book you pick out to read.
That’s what a pair of companies are hoping as they use AI and analytics to help the people who create movies, books, and television shows make better decisions about which proposed projects are likely hits.
Slated and StoryFit use machine learning to give producers better information. Both companies work with publishers, studios, networks, and film financiers to inject data-driven decision-making into industries that have long relied on judgment calls and gut feelings.
CREATING BETTER SCRIPTS
Each company offers guidance on scripts in progress, but with slightly different approaches.
Austin-based StoryFit focuses on core narrative elements that have been proven to resonate with audiences. The company looks for more than 100,000 factors within a book or script, then compares those to its catalog of more than 35 years of similar material. The result is a report highlighting the proposed project’s best features, like strong dialogue or relationship building, while pointing out areas that need work, like a character that doesn’t connect to the rest of the story.
By identifying the unique qualities of each script, StoryFit takes some of the risk out of movie executives’ decisions. Filmmakers can then focus on highlighting the best parts of a particular story while minimizing its flaws.
Slated, based in the heart of the film industry in Los Angeles, expands on the traditional “coverage” step of screenplay production to create its own measure of a script’s creative value. Coverage is like a book report for a screenplay. It’s a write-up that includes a summary of the plot and an evaluation of key elements like characterization, storyline, premise, and dialogue. Production companies hire script readers who do coverage; then executives use that to make decisions about screenplays.
By adding a scoring system to the coverage stage, Slated digs deep into 11 specific attributes of each script, including premise, originality, logic, tone, and craft. Three people trained in Slated’s methodology read each script, write a paragraph about each attribute, and score it on a 1-5 scale. Each script emerges with a Script Score along with an overall Pass/Consider/Recommend rating.
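The aggregation described above can be sketched in a few lines: three readers each rate 11 attributes on a 1-5 scale, the ratings are averaged into a Script Score, and the score maps to an overall rating. This is an illustration only, not Slated's actual formula; the article names just five attributes (premise, originality, logic, tone, craft), so the remaining six attribute names and the Pass/Consider/Recommend thresholds below are invented.

```python
# Hypothetical sketch of a Slated-style Script Score.
# Only the first five attributes are named in the article; the rest,
# and the rating thresholds, are assumptions for illustration.
from statistics import mean

ATTRIBUTES = ["premise", "originality", "logic", "tone", "craft",
              "characterization", "dialogue", "structure", "pacing",
              "theme", "commercial appeal"]

def script_score(reader_ratings: list[dict[str, int]]) -> float:
    """Average each attribute across readers, then average the attributes."""
    per_attribute = [mean(r[a] for r in reader_ratings) for a in ATTRIBUTES]
    return round(mean(per_attribute), 2)

def overall_rating(score: float) -> str:
    """Map the numeric score to an overall recommendation (invented cutoffs)."""
    if score >= 4.0:
        return "Recommend"
    if score >= 3.0:
        return "Consider"
    return "Pass"

# Three readers, each submitting one rating per attribute.
readers = [{a: 4 for a in ATTRIBUTES},
           {a: 3 for a in ATTRIBUTES},
           {a: 5 for a in ATTRIBUTES}]
s = script_score(readers)
print(s, overall_rating(s))  # 4.0 Recommend
```

In practice the number matters less than the 12 pages of defended notes that accompany it, as Scoggins notes below.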
Slated President Jason Scoggins created an early version of the Script Score at his previous company and has heard criticism from creative types that their work is being reduced to a number.
“We used to hear that fairly frequently,” he said. “But the product that we’re delivering is not just a number. It’s what amounts to 12 pages of notes from three different readers stating their opinion and defending it. Basically, north of 99 percent of writers go, ‘Yep. While I don’t necessarily agree with every single one of these comments, I can at least respect that it’s someone’s opinion because it’s so completely defended.’”
Monica Landers, CEO of StoryFit, got a similar reaction when the company first began showing its reports to studios. They were defensive when first hearing the idea, but the results changed their minds.
“I did get that pushback before. But I have never heard that after we’ve delivered our information to someone,” Landers said. “We are here to support the process and just do something that they can’t do. This level of technology is not going to exist in a media company right now, and so we just want to deliver something that maybe they can’t do to help support their process.”
FINDING HIDDEN GEMS
Just as technology can help at the beginning of the production process, StoryFit has also found a way to contribute after a work has been released. Landers said they’re drawing a lot of attention from clients in the publishing industry who have huge backlogs of material but no good way of bringing them back to the public’s attention.
It turns out that AI for movie analysis is well-suited to this task. By scanning the text of the story, StoryFit uses natural-language processing to pull out the key elements important to readers. That includes not only genre, plot, and characters, but also nuanced details like time period, settings, and tone. They then filter the resulting list of keywords through Google Trends, Amazon best practices, and even dictionaries to create a marketing guide that publishers can use to promote similar titles.
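The extraction step described above can be illustrated with a toy example: pull frequent non-stopword terms from a story's text, and flag simple signals such as era-specific vocabulary. This is a minimal sketch of the general technique (term frequency plus lookup tables), not StoryFit's actual pipeline; the stopword list and era table are invented.

```python
# Toy keyword extraction in the spirit of StoryFit's reports:
# frequent terms plus simple era signals. Not the real pipeline.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "was", "his", "her", "it"}
ERA_TERMS = {"musket": "18th century", "telegram": "19th-20th century",
             "smartphone": "contemporary"}  # invented lookup table

def keywords(text: str, top_n: int = 5) -> list[str]:
    """Return the most frequent non-stopword terms in the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS and len(t) > 2)
    return [word for word, _ in counts.most_common(top_n)]

def era_signals(text: str) -> set[str]:
    """Flag time periods suggested by era-specific vocabulary."""
    tokens = set(re.findall(r"[a-z']+", text.lower()))
    return {era for term, era in ERA_TERMS.items() if term in tokens}

sample = ("Hamilton read the letter by candlelight. The letter demanded a duel, "
          "and Hamilton loaded his musket at dawn.")
print(keywords(sample))
print(era_signals(sample))  # {'18th century'}
```

A production system would feed lists like these into trend and market data, as the article describes with Google Trends.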
For example, when the Broadway hit musical Hamilton stormed pop culture in 2015, publishers were sitting on tons of previous books (or even chapters of books) about Alexander Hamilton and the Founding Fathers. But they rarely had the in-house knowledge or a reliable database to locate them all. StoryFit uses metadata technology to find all of these related mentions, giving publishers’ existing material new life.
“What this means is that IP (intellectual property) isn’t dead once it’s published and has been out for a couple of months,” Landers said. Imagine a new paleo diet cookbook being created by pulling recipes from previously published material, or textbooks that ensure all the material matches a particular reading level. “It means you can pull stories out of the past. You can put a new cover on it, and if the IP is fully owned, you can mix and match.”
IMPROVING FINANCIAL DECISIONS
But even the best stories won’t find an audience if there isn’t money behind them. And investing in movies, particularly independent productions, can be a dicey proposition. Scoggins believes Slated has found an effective way to match filmmakers and their projects with film financiers who can push those projects forward.
Slated’s scoring system judges the strength of a movie project based on the track record of the team (Team Score), the quality of the screenplay (Script Score), and the likelihood the project can recoup its production costs (Financial Score).
To test the correlation between Slated’s scores and box office performance, the company put together a portfolio of more than 50 movies, including Hacksaw Ridge, Central Intelligence, Bad Moms, Snowden, and the 2016 remake of The Magnificent Seven. It ran its scoring system against each film before its release and then tracked how each performed at the box office.
In the end, the 13 movies with good Slated scores performed 2.2 times better than the entire portfolio. Even better, the movies in the subset with a budget of $20 million or less did 3.5 times better than the group of 50 projects.
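The comparison behind those multiples is simple arithmetic: average the return multiple (gross over budget) of the high-scoring subset and divide it by the same average for the full portfolio. The figures below are made up for illustration; only the method, not the data, is taken from the article.

```python
# Back-of-envelope version of Slated's portfolio comparison.
# All figures are invented; the article reports the high-score
# subset performing 2.2x the full 50-film portfolio.

def avg_multiple(films):
    """Average gross-to-budget multiple over a list of (gross, budget) pairs."""
    return sum(gross / budget for gross, budget in films) / len(films)

portfolio = [(120, 40), (15, 30), (60, 20), (10, 25), (90, 45)]  # (gross, budget), $M
high_scoring = [(120, 40), (60, 20), (90, 45)]  # films with good scores

outperformance = avg_multiple(high_scoring) / avg_multiple(portfolio)
print(round(outperformance, 2))  # 1.5
```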
“This is as close as you can get to a guarantee,” Scoggins said. “What we’ve seen is that instead of one out of 10 investments being a hit and six out of 10 literally not fully recouping, if you use the Slated system, you get three out of 10 hits and only two or three projects not fully recouping.”
Scoggins believes that, with a little adjustment, Slated’s model could be extended to television projects. Its Team Score, which is based on someone’s track record in the business, translates well, as does the Script Score. The biggest difference comes in the Financial Score section because TV’s revenue model differs from that of the movies.
TV is driven by advertising and subscriptions, so a show’s success really hinges on how many episodes are produced. Slated is working on a data model that would predict the number of episodes a given series is likely to generate so that financiers could reliably forecast how well their investment in a show would pay off.
Expansion is on Landers’ mind at StoryFit as well. She said the company is looking at ways to analyze several additional storytelling formats, including short-form writing, podcasts, even email. Stories aren’t going anywhere, she said, and recognizing the elements of each story that are meaningful to particular audiences will always be valuable, no matter the platform.
“The story is the one element of entertainment that hasn’t changed through the centuries,” Landers said. “Only the medium has changed.”
“Somebody knows something” may not be as catchy as Goldman’s original line about Hollywood’s ability to predict hits. But with the type of data that StoryFit and Slated are producing, at least decision-makers may better understand what stories are likely to succeed.
THE NEW MOVIE CRITICS
Not too long ago, if you were deciding what movie to see, you would check the review from your local film critic or see what got two thumbs up from Siskel and Ebert. But with the rise of review aggregators like Rotten Tomatoes and Metacritic, thousands more thumbs now influence your decision.
Those sites essentially assign a number reflecting the percentage of positive reviews across hundreds of critics. A high score on Metacritic or Rotten Tomatoes’ Tomatometer has become an essential tool for marketing today’s movies.
On the other hand, a bad score could sink an otherwise mediocre movie. “Bad RT scores cause greater changes in audience opinion than good RT scores,” Ben Carlson, co-founder of the social media research firm Fizziology, told Wired. “If you have a score under 30, it has a 300 percent greater impact in the volume of review conversations than scores over 70.”
THE GENDER GAP
Women have a long history of being underrepresented in Hollywood, both in front of and behind the camera. StoryFit and Slated each dug into their data to show how stark the inequalities are.
StoryFit analyzed more than 2,000 scripts and 25,000 characters in films dating back to 1930. Over that stretch, male characters spoke 70 percent of the time, even though movies that split the dialogue more evenly between men and women tended to do better at the box office.
The disparities don’t just happen on screen. Slated looked at nearly 1,600 movies released between 2010 and 2015 and found that low-budget films (budgets below $25 million) directed by women were released on one-third as many screens as ones directed by men.
That’s particularly crippling because a successful independent movie can be the gateway to bigger opportunities.