'Everything Everywhere' Used AI From This Startup. I Went to Its Film Festival
Runway is powering thousands of emerging filmmakers in ways that are breathtaking, fast-improving and sure to unnerve
As the father of three young kids, I’m always looking for any excuse to get out of the house. So when I received an invitation to the Runway AI Film Festival 2024 last week in Los Angeles, I was thrilled for more than one reason.
Runway, an AI startup whose software suite covers everything from text-to-video generation to more than 20 post-production effects for filmmaking, is perhaps best known for its use by a team of six visual artists enlisted by the directing duo Daniels on Everything Everywhere All At Once. Today it’s used by editors on The Late Show with Stephen Colbert, for example, to turn around shots in minutes that would otherwise take hours.
The five-year-old startup, which was reportedly valued at $1.5 billion last summer, rented out the Orpheum Theater on May 1 to showcase ten 10-minute short films selected by a jury of directors, filmmakers and AI experts. Among the jurors was Wolfgang Hammer, former CEO of CBS Films. The filmmakers did not have to use Runway’s products, but they did have to incorporate some AI tools into their production. No film was just a bunch of AI mush mashed together.
This is Runway’s second annual festival but, tellingly, the first one it has staged in L.A. And it was packed . . . with tech bros and some industry veterans; chatter circulated that Natasha Lyonne was in the crowd (though I did not see her). The vibes were good, the bar was open and the overall mood about AI was far more positive than I’ve seen in other parts of L.A. I ran into someone from a streamer who’s tired of the old Streaming Wars and excited for the bumpy ride of a new shift in the industry. Can you imagine?
Yes, an AI film festival is obviously a self-selecting audience, but it was a pretty big self-selecting group at that. If I know anything about Hollywood, it’s that no matter how wild a trend might seem, once it catches on, there’s no stopping it. (If you doubt me, think about how long the lines are for Erewhon’s $20 smoothies.)
A couple of days before the event, I spoke with Runway CEO Cristóbal Valenzuela and asked him who was using his company’s products (which range from video-to-video generative AI to features like background removal and image expansion). “Studios are embracing it,” he says, “but adoption is being driven by the new generation [of filmmakers].”
Seeing the festival’s films confirmed that these were highly experimental efforts. Regardless of how you feel about AI, know that they required a great deal of human involvement. Many consisted of very impressive AI-generated or AI-altered images and videos with voiceovers layered on top. I can’t say for sure whether those voiceovers were AI or not, but they definitely sounded human. Many of the films were quite beautiful.
In this article, you’ll learn about:
The state of AI filmmaking right now
The hit TV animator using AI who spoke at the festival
How films used AI in moving ways to advance story
What I saw in terms of Marvel-worthy visual effects
How Runway imagines the future of AI in filmmaking
What this combo of entertainment and tech thinking means for Hollywood
Why AI could be like “therapy” to creators in the idea stage
What this means for jobs