Putting the AI in cinema: MIT’s filmmaking hackathon

A collage of screenshots from films that use artificial intelligence to assist with production

Space dogs, the perfect scoop of ice cream, and the Infinite Corridor headlined a night of AI-generated artistry during MIT’s first-ever “AI for Film Making Hackathon” this past January. During the event, creatives and coders teamed up to incorporate artificial intelligence into short films, then showcased their work.

“Everyone has a tale to tell, and with the advancements in AI technology, individuals now have the means to bring their dreams to life,” says MIT Department of Electrical Engineering and Computer Science (EECS) and CSAIL PhD student Shangdi Yu. She helped organize the hackathon as a co-founder of the MIT Film Makers Association, aiming to “bring together enthusiasts in film-making and storytelling to explore new ways of visualizing and bringing people's dreams to life, as well as provide a platform for people to showcase their imaginative and unique ideas."

Yu was one of six event organizers from the MIT Film Makers Association, each from a different MIT department: Ruihan Zhang of the Media Lab, Yiming Zhang of Biological Engineering, Zhibo Chen of Aeronautics and Astronautics, Songchen Tan of Computational Science and Engineering, and Honghao Cao of the EECS Research Laboratory of Electronics.

The collective tasked participants with a unique challenge: produce a one-to-two-minute video within nine hours that incorporated AI technologies into the filmmaking process. Using state-of-the-art programs like Stable Diffusion, Midjourney, EbSynth, and ChatGPT, each production team delivered an impressive cinematic project on the theme of telling content creators’ dreams.

“DOG: Dream Of Galaxy” won best overall film, patching together colorful, AI-generated moving images to tell the story of a dog selected to travel to space. During the video, an anonymous man poses a thought-provoking question to the canine cadet: “Do you want to dream a different dream?” Speaking of dreams, “The Scoop” follows a team of researchers on a mission to create the perfect scoop of ice cream using liquid nitrogen. The film features CSAIL’s rockstar mini cheetah robot, which walked and bounced around the hallways in a stellar cameo.

Plenty of stars are visible across the galaxy of “Dreams Across the Stars,” as well as aliens, dancing spacemen, and other out-of-this-world visuals generated by DALL-E and Midjourney. The hypnotic “Infinity Corridor” is similarly eye-catching while also including space imagery. The hackathon’s most-viewed video is “Once upon a spacetime,” which neatly fuses elements of sci-fi westerns and dreamlike AI art.

The contest’s judges included decorated media experts: television producer and director Greg Daniels, whose work on shows like Saturday Night Live and The Simpsons has earned him five Emmy awards, and film editor, television director, and producer David Rogers, who has won two Emmys. MIT Media Lab researcher Pat Pataranutaporn, Brown University Assistant Professor in AI James Tompkin, production designer and set decorator Rachel O'Toole, and Ben Relles, Office of Reid Hoffman content strategist and former YouTube Head of Comedy, rounded out the panel. Judges evaluated the films on creative execution and on how well AI techniques were used to facilitate artistry, cost-effectiveness, and content creation.

Ruihan Zhang noted that the group was inspired to hold the hackathon after taking CSAIL principal investigator Vincent Sitzmann’s course, “Machine Learning for Inverse Graphics,” where the students learned about neural radiance fields. These machine-learning models can predict 3D scenes from a single 2D image, making them useful throughout the filmmaking process.

Just as the organizers had hoped, the AI hackathon showcased the creativity and resourcefulness of MIT students. With this in mind, the MIT Film Makers Association plans to hold the event again next year.