Behind the Work in association with The Immortal Awards

How AI Was Used to Show Facebook’s Metaverse Future with a Rainforest Rave

06/12/2021
Object & Animal director Andrew Thomas Huang, Droga5 and the VFX team at Electric Theatre Collective speak to LBB’s Ben Conway about how they used AI and puppetry to turn a classic painting into a rumble in the jungle for Facebook’s ‘Meta’ rebrand


The word ‘meta’ is the newest and trendiest buzzword across many industries and walks of life - nowhere more so than in Silicon Valley and other global tech hubs. In a move that surprised some - though perhaps not the tech-heads out there who have watched interest in the ‘metaverse’ grow - Facebook rebranded itself as ‘Meta’, with hopes of becoming the industry leader in developing the metaverse.

To accompany this sudden and seismic rebranding of one of the most recognisable and successful tech companies of the internet age, Droga5, Object & Animal and Electric Theatre Collective partnered with Meta to produce a spot that introduces the audience to what Meta describes as “the next evolution of social connection”.

The spot follows a group of friends in a museum who discover an Henri Rousseau painting (‘Fight Between a Tiger and a Buffalo’) that comes to life - inviting them into a “dimension of imagination” with CG and AI-generated flora and fauna that dance to a rave soundtrack. The animal puppets were designed by Sonny Gerasimowicz - the designer of Spike Jonze’s ‘Where the Wild Things Are’ creatures - and brought to life by a combination of green screens, procedurally generated CG techniques and AI technology.

LBB’s Ben Conway spoke with Object & Animal director Andrew Thomas Huang, Droga5 ECD Thom Glover and the VFX team at the Electric Theatre Collective (ETC) about how you communicate the concept of the ‘metaverse’ when it’s still being created, working on such a significant rebranding project and making flamingos twerk. 



LBB> Andrew, how early in the process did you get involved? And how involved were you in the creative process with Droga5?

Andrew> The creative team at Droga5 came to me with a brief explaining the narrative about college students who become immersed in the world of Henri Rousseau. I wrote my interpretation of the story by imagining the world of Rousseau brought to life using puppetry and AI generative adversarial network (GAN) technology.

We worked quite intimately with Droga5 to build the story and the universe together - everything including casting choices, wardrobe and set design, along with the visual effects techniques and animal designs.


LBB> Did Meta have a brief or a strong idea for the direction, or did you have lots of freedom?

Andrew> Meta’s ultimate vision was to introduce a flat two-dimensional world and expand it to an immersive three-dimensional experience in the metaverse. With this thread in mind, and keeping within the language of Rousseau’s paintings, I felt I had a fair amount of creative flexibility within those parameters.


LBB> As this film launches Facebook’s rebranding, does this present opportunities to do something a bit different or unusual? Does it also come with added pressure?

Andrew> The stakes were certainly high with Meta’s first new brand film. I definitely wanted to do something unusual, which is why we chose puppets to integrate with AI and ended up with a unique result.

 

LBB> How do you visually communicate ‘the metaverse’? For many, it’s still quite a new and difficult-to-understand concept - so how did you represent it in the film?

Andrew> We tried to tell a simple story about protagonists starting in one world and stepping into an expanded colourful reality. To tell the story, we focused on the idea of immersion, three-dimensionality and agency in a new space. The metaverse is still being built, so we felt we had the freedom to be imaginative.  


LBB> Thom, from Droga5’s perspective, why was the decision made to use an Henri Rousseau painting? 

Thom> Henri Rousseau’s work is very evocative. It’s got a kind of magical children’s-picture-book quality to it that makes you want to explore the world of it. But it’s also really flat and two-dimensional, with a really strange perspective. The combination of these things made it perfect for an exploration of: ‘What would it be like in 3-D?’


LBB> The puppet/animal design is fantastic - could you talk about Sonny Gerasimowicz and his designs for this spot?

Andrew> Sonny Gerasimowicz designed the characters in Spike Jonze’s Where the Wild Things Are. He was a pleasure to work with, and his designs gave so much life, humour and character to the animals in this universe, combining Rousseau’s sensibilities and his own. I loved working with him.



LBB> Could you talk us through how you brought the jungle and the animals to life? How did Electric Theatre Collective get involved? 

Andrew> I reached out to puppeteer Michelle Zamora, who runs a collective called Viva La Puppet in Los Angeles, to fabricate the animals and bring Sonny’s designs to life. We filmed the puppets on a green screen and then worked with Electric Theatre Collective to design an AI workflow, using Rousseau’s paintings as source information for machine learning. The end results you see are the live-action animals treated with AI technology to render them in the style of Rousseau’s paintings. 

ETC used a Python script to generate source material for the AI to aggregate and ‘learn’ what Henri Rousseau’s painting style looked like. As the AI gathered more source material, it was able to take any footage we shot and graft Rousseau’s painterly style onto it. In effect, we told the computer to ‘make like Rousseau’ via the Python script so the AI could translate our footage.
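ETC’s tools themselves aren’t public, but a minimal sketch of this kind of style-learning step - assuming the classic Gatys-style neural style transfer in PyTorch, with hypothetical file paths - might look like this:

```python
# Hedged sketch of per-frame painterly style transfer, in the spirit of
# ETC's 'make like Rousseau' step. Illustrative only; not ETC's pipeline.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pre-trained VGG19 as a fixed feature extractor.
vgg = models.vgg19(pretrained=True).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def load(path, size=512):
    img = Image.open(path).convert("RGB").resize((size, size))
    return transforms.ToTensor()(img).unsqueeze(0).to(device)

def features(x, layers=(0, 5, 10, 19, 28)):   # conv1_1 .. conv5_1
    feats = []
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in layers:
            feats.append(x)
    return feats

def gram(f):
    # Gram matrix captures the 'style' statistics of a feature map.
    b, c, h, w = f.shape
    f = f.view(c, h * w)
    return f @ f.t() / (c * h * w)

content = load("frame_0001.png")      # hypothetical shot frame
style = load("rousseau_tiger.jpg")    # hypothetical Rousseau reference

style_grams = [gram(f).detach() for f in features(style)]
content_feats = [f.detach() for f in features(content)]

output = content.clone().requires_grad_(True)
opt = torch.optim.Adam([output], lr=0.02)

for step in range(300):
    opt.zero_grad()
    out_feats = features(output)
    content_loss = F.mse_loss(out_feats[2], content_feats[2])
    style_loss = sum(F.mse_loss(gram(o), g)
                     for o, g in zip(out_feats, style_grams))
    (content_loss + 1e4 * style_loss).backward()
    opt.step()
```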

ETC> The jungle scenes were all created using procedural plant generation tools in Houdini. Many of the original tools were designed to create real-world plant structures, so a lot of the upfront R&D was in styling the output of these systems to deliver recognisably Rousseau-style vegetation. This was about acquainting ourselves with the subject matter and picking out the signature styles from a collection of Rousseau's paintings. Because we were relying heavily on AI to contribute to the final look, beyond the layout and modelling of the jungle, our rendering style was fairly basic, as we found that the AI responded better to simple, recognisable shapes and colours.
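For a flavour of what procedural plant generation looks like in practice, here is a hypothetical few lines of Houdini’s Python API (hou), scattering a simple, stylised leaf shape across a ground plane. Node names and values are illustrative, not ETC’s actual setup, and this assumes a running Houdini session:

```python
# Hypothetical sketch of procedurally scattering stylised foliage in Houdini.
import hou

geo = hou.node("/obj").createNode("geo", "rousseau_jungle")

# Ground plane to scatter plants onto.
grid = geo.createNode("grid")
grid.parmTuple("size").set((40, 40))

# Scatter points where each plant instance will sit.
scatter = geo.createNode("scatter")
scatter.setInput(0, grid)
scatter.parm("npts").set(500)

# A single stylised leaf blade: a simple tapered tube. Simple,
# recognisable shapes suited the downstream AI pass better.
leaf = geo.createNode("tube")
leaf.parm("rad1").set(0.0)   # taper to a point, like Rousseau's blades

# Copy the leaf onto every scatter point.
copy = geo.createNode("copytopoints::2.0")
copy.setInput(0, leaf)
copy.setInput(1, scatter)
copy.setDisplayFlag(True)
```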

Many of the animals were live-action puppets, but for the toucans and snakes, we had to build new computer-generated creatures within the same style parameters set down by Sonny in his concept designs, and then animate them as if they too were puppets. We used a combination of key-framed and simulated movement to keep the CG animals feeling physical and in sync with the other puppets.

All the puppets were shot on a green screen. Keying and rotoscoping them off to combine with backgrounds would normally be fairly standard stuff, but in this case the addition of AI complicated things, as the edges of the puppets changed constantly and unpredictably with each AI iteration. So the project became a multistage process to keep all the elements usable and blended together within the same painterly style.
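One plausible way to tame that per-frame edge flicker - offered here as an illustrative guess, not ETC’s actual method - is to median-filter each puppet matte across a small temporal window:

```python
# Hedged sketch: steady AI edge jitter on keyed mattes by taking a
# temporal median of the alpha over neighbouring frames. File paths
# and window size are illustrative.
import glob
import os
import cv2
import numpy as np

matte_paths = sorted(glob.glob("mattes/puppet_*.png"))  # hypothetical
mattes = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in matte_paths]
os.makedirs("mattes_steady", exist_ok=True)

WINDOW = 2  # frames on each side; wider = steadier but smearier edges

for i, path in enumerate(matte_paths):
    lo, hi = max(0, i - WINDOW), min(len(mattes), i + WINDOW + 1)
    stack = np.stack(mattes[lo:hi]).astype(np.float32)
    # Median across time suppresses per-frame AI edge jitter while
    # keeping genuine motion, which persists across frames.
    steady = np.median(stack, axis=0).astype(np.uint8)
    cv2.imwrite(path.replace("mattes/", "mattes_steady/"), steady)
```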


LBB> Could you elaborate a little more on the “machine-learning artificial intelligence techniques”? And did they pose any interesting challenges?

ETC> There is a huge wealth of knowledge and resources out there in the public domain regarding AI generative tools. Our challenge was to find suitable techniques that could not only deliver the creative results required but also be controllable enough to work to a brief and be scalable enough to work within a VFX pipeline. We applied a mixture of processes from GAN Python libraries to new machine learning tools built into the latest versions of our compositing software. Some AI tools attempt to copy given source images, and others take simple text input to generate original content based on pre-trained machine learning models. 

In the end, our most promising results came from an algorithm that simply took a text input, and we fed the line ‘A Henri Rousseau painting’ into the prompt. It sounds almost too simple to be true, but there was a lot more to it to create output that worked across all the different scenes.
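ETC doesn’t name the algorithm, but text-prompt-driven image generation of this era typically meant CLIP-guided optimisation. A toy sketch of the idea, using OpenAI’s CLIP library (github.com/openai/CLIP) and skipping the usual image normalisation for brevity, might look like this:

```python
# Hedged sketch: optimise an image so CLIP scores it as matching the
# prompt 'A Henri Rousseau painting'. A guess at the class of technique,
# not ETC's actual tool.
import torch
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("ViT-B/32", device=device)
model = model.float()  # avoid fp16/fp32 mismatches on GPU

text = clip.tokenize(["A Henri Rousseau painting"]).to(device)
with torch.no_grad():
    target = model.encode_text(text)
    target = target / target.norm(dim=-1, keepdim=True)

# Start from noise here; in production you would start from the
# shot footage rather than a random image.
image = torch.rand(1, 3, 224, 224, device=device, requires_grad=True)
opt = torch.optim.Adam([image], lr=0.05)

for step in range(200):
    opt.zero_grad()
    emb = model.encode_image(image.clamp(0, 1))
    emb = emb / emb.norm(dim=-1, keepdim=True)
    loss = -(emb * target).sum()   # maximise CLIP similarity to the prompt
    loss.backward()
    opt.step()
```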

 

LBB> How was the production process? Were there separate shoots with the puppets and the ‘museum’ cast?

Andrew> From pitch to completion, it took about six weeks (an extremely accelerated timeline). We built the gallery on a sound stage in LA and shot the ‘real world’ with actors and the puppets on green screen, over the course of three production days. The cast was a delight to work with, and we played ‘90s rave music to get them in the spirit of the film. The puppeteers were also fantastic to work with, and the puppets Michelle and her team made kept us all laughing on set. It was a blast.




LBB> Is there anything you learned from the project?

Andrew> This was my first time using machine learning to create an aesthetic for a spot, and I was amazed and spooked by the accuracy of the AI when it was given simple text commands to recreate the style of an artist making work over 100 years ago. I also learned that, with the right rigging, you can make puppets do anything, including making a flamingo twerk. 

ETC> We learned many new and interesting things about the world of AI generative content creation that don’t normally factor into our VFX work, where most challenges have well-established workflows and paths to success. Taking on a creative project using the power of AI to deliver the results, in such a tight turnaround, takes a huge leap of faith and collaboration from all involved. Everyone had to be open to the idea that there would be some things we could control and others we could not, and to embrace the happy accidents. The director and creative team were incredibly supportive and collaborative on this journey.


LBB> Everything in the ‘metaverse’ painting seems to be vibrating and moving with the beat of the song - was this intentional and how did you achieve this?

Andrew> Yes, this was intentional. I wanted the whole universe to feel alive, chattering and bouncing. The AI was intentionally designed to give the jungle that painterly frame-by-frame boiling effect, similar to ‘80s rotoscope techniques in Enya and Pat Benatar music videos. We spent time making the CG bananas, leaves and trees move to the music along with the animals. 
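That ‘boiling’ look can be faked quite simply: re-seed a small random warp on every frame so that even static areas shimmer. A hypothetical NumPy/OpenCV sketch, with illustrative parameter values:

```python
# Hedged sketch of a frame-by-frame 'boil': a fresh low-frequency warp
# per frame makes a still image shimmer like hand-painted rotoscope cels.
import cv2
import numpy as np

frame = cv2.imread("styled_frame.png")   # hypothetical stylised frame
h, w = frame.shape[:2]

for f in range(24):                       # one second at 24 fps
    rng = np.random.default_rng(seed=f)   # new jitter pattern each frame
    # Low-frequency displacement field, a few pixels in amplitude.
    small = (h // 32, w // 32)
    dx = cv2.resize(rng.normal(0, 2, small).astype(np.float32), (w, h))
    dy = cv2.resize(rng.normal(0, 2, small).astype(np.float32), (w, h))
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    boiled = cv2.remap(frame, xs + dx, ys + dy, cv2.INTER_LINEAR)
    cv2.imwrite(f"boil_{f:04d}.png", boiled)
```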

 

LBB> Were you involved in the editing process? What was the style and creative direction in terms of editing?

Andrew> Yes, we worked with editor Andy McGraw at Cartel to craft the edit for the film. He was a pleasure to work with, and we worked closely together to design the film into a three-act journey of discovery, immersion and final climax.

 

LBB> What was the most difficult challenge you faced on this project?

Andrew> I think designing puppets in such a short time and creating the AI look was unprecedented for us. Combining a traditional technique with a new one, in which there is a chaos factor, was risky but rewarding in the end.

ETC> There were two significant challenges. The first was taking these nuggets of AI wizardry, often intended to be run as curiosities on simple websites generating individual low-resolution images, and building them into seamless tools in our VFX pipeline, so that artists who knew little to nothing about coding and machine learning could operate them easily and at scale using our in-house computing power.

The other challenge was taking the output of these processes and applying traditional compositing techniques to blend CG-generated and live-action content, while preserving - yet controlling - the irregularities introduced by the AI.


LBB> Do you have anything else to add about this project?

Andrew> It was a delight to work with the creative team at Droga5 and Meta marketing and creative. Hats off to them, specifically Mike Hasinoff, Thom Glover, George and Tom McQueen, Sarah Karabibergain, Benjamin Hinamanu and Deepa Joshi. It was also a real pleasure to work with ETC’s lead team: Simon French, Ryan Knowles, Dan Yargaci and Antonia Vlasto.

Additionally, I want to give a shout-out to my team at Object & Animal: EPs Emi Stewart, James Cunningham, Morgan Clement, Dom Thomas and Justin Benoliel, plus my terrific producer Stine Moisen, editor Andy McGraw at Cartel, music supervisor Sunny Kapoor at Curation Music, sound designer/mixer Ed Downham at King Lear and my crew: DP Andrew Yuyi Truong and production designer Evaline Wu Huang.


