Single-take, dance-focussed music videos are, when well crafted, as much of a treat for the audience as they are a painstaking labour-of-love for the director and team. However, if your dancer is a giant, anime-influenced cat and the surroundings she's scooting around only exist in a computer, things get a bit more tricky.

For his latest music video, director Jake Schreier partnered with The Mill to take his talent for highly-choreographed storytelling to the next level. 

Created for Norwegian musician Cashmere Cat's single Emotions, Schreier and team matched groundbreaking motion-capture technology with choreographer/dancer Margaret Qualley's endearing performance to create a visually stunning promo that follows a playful kitty bouncing around a sun-kissed glade... all captured in real-time.

Cashmere Cat – Emotions


To enable this, the whole shebang had to be filmed on a motion-capture stage, with a cutting-edge virtual production pipeline required to deliver a seamless relationship between the real world and the simulated: allowing for a display screen to render the CG character, landscape, props and structures during the shoot. 

Effectively allowing Schreier to direct the CG character as opposed to the live-action performer, the tech gave the filmmakers the opportunity to adjust the character actions, the lighting, and even environmental textures, all whilst still on set.

We were fascinated with the process, so sat down with some of The Mill's top talent - Technical Innovations Manager Tawfeeq Martin; Character Development Creative Director Lisha Tan; Technical Artist Troy Barsness; Creative Director Glyn Tebbutt; and Executive Producer Aurelien Simon - to find out just how they did it.

How did you guys get involved in the project?
Is it technology you'd been prepping for another purpose?

Tawfeeq Martin, Technical Innovations Manager - The Mill had already been utilizing Epic’s Unreal Engine on projects such as The Mill Blackbird and, more recently, Mill Mascot. As we progressed from a character animation system using hand and facial gestures, the natural next step was to take on a project that required full body capture. 

The initial conversations for Emotions started due to an existing relationship between Glyn Tebbutt, Creative Director, and Jake Schreier, Director, who had collaborated previously on Showtime’s episodic hit Kidding.

We were thrilled to work with such an exciting team and use our technology to bring to life Jake Schreier’s unique vision, Cashmere Cat’s incredible music and his Princess CatGirl concept, and Margaret Qualley’s choreography and performance. 

Above: Schreier and Qualley on set

What was the design process for Princess CatGirl?

Lisha Tan, Creative Director, Character Development - For the character development, our design team worked closely with Magnus Høiberg aka Cashmere Cat and Jake Schreier on some fascinating references they provided, ranging from Japanese anime to early ‘00s games. 

Ultimately, we wanted to craft a cat character that incorporated Nordic influences, as Magnus is Norwegian. 

Designers Sasha Vinogradova and Sidney Tan worked on a couple of rounds of sketch explorations and Magnus refined what he liked from what was presented. Sasha then sculpted the final character in ZBrush, which was later translated into the Unreal Engine and helped the director visualize the form and dimension of the character for approval.
Troy Barsness, Technical Artist - It was great to engage in the development of the character and story from the very beginning. Designer Ed Laag played a big role in taking Sidney’s initial environment concepts and fleshing out a detailed version of the 2D concept as well as some initial 3D models. Ed’s understanding of spatial 3D, as well as the concept, helped bridge the gap from 2D to 3D smoothly. 

From there, Dave Witters and I were off to the races, refining our way in Unreal Engine and prepping for the virtual-set mocap shoot.

Above: Some of the Princess CatGirl designs

What was the prep process?

Glyn Tebbutt, Creative Director - The concept of the music video is to show the cat dancing alone in her fantasy world, before pulling out to a final scene that reveals that Magnus (Cashmere Cat) and our performer, Margaret, are actually sharing a dance. 

Jake Schreier’s work is all about impressive single takes and single-camera moves. It was crucial for us to ensure this technique remained possible for Jake, even when we cut from the augmented world to live-action. 

Therefore, our creative approach was to shoot a motion-capture performance and to implement a virtual production pipeline to enable as much creative control on set as possible.

Troy Barsness, Technical Artist - We knew an ambitious virtual production pipeline would be required to deliver a seamless relationship between the real world and the simulated, for everyone on set. 

The process required a two-step approach. The first step was to get the general layout and as many assets in place to create the look. The environment was then lit, textured and elevated to create something as close to finished as possible. 

Once everyone was happy, we would test the interactivity and refine it. Finally, we were ready to shoot! 

Tawfeeq Martin, Technical Innovations Manager - Investing a little more time in fine-matching the physical and virtual sets gave us huge confidence in bringing the rehearsed choreography to life. 

A great example is when Margaret approaches the rigged high bars and apple boxes and out comes Princess CatGirl taking the plunge, swinging from a tree… and nails the landing too! 

The practical lighting was a good match also, and the shimmering sunset across the lake definitely set the emotional backdrop to the final reveal.  

Cashmere Cat – Emotions - Behind The Scenes


How long did the shoot take? 

Tawfeeq Martin, Technical Innovations Manager - The shoot itself predominantly took place over one (very long) day! 

However, the prep time that went into making that possible was extensive and covered multiple weeks prior to shooting. 

Were there any unforeseen incidents?

Troy Barsness, Technical Artist - Given that this project is a music video, the timing was everything! The dancer’s movements had to match the beats of the music. 

Every step had to be synched up and match perfectly. 

Tawfeeq Martin, Technical Innovations Manager - In our traditional pipeline, the director and creative teams would be reviewing still images or sets of renders. 

In this case, the first time we saw the character come to life, on set, it sparked immediate suggestions, improvements and iterations from the filmmakers which we could immediately respond to. 

How much could Jake see 'live'? Were many tweaks made after the shoot?

Troy Barsness, Technical Artist - A display screen rendering the CG character, landscape, props and structures in real-time allowed Jake to, in effect, direct the CG character as opposed to the live-action dancer as she moved through the stylized world while being shot with a handheld camera. 

This allowed everyone to adjust the character's actions, the lighting, and even environmental textures, all whilst still on set, which meant we could try out different iterations without lengthy waits for rendering. What is captured on set is near-final. 

The final stage is clean-up, additional character modelling, and iterations such as adding volumetric mist and fog. These were all very minor adjustments. 

For the layman, what's the difference between this technique and 'traditional' computer-generated animation?

Tawfeeq Martin, Technical Innovations Manager - The traditional technique requires development, lighting, modelling, effects and various other separate elements that are passed between different artists and platforms and rendered at each stage. 

In this technique, all elements exist together in one real-time environment; responding and reacting live. 
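That contrast can be sketched loosely in code. This is purely illustrative (all names are hypothetical, not The Mill's actual pipeline): the offline approach renders a result at each hand-off between departments, while the real-time approach updates every element of the scene together, every frame, driven directly by the performer's motion capture.

```python
# Illustrative sketch only: a staged offline pipeline vs. a real-time loop.
# All function and stage names are hypothetical, for explanation only.

def offline_pipeline(scene):
    # Traditional approach: each department's work is rendered and
    # handed off to the next stage, so iteration is slow.
    for stage in ("modelling", "lighting", "effects", "compositing"):
        scene = {**scene, stage: "rendered"}  # one render per stage
    return scene

def realtime_frame(mocap_pose, scene):
    # Game-engine approach: character, lights and environment all live
    # in one scene that redraws every frame, reacting to the mocap feed.
    return {**scene, "character_pose": mocap_pose}  # drawn immediately

# The performer's poses drive the character live, frame by frame.
frames = [realtime_frame(pose, {"lighting": "sunset"})
          for pose in ("step", "spin", "leap")]
```

In the offline version a change to, say, the lighting means re-rendering downstream stages; in the real-time version the same change simply shows up in the next frame, which is why the filmmakers could iterate on set.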

Above: An example of the performance and real-time capture

What are the limitations of this technology? Where would you like to see it improve?

Tawfeeq Martin, Technical Innovations Manager - As early adopters of game engine technology, The Mill collaborates closely with its creators on improvements and shared learnings, to ensure the tools become increasingly artist-friendly. 

It’s a shared and supportive experience to work with Unreal Engine in pushing virtual production in this innovative way.

What do you see as the future of this technique? What would be your dream gig involving it?

Aurelien Simon, Executive Producer - This reimagined approach to filmmaking still allows for traditional cinematic methods. Creative partners are able to see iterations, make changes and experiment, all in real-time. 

In terms of dream gig, we’re just excited to engage any creators that aim to deliver ground-breaking, real-time stories for a truly immersive experience, both for the audience and their process!