We sat down with Luc Delamare and Kevin Stewart, two DPs from LA who made the jump into virtual production, gathering their insights and discussing the ties between traditional cinematography and real-time filmmaking. Below is the first section of a two-part interview focusing on their introduction to virtual production.
If you’re interested in getting into virtual production and the Unreal Engine yourself, make sure to read our own introduction to real-time filmmaking.
Table of Contents
- Creating a real-time short in the Unreal Engine: getting started
- What can cinematographers bring to the real-time table?
- How did this first virtual production go?
- Will virtual production change the way we write movies?
- What advice would you give to live action cinematographers interested in virtual production with the Unreal Engine?
Creating a real-time short in the Unreal Engine: getting started
Welcome to both of you and thank you for joining us today! And congratulations on winning the Real-Time Shorts Challenge. How did you feel about going into Nemosyne as your first virtual production project and winning the whole thing? How did you start?
Thanks Alvaro! As for how we did it… well, we knew we couldn’t just learn everything about it, and I had just barely installed the Unreal Engine. I was like “whoa, I have no assets!” and this project was an opportunity to have high quality, production-ready assets.
As cinematographers, one of the things we talked about going into this was that we needed to figure out lighting. It’s all about lighting. We kept on hearing about real-time, we had this previous knowledge of other render engines that weren’t real-time, and so that was super interesting to us. We could see our lighting right away, which as cinematographers is crucial.
Yeah, just to jump off that… I knew Blender, I was doing some stuff in Cinema4D and I even did some stuff in Cine Tracer, which is Matt Workman’s previz software. There was something there but it wasn’t quite photo accurate yet in terms of how close to reality it was for lighting.
With Nvidia’s ray tracing coming along, we both bought 2080 Tis (I specifically bought one for this project), and that kind of blew us away: how the shadows fell off, how soft we could get them, using reflections, using bounced lighting, GI, all of that in real-time, at a photo accurate level, was mind blowing. We could light things like we would on set. “Let’s put this here, let’s wrap that light, let’s bring up the background”, we could play around like that as if we were on set. For us that was an eye opener.
The fact that you’re both cinematographers getting into virtual production is an interesting part of the project. Though you both had some experience working with VFX: you mentioned Blender and you also mentioned working with Cine Tracer. How would you qualify your previous experience with virtual cinematography? And did you have any experience at all with real-time filmmaking? What drove you to the Unreal Engine?
I think we were both aware of it and we knew that there was potential, we were seeing tutorials pop up and I thought, “oh this is something that is achievable”. But in terms of previous experience, no, nothing like that. In Blender, I had finished a few projects where, rendering on my own workstation, I would wait for weeks.
For me it was Cinema4D, and an offline renderer called Redshift, which is great. But no, this was a whole other step in that direction. For us as cinematographers, we’ve been wanting to get closer and closer to real-time. And finally, with Unreal, with a 2080 Ti and ray tracing, we feel like we’re there, as far as being able to light intuitively like we would on set.
There’s one thing we didn’t talk about that is worth mentioning. One aspect of Unreal Engine and Epic Games that got us both excited was Quixel and Megascans. We normally would have nothing to do with Megascans and scanning in general for level design, but because Epic made it free for Unreal Engine… We realized that if we had access to that and we knew how to light these things, we could build worlds so quickly. The pairing between Unreal and Quixel was definitely a big exciting thing.
What can cinematographers bring to the real-time table?
How does your experience as cinematographers, as DPs, translate into Unreal Engine? What do you think that you can bring to the table of virtual production as cinematographers? Something that people working with the Unreal Engine don’t necessarily have?
Well for one, experience, right? We’ve been DPs for over 10 years. We’ve been working in professional shoots and stuff like that, so we’ve gained 10 years of experience, getting better knowledge, working with actors, with directors. All of that stuff. And also experience with storytelling on set, which is important. Even though this doesn’t have much to do with lighting and cameras and cinematography per se, they kind of go hand in hand.
And so when we were building the short, we were thinking a lot about the story, how we were conveying our story and how the images would progress, how we would be able to try and get a little bit of an emotional reaction out of the audience. So I think that being filmmakers first and foremost was a huge boost for creating a short.
But then, of course, as cinematographers we were able to do what we do on set with lighting and cameras, and project how we wanted the mood, the tone, the shadows to fall, that sort of thing.
I feel like piggybacking on that. There were always two parts to our conversations when we were thinking about making the short. It was always story first. But at the same time, we were learning Unreal Engine, learning how to get the best looking result, because naturally we’re learning the software and we want to make good looking visuals. Still, because of our backgrounds as filmmakers and storytellers, story was always at the forefront.
I think that we both understood that from the beginning and that’s why we were always on the same page. Traditionally, two cinematographers don’t work together, but because there was this mutual understanding of story and visuals going hand in hand, and we had the background to support that, it worked.
How did this first virtual production go?
Talking about the story itself, how did the writing process go? Did you have any references or inspirations for this short? Did you have any constraints or any objectives that had an impact on the way that you wrote it?
Well, it’s funny because it was a very iterative process. It started out with us talking about different ideas of what we thought we could achieve without even knowing Unreal Engine yet. So we just talked about maybe getting the Grace model. She looked really good, so we thought we could maybe do some macro close ups and do a title sequence like Westworld or something like that. Just really nice close ups, good lighting, that sort of thing that we can probably do, and do it well.
From that point on, we got to know the engine more and more. And we started trying different things, getting more ambitious, adding new locations. There were some restrictions, for example we didn’t know too much about animation, we just ended up using Mixamo for the most part. We ended up using very simple motions because being more ambitious, as far as animation goes, wasn’t passing our bar of quality. So that was one of the rules we kind of put in: if it didn’t look good enough in the animation, then we wouldn’t put it into the short.
We just kind of collaborated, bouncing ideas off each other about how to best communicate this story to the audience. It just kind of jumped from level to level. We didn’t have an idea from the very beginning that carried through to the end. There was an evolution to it, you know what I mean?
And part of that, too, is that we would never have gotten the result that we did without also learning at the same time, because both of us were taking turns learning about different environments. I started experimenting with this cave environment, and Kevin started with a forest. That’s when the story, like you said, evolved. So it was also just about what types of environments we wanted to shoot. And we had a list of other places we wanted to incorporate and didn’t really have time to get to. But I feel like a huge part of the story process was just the fact that we were also learning.
You asked about limitations and Kevin addressed this, but we just knew we couldn’t incorporate other characters, we knew it was just going to be her. We knew there was a limit to how much walking she was going to do, and it kind of turned into just lighting environments. And we knew we wanted to light a variety of different sets, so I think that was a big thing.
Yeah, and different looks too. Our inspiration was grabbing frames. And again, we had to be limited with what we chose because we couldn’t just do anything we wanted. We picked different movies that we were inspired by, we had a mood board. There was some stuff from The Revenant and some stuff from Blade Runner. At one point there were all kinds of different lightings, looks and cinematography that we liked. Sometimes we didn’t really recreate shots per se, but we wanted to capture the tone of some of our favorite shots. And what was really cool about Unreal is that just as we learned more, we’d add another layer of story and then we would learn something else and we’d think, “well, what if we did this?”
At the end we wanted to come up with an emotional beat and that was the music box, because we had this memory as a motif. And it was just a great way of working, actually. It’s a very different way of working, instead of just writing something it’s like: “we feel good with this. Let’s go make it. Let’s edit it. Let’s put it out there.” This was a very collaborative, “on the fly” way of filmmaking. That ended up being a really great experiment.
Will virtual production change the way we write movies?
Do you think that this very iterative way of writing a short is really just for this format, because of the time constraints, and because you’re learning? Or do you think that this way of collaboration has a future and can be a way of working with Unreal?
It’s interesting, right? Because on set, you kind of have to commit to something. There’s budgets, there’s producers, there’s actors, there’s schedules that everybody has to kind of commit to. Here, you can look at the story, you can render out a shot. And if you think: “we can redo this shot from a different angle, I think that will communicate what we’re trying to say here a little bit better”, you can jump back in the engine and flip the camera around or just put her in a new environment altogether. And you can do that very quickly. So I think there is something that is conducive to making the project better, because you can always go back and continue to make it better. And we did that to a certain degree. But I’d have to experiment with it on a bigger narrative project to really be able to answer that.
Yeah, I feel that’s the way virtual production is being pitched by bigger productions: as a way to revamp how traditional VFX and heavy CGI movies are done. We’ve seen the flowcharts redoing the production VFX workflow where it’s all a continuous loop. I feel like what happened on this short is this new evolution in a very small form factor, where it’s taking the best of live action filmmaking and an aspect of CGI filmmaking in that we have almost unlimited, if not truly unlimited, options.
For a short like Nemosyne, we had the flexibility to tell a story as if we had shown up on a location. We would do all these things that we do on a physical set normally and have the unlimited options of Unreal Engine: load whatever environment we need, lighting, anything we want. It turned into an organic thing where there were these levels of the project, where we changed the story as we learned.
The initial question was if this type of collaborative filmmaking process would scale up… I don’t know if it would for a feature film but I do think that it is going to change the way these movies are made. I do think we’re going to take the best of both the old and the new, that just hasn’t happened yet with full CGI.
It’s something that’s very interesting to think about because you have traditional filmmaking and the CGI pipeline kind of merging together into one brand new kind of thing. And that allows for just continuing to work on something at a very fast pace. It’ll be really interesting to see what kind of projects come out in the near future, because I think it does allow for better storytelling at the end of the day. You’re not locked into a certain shot. You can redo it or think it over.
What advice would you give to live action cinematographers interested in virtual production with the Unreal Engine?
Do you have any tips or any advice that you would give to anyone who is getting into Unreal Engine, into virtual production?
As far as tips go, well… When we first started, both Luc and I had some experience with VFX, Cinema4D, Blender, that stuff. Still, we were kind of lost at first. Where do we start to learn? It’s kind of such a big engine. There’s so much you can do. It’s hard to even know where to start.
The biggest thing to do is just download the engine and start messing around, because until you do that, you’re going to just feel overwhelmed. Every day you’re going to make a little bit more progress, you’re going to learn a little bit more. And at first it’s going to be overwhelming, but eventually you’ll start to pick up on stuff. There’s a bunch of tutorials, tons of content out there. Eventually you’ll get there.
Now, if you want to do lighting, or if you’re a DP that wants to get into Unreal Engine, the first thing to learn is the ray tracing settings, because those matter most for a DP. It takes some time to get used to what the ray tracing settings do, how they look best, and in what situations they look OK. For example, indoor GI is tough. You should know how to work around that. And if you’re looking to create an animation, that’s a whole other thing you may be learning, going into mocap and trying to learn some of that.
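As a rough illustration of where those settings live, here is a minimal sketch of the kind of entries you would put in a project’s Config/DefaultEngine.ini to enable ray tracing. These are UE 4.2x-era console variables; exact names, defaults and availability vary by engine version, so treat this as an assumed starting point rather than a definitive recipe:

```ini
; Assumed UE 4.2x-era settings -- verify against your engine version.
[/Script/Engine.RendererSettings]
r.RayTracing=True                ; master switch (requires DX12 and an RTX-class GPU)
r.SkinCache.CompileShaders=True  ; needed for ray tracing on skeletal meshes

[SystemSettings]
r.RayTracing.Shadows=1                             ; ray-traced shadows (soft falloff from area lights)
r.RayTracing.Reflections=1                         ; ray-traced reflections
r.RayTracing.GlobalIllumination=1                  ; ray-traced GI -- the costly part indoors
r.RayTracing.GlobalIllumination.SamplesPerPixel=4  ; raise for cleaner indoor GI, at a frame-rate cost
```

Most of these can also be toggled per-shot from a Post Process Volume, which is closer to how you would “light on set”: dial them in interactively, then bake the values you like into the config.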
The good thing is, the engine does allow you to learn pretty much any part of filmmaking and video gaming that you want to learn. There aren’t many restrictions: if you want to do level design and great environments, that’s also great as far as virtual production goes. You might want to think about buying a body camera rig, or an HTC Vive or some other VR headset that allows you to pair your camera to a virtual camera; that’s a pretty important thing as well.
Just like Kevin said, there is a learning curve with the Unreal Engine. As excited as we were to get started, like we said at the beginning, it was still like, “wow, can we even do this?”. I think it took us a day or two just to figure out how to migrate the Grace asset into our own project file. It can be tough at first, but tutorials can teach you everything. I’m entirely self-taught in After Effects and Blender from tutorials. So naturally, we just knew that it was all about ingesting that knowledge and staying excited about it.
I would say to anyone looking to get into it, focus on your strengths and what you’re curious about in the Unreal Engine, because it is a massive piece of software. Just like any 3D pipeline or 3D suite, you can do a lot within it. Definitely choose a focus and start with that because otherwise it will look way too big. If we had started thinking about learning motion capture and facial animation and this and that, there’s no way we would have done a short film in 30 days. Having a narrow set of rules, not to limit yourself, but to focus, is really important.
Stay tuned for part 2 of the interview where Kevin and Luc will discuss the use of virtual cameras, lighting in Unreal and dive deeper into their virtual production experience.
Kevin Stewart
Born in Paris, France and raised in Portugal until the age of 17, Kevin’s early fascination with movies was so strong his parents moved the family to Portland, Oregon to help facilitate his dream. After graduating from the School of Film and Television at Loyola Marymount, he spent several years as one of the main shooters for Funny or Die, working on projects with Will Ferrell, Don Cheadle, and Steve Carell. His commercial work includes brands such as Starbucks, Adult Swim and Walgreens. Kevin recently shot the Rosa Parks biopic Behind the Movement for TV One, Blumhouse’s Unfriended: Dark Web and the award-winning indie success The Head Hunter, which he co-wrote and produced.

Luc Delamare
Luc Delamare is a Los Angeles based Director of Photography whose initial passion for film developed as a self-taught visual effects compositor. Born in Silicon Valley, Luc grew up splitting his time between his hometown of Los Altos and Southern France. Following his formal education at the School of Film and Television at Loyola Marymount, Luc has built an extensive portfolio that encompasses a diverse array of formats and genres, including feature film, documentary, web series, music videos, and now virtual production. Luc’s commercial DP work has featured personalities such as Olivia Munn, Alex Morgan and Canelo Alvarez. As a VFX supervisor he has worked with various clients including Apple, NASA, Western Union, Vivo, and 20th Century Fox.