The Guardians of Justice is an original Netflix series that blends live action with multiple styles of animation in a mixed-media superhero satire. We talked to producer and VFX supervisor Erick Geisler about how they finished the show remotely—juggling unexpected roadblocks, extensive effects, and a team spread across three continents—and how Evercast brought it all together.
This interview has been edited for clarity.
So tell us a bit about The Guardians of Justice.
The Guardians of Justice is the brainchild of a guy named Adi Shankar, who created Castlevania for Netflix. It's like an alternate superhero Justice League. It actually started off as a YouTube series, and then after Adi had a lot of success with Castlevania, Netflix was interested in acquiring his next thing, which ended up being The Guardians of Justice. We had to finish it through COVID, which made additional shooting really difficult, so the solution Adi came up with was to throw a bunch of mixed media into it. So there's live action, traditional cel animation, 3D animation, paper animation, and claymation. We had to come up with a lot of things we could do without shooting, so it's an interesting project in that regard. It has its own cult following. If you like anime and video game culture, you'll love it.
Awesome. What exactly was your role?
I ended up being the supervising producer of the whole show. I started out supervising digital effects and at the end of it, I ended up taking over the entire show and delivering it to Netflix. It was crazy.
Sounds like it! So you touched on this a little bit, but what were some of the unique challenges with this project?
There were a lot because we were working with artists all over the world. We had an animation team doing anime-style work in Spain—a company called Angry Metal. We were dealing with companies in Los Angeles and companies overseas, so the technical challenge was just keeping everybody synchronized and on the same page. And that was really difficult to do. We ended up standardizing on a ShotGrid database from Autodesk. We were finishing in Flame—a company called Skulley Effects was doing the finishing—and then we were piecing out all of the animation and effects to different houses.
The biggest technical challenge was keeping everybody on the same page; that's really where Evercast came into play. Just being able to jump into a room with somebody live was way more useful than other [asynchronous] products like Frame.io, because we had language barriers with a lot of these different vendors in different countries. The notes process is not the same as being able to sit one-on-one, scrub over to a frame, circle something, and go, "I mean this," and they're like, "Oh!" That's what Evercast allowed us to do. It just became, "Let's jump into an Evercast session," and we used it throughout the entire process. So editorial would jump into an Evercast session, finishing would jump into an Evercast session, and we would review everything that way. Mixing as well. Even color correction, which was very extensive.
How did you go about color correcting remotely?
I would say 98 percent of the color happened remotely. I was looking at it on an iPad while it was being broadcast out. Because it was a Netflix deliverable, we had to color everything in HDR. You don't grade SDR and HDR separately; you color in HDR and then down-convert to SDR. So I ended up getting the latest and greatest iPad, and Evercast gave us the ability to look at the HDR over the web, which was a big deal. At the very end, I did watch everything once at the color house, a FotoKem company called Keep Me Posted. But that was only after everything was essentially done.
That’s great. You mentioned the animation company was in Spain; where was everyone else geographically? And were they all working from home?
Yeah, everybody was pretty much working from home. The animation company was in Spain, the finishing company was in Los Angeles. We had a VFX team in Prague, we had another group in San Francisco, a company in India called Pangur that was doing claymation. And then there were different directors for each of the episodes. Some of them were local in Los Angeles. I’m based in Los Angeles. So it really was interesting just coordinating and keeping the ball rolling inside of Evercast. Between Evercast and our ShotGrid database, that was how everything stayed cohesive.
So what do you think was the biggest thing that Evercast brought to the table?
It allowed us to communicate, it gave us the ability to stop on a shot and annotate things, and the ability to control the timeline for somebody while they're watching. There was color, there was post, there were multiple animation houses, and each needed different reviews with different people. Having one place where everybody could just jump in and get a point across really kept the production moving.
Everything was done through Evercast. Watching the final onlines, dropping in the effects... It’s a really complicated show. There are lots of layers of stuff going on, so I don’t know how we could have done it without Evercast. It was the go-to place for all of us to sort of huddle together.
What do you think about the future of remote work when it comes to VFX and/or color?
I mean, we've completely changed the way we work because now there are tools—Evercast in particular—that allow you to do so much remotely. I think the genie is out of the bottle; there's no going back. This will be part of the workflow for everything. I think lockdown forced us to prove the technology, but now I feel really comfortable looking at HDR color on an iPad, you know what I mean? I even put it up against the HDR monitor in the color correction bay, and I've got to say, it was really close. Most people would not be able to tell the difference. It gives you the confidence to do that stuff now. Typically, we would have to go in and see everything in the bay, on the big screen, but this gave me a whole new level of confidence.
Not to jump projects, but we took that into Echo 3 when I worked on that as well. The showrunner, Mark Boal, was like, "Can I look at color remotely?" and I said, "Yeah, you can do it in Evercast." He had an iPad that could display HDR, so they started reviewing color that way too, and he was very happy. I think it just opens the door to a whole new set of possibilities for workflows, especially in this day and age, with how fast content has to go out. On The Guardians of Justice, we were really constrained because we couldn't go anywhere and we couldn't bring people together any other way. On Echo 3, it was a lot looser, but because of the speed at which content has to go out, and having a showrunner who needs to see everything, it was really important to have the confidence that what you approve is going to translate through to what you see on your television. And it just worked. It works really well.