Virtual Production Bulletin: An Interview with Todd Bryant, NYU and Rashin Fahandej, Emerson College | by Srushti Kamat | August 2022


The Virtual Production Bulletin is our new Q&A series on the intersection of game engines, LED walls, real-time motion capture, visual effects, and animation with documentary and non-fiction storytelling. It is co-produced with the Co-Creation Studio. How do we reconcile documentary techniques and ethics with new paradigms of virtual immediacy, abstraction, or reconstitution? What systems enable and encourage such paradigms? And what other questions should we ask ourselves about the blurring of virtual and physical space?

Courtesy of NYU Tandon School of Engineering

Todd Bryant began teaching Virtual Production, a course for filmmaking and engineering students at New York University (NYU), at the start of COVID, when most educational institutions turned to online courses. He was also tasked with piloting this new program for students from vastly different backgrounds in film and engineering.

What those new to virtual production often overlook is the urgent need to train the next generation of creators. To retain the current workforce and build a strong foundation for young artists, producers, and engineers, integrated education is essential, especially as the worlds of games, video, television, film, and emerging technologies merge in the coming decades.

A graduate of NYU’s Interactive Telecommunications program, Bryant has worked at the intersection of engineering and art since the beginning of his career. He now runs the Virtual Production Studio at NYU. The following interview has been edited for length and clarity.

Srushti Kamat: Your course at NYU teaches both Tisch School of the Arts students and engineering students, right?

Todd Bryant: My classes often have about 18 students: four from undergraduate film, seven from the Interactive Telecommunications Program at the Tisch School of the Arts, and seven from Integrated Design & Media at the Tandon School of Engineering. They have different instincts. Undergraduate film students suddenly find themselves with cameras that can do it all. They don’t have to drag them up the stairs. They don’t have to repair the lenses. They don’t have to worry about all those lights. They are so relieved that they practically gasp. They already know cinematic storytelling; I just give them a tool. When I teach virtual production in the game engine, I try to use little to no code, so it’s more about treating a real-time engine like an editing tool such as DaVinci Resolve, Avid, or Adobe Premiere. For these students, once they understand the fundamentals of computer graphics, it’s quite empowering.

SK: And the engineers?

TB: Engineers tend to have a predilection for UX and UI design, for organizing things, for making sense of chaos. They are more focused on code, logic, and math. While these students naturally flock to the more code-heavy classes, in this class they get to treat themselves like artists. It becomes very experimental for them because they approach it from a more inventive point of view. They learn the rules by breaking them, and then they see what the film students do. They understand the underlying workings of the engine. Ultimately, it’s great to have them all as one big class working together, because they each have inherent strengths.

Courtesy of NYU Tandon School of Engineering

SK: Are there any challenges?

TB: You definitely see some inequities in the first half of the semester. Some people adapt to the tools faster than others. Unlike a first-semester engineering student, a second-semester engineering student has probably already taken some sort of game-engineering course and is therefore much more advanced. Even so, this class tends to be their first experience with narrative storytelling.

SK: Can you tell us how you organize your lessons? How do you teach virtual production?

TB: The opening lecture focuses on the history of virtual production. Although it’s a new buzzword, virtual production is steeped in historical concepts that are part of the language of cinema. We brainstorm and run collaborative design sprints on the interactive whiteboard Miro. Most of the content was developed during the pandemic.

We use an exercise that was developed at the Upright Citizens Brigade Theatre in Chicago years ago. UCB felt that the Chicago comedy scene was too wacky, with too many non sequiturs. So they came up with a game called A to C, almost like Telephone, where one person says something, the next person responds, and so on down the line.

But in this game, you say: okay, you said the word bird, and that’s my A; A reminds me of B, which reminds me of C. For example, bird reminds me of leaves, which reminds me of Paul McCartney. So Paul McCartney is now my C, and that becomes the next person’s A. The next person might say, “Paul McCartney reminds me of the Beatles, which reminds me of an infestation.” And so each person’s C becomes the next person’s A. You end up with a quick way to do free association, but it stays connected. What you learn is to keep your B quiet, to make B something personal about your own experience, something about you. Once you learn this improvisation technique, it becomes second nature, and every word you spit out as a C, which the next person picks up as their A, is meaningful and important to you.

SK: It’s so much fun that you use UCB’s improvisation techniques. It shows the overlaps between these emerging workflows and more traditional forms like theater. Why did you decide to teach this course?

TB: As a technical director in the field, I found myself having to train too many of the teams I joined. These were ground-level concepts that were integral to the game engines. Instead of teaching each production separately, I created a class as an easy way to bring people together so I could teach it once. There weren’t many classes a few years ago, so everyone was just testing the waters. All these companies would come up to me and say, “How can we adapt this? How can we get into this space? I heard it saves money. I’ve heard it unleashes creativity. Can we take this on?”

Courtesy of Rashin Fahandej

Rashin Fahandej, one of the students in Bryant’s executive course beginning in June 2021, shares her perspective below.

Fahandej is an Iranian-American filmmaker and immersive storyteller whose work centers marginalized voices and the role of media, technology, and public collaboration in generating social change. She is the founder of A Father’s Lullaby, a project that interrogates structural racism in the criminal justice system. As an assistant professor at Emerson College, she teaches courses in emerging and interactive media and launched a pioneering XR co-creation initiative in which students, formerly incarcerated fathers, probation officers, and their children co-create personal documentary projects using AR, VR, 360° video, and other emerging technologies.

Courtesy of Rashin Fahandej. Immersive and interactive audio-video installation: volumetric video with Depthkit, stop-motion animation with steel dust, tactile audio documentaries each representing the story of a formerly incarcerated father, and a geolocated participatory site.

SK: Why did you take the course?

Rashin Fahandej: When I started teaching in this field, there were very few departments that looked at the intersections of art and science; Johns Hopkins University is one example. It has always been important to me to take an interdisciplinary and inter-institutional approach through research. I came to virtual production through emerging media, with two main interests: first, as an educational tool with the potential to redefine the classroom and introduce new methodologies into higher education; second, its potential for co-creation and collaboration. I use the Unity game engine myself and had a preliminary understanding of Unreal Engine, so I was also curious about how game engines could support collaboration between my students and community members. I wear many hats as a practitioner, researcher, and educator in emerging and interactive media. Our field is changing rapidly with advances in technology, so I’m excited to shape it not only through experimental and groundbreaking approaches to storytelling, but also through opportunities to reimagine production methodologies. Innovative pedagogical approaches can help prepare the next generations of creators.

SK: What did you learn? What did you take away from it as an educator, as an artist and as a person involved in co-creation?

RF: There was a lot of wonderful learning! When it comes to technology, both as an artist and as an educator, I’m concerned with the twin pillars of equity and accessibility, and both can be defined on multiple levels. For example, Todd’s class at NYU brought together an incredible interdisciplinary group of professionals from around the world. That access was made possible by educational institutions pivoting to online and virtual learning during COVID-19. Such spaces are necessary environments for innovation and speculation, and they are interdisciplinary by nature. The second pillar is equity. It is crucial that diverse voices and creators contribute to the field’s development so that it can reach its full potential by being fair and just. To this end, my concerns and efforts involve bringing community members in to work with emerging-technology students in academic settings. For example, in my XR Community Co-Creation course, the fathers, our community members, were Zooming into the classroom on their cell phones because they didn’t have laptops. There is a lot of potential in virtual production, but when it comes to putting the workflow and tools in community members’ hands, accessibility is limited, and that is an important issue to consider.

What Fahandej is referring to is the usefulness of the game engine beyond the student space, echoing broader questions about accessibility and fairness. What might it look like, in theory or in practice, if the tool were introduced to communities less familiar or comfortable with emerging technologies? Taking a cue from Bryant’s course, what opportunities are there to use these platforms to foster cross-pollination between disciplines that might not otherwise meet? Can we envision a future in which game-engine and virtual production methodologies are iterated with these communities in mind?
