Performing with Technology
This fall we launched a new CultureHub class at the Seoul Institute of the Arts called Performing with Technology. In this class we worked with performers, specifically acting students in their third year of study at SeoulArts. We left the title intentionally open, to give the class room to be adapted in the future, both in response to our own research and to what is happening in the world. The course is designed to expose students to emergent creative practices, putting them in an experimental setting that uses modern tools and technological workflows. The hope is to lay a foundation for the students that positions them as active learners ready to dive into unfamiliar territory as they move into the professional world.
As the Artistic Director of CultureHub New York for the past 15 years, I have had several opportunities to use VR in creative projects. To be honest, I have always been a bit skeptical of VR in the sense that I’m not very excited about the prospect of having meetings or dinner parties in the metaverse. However, at CultureHub, we feel a responsibility to examine new technologies from critical and creative perspectives. That means we have to work with these tools to understand their nature and potential. As a result, we have had many opportunities to work with virtual reality in various projects and contexts.
In the spring of 2023, we worked with CultureHub Associated Artist Krzysztof Garbaczewski, a Polish theatre director and founder of Dream Adoption Society, on a project called The Books of Jacob. Krzysztof has been integrating VR and theatre for several years, but on this project he was especially interested in developing new methods for streaming live video in and out of VR. CultureHub Creative Technologist Sangmin Chae, a graduate of SeoulArts, worked with the Dream Adoption Society team to integrate Network Device Interface (NDI) technology into the production video system.
We realized that combining NDI and VR opened up new possibilities. Krzysztof works within a social VR platform called VRChat, which offers some amazing advantages but also, like any technology, has its drawbacks and limitations. On the plus side, its multiplayer functionality lets performers and audience members join the virtual aspect of the show from anywhere with the necessary VR gear and a stable internet connection. However, because it is an existing platform, integrating more experimental elements, in this case NDI video, can be cumbersome.
There are really just two major platforms for building VR worlds, Unity and Unreal. Known as game engines, these applications primarily exist as tools for game developers. However, many artists are beginning to leverage them for other things including installations, performances, media art production, and more. Excited by the potential these engines hold but frustrated by the limitations of working with existing platforms, Sangmin embarked on a three-month development sprint to see if it would be possible to develop our own system for bridging real and virtual spaces using real-time NDI video. The results of this development were employed in the Performing with Technology class this past fall.
After moving back and forth between Unity and Unreal, Sangmin ultimately settled on Unreal for the final build. One challenge we faced is that real-time performance with VR is impractical to teach in a classroom setting because it requires so much equipment; it isn't realistic to maintain a VR headset for every student in a class. We worked with Theatre Artist and SeoulArts Professor Ji Young Kim to design a curriculum that would gradually introduce the students to the technology over a semester-long period. In addition to the technical challenges of using VR in the classroom, the class was also being taught telematically between New York, where Sangmin and I are based, and Ansan, Korea, where Ji Young and the students met for class. Drawing on CultureHub's 15 years of experience developing telepresence-enhanced learning environments, we were able to design a system for the class that included NDI-enabled pan-tilt-zoom (PTZ) cameras that could be controlled remotely from across the world, allowing for an amplified feeling of presence.
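Many PTZ cameras can be driven remotely over a network using the VISCA-over-IP protocol. The article does not specify which control software was used, so the following is only a hypothetical sketch of how a remote operator's pan command might be encoded and sent; the camera address, port, and speeds are illustrative assumptions.

```python
import socket
import struct

def visca_pan_tilt(direction, pan_speed=0x08, tilt_speed=0x08):
    """Build a raw VISCA pan/tilt drive command for camera address 1."""
    dirs = {
        "left":  (0x01, 0x03),
        "right": (0x02, 0x03),
        "up":    (0x03, 0x01),
        "down":  (0x03, 0x02),
        "stop":  (0x03, 0x03),
    }
    pan_dir, tilt_dir = dirs[direction]
    # 81 01 06 01 VV WW pp tt FF — standard VISCA pan/tilt drive frame.
    return bytes([0x81, 0x01, 0x06, 0x01,
                  pan_speed, tilt_speed, pan_dir, tilt_dir, 0xFF])

def wrap_visca_over_ip(payload, seq=0):
    """Wrap a VISCA command in the VISCA-over-IP header:
    payload type (0x0100 = command), payload length, sequence number."""
    return struct.pack(">HHI", 0x0100, len(payload), seq) + payload

# Usage sketch (hypothetical camera address; 52381 is the usual UDP port):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(wrap_visca_over_ip(visca_pan_tilt("left")), ("192.168.1.50", 52381))
```

Because the commands are small UDP packets, a control surface in New York can steer a camera in Ansan with very little added latency beyond the network round trip.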
While the use of real-time NDI video was a creative decision, it also created a more natural and fluid way for performers to enter a virtual space. Instead of needing a VR headset for each performer, we were able to merge the real and virtual spaces, allowing real-world performers to beam into VR as live video avatars. We used chroma key techniques to remove the backgrounds so the 2D video avatars could be spatialized, scaled, and fully integrated into the virtual environment. The main concept was to create a system where live video performers could see and interact with performers puppeteering avatars in the VR space. This took a complex system of real- and virtual-world monitoring and signal routing. In the end, the performers in the studio were able to see the virtual space through monitors, while the avatars could see the live video performers integrated into the virtual environment. One of the student groups went so far as to add physics to their environment: to reach the final interaction in their story, their avatar had to jump across a series of floating discs without falling into the void below.
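The chroma key step described above, turning a green-screen background transparent so a live video avatar can be placed in a 3D scene, can be sketched in a few lines. This is a deliberately naive illustration (the production system keyed inside the Unreal pipeline, not in Python, and real keyers handle spill and soft edges); the threshold values are assumptions.

```python
import numpy as np

def chroma_key(frame_rgb, green_min=100, others_max=90):
    """Return an RGBA frame whose green-screen pixels are transparent.

    A pixel is treated as background when its green channel dominates:
    green at or above green_min while red and blue stay at or below others_max.
    """
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    background = (g >= green_min) & (r <= others_max) & (b <= others_max)
    # Alpha: 0 (transparent) on the keyed background, 255 (opaque) elsewhere.
    alpha = np.where(background, 0, 255).astype(np.uint8)
    return np.dstack([frame_rgb, alpha])

# Usage sketch: run each incoming NDI frame through chroma_key() before
# texturing it onto a plane in the virtual environment, so only the
# performer's silhouette is visible in the 3D world.
```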
The students started the course by using Open Broadcaster Software (OBS), an open-source application designed for livestreaming, which allowed them to learn about live video, cueing, live processing, and digital A/V routing. Using OBS and video hardware the students first developed monologues, then worked with a partner to create a scene. After they became more comfortable working with live video, they went on to develop short performative pieces in virtual reality. They worked with Sangmin to design their virtual environments and avatars and, with his assistance, developed virtual sets that aligned with their project concepts. One group's project took place in the North Pole, another in an asylum, and one in a fantastical landscape of oversized floating objects. It was incredible to watch these young actors navigate the complexities of such a technologically advanced performance system. Their ability to embrace this new way of working was inspiring, and many of the students reported feeling empowered and more confident working with these new tools. There is still a lot of room for growth for this course, but considering all of the hurdles we faced over the semester, I was extremely proud of what the students produced and very impressed with their willingness to jump in head first.
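Cueing in OBS can also be automated over the network via the obs-websocket plugin (version 5 of its protocol), which lets an external controller switch scenes on cue. The article doesn't say whether the class used this, so the following is just an illustrative sketch of how such a cue message is structured; the scene name is a placeholder.

```python
import json
import uuid

def scene_switch_request(scene_name):
    """Build an obs-websocket v5 Request message (opcode 6) asking OBS
    to switch its program output to the named scene."""
    return json.dumps({
        "op": 6,  # opcode 6 = Request in the obs-websocket v5 protocol
        "d": {
            "requestType": "SetCurrentProgramScene",
            "requestId": str(uuid.uuid4()),
            "requestData": {"sceneName": scene_name},
        },
    })

# Usage sketch: after authenticating on OBS's WebSocket port (4455 by
# default), sending scene_switch_request("Scene 2") over the socket cuts
# the program feed to that scene — a simple building block for show control.
```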
As Artistic Director of CultureHub, Billy Clark has overseen the development of CultureHub’s artistic, education, and community programs since its inception in 2009. With the CultureHub team, he has curated the annual Media Arts Festival, Refest, which showcases artists working at the intersection of art and technology. In 2013 he directed Paul D. Miller aka DJ Spooky’s piece Seoul Counterpoint, which premiered at La MaMa’s Ellen Stewart Theatre. Seoul Counterpoint was subsequently presented in conjunction with Asia Society’s exhibition, Nam June Paik: Becoming Robot in 2014.
At the La MaMa Galleria, he co-curated Mediated Motion, an exhibit of works that explored how new media technologies alter human movement and our perception of the body in motion. A graduate of the Experimental Theatre Wing at NYU, Billy has performed and directed in the downtown scene for over 20 years.
He is currently a professor at the Seoul Institute of the Arts, has taught at CUNY Hunter College, and has been a guest lecturer at Sarah Lawrence College, Gallatin, and NYU's Interactive Telecommunications Program. He was chosen as one of the 100 Top Creatives by Origin Magazine in 2015.