Live Theater in Virtual Reality (VR)
In July, a group of theater and XR creators approached me about using my #FutureStages template to rehearse and perform a live theater show in open-source, browser-based VR (on Mozilla Hubs). Within eight weeks, we staged and presented my one-act play, Jettison, for multiple live virtual audiences “at” No Proscenium’s Here Festival.
Jettison is a one-act play originally written for a festival that staged new theater pieces in a Manhattan rooftop swimming pool. The play was subsequently published and performed in more traditional venues, but came to be seen as “un-producible” given the difficulty of staging a story about three strangers stranded in a lifeboat. Our team thought a play so constrained in physical space might highlight the new possibilities of mounting theater in virtual reality.
We used a black-box version of the #FutureStages templated space as the venue, with Mozilla’s default avatars for the main characters, plus customized 3D “props” and a “puppeteer” avatar that matched real-life physical props and costumes. This allowed for two “modes” of performance. The first was entirely in virtual reality, with our actors wearing PICO Neo 2 headsets rented from LBX Immersive for the run of the show.
For the second “mode,” we wanted to showcase the potential of “mixed reality” to place live video feeds of our actors “on” stage. This originally involved compositing separate web cameras together, but we later opted to bring all the actors into a unified background and have our stage manager stream the final composited image into Hubs.
To do this, we used OBS Ninja to turn our actors’ smartphones into live video feeds, which our stage manager composited in Open Broadcaster Software (OBS) and streamed into Mozilla Hubs using OBS’s built-in “virtual camera.”
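For anyone recreating this pipeline, the OBS Ninja side comes down to two URLs: one the actor opens on their phone to push their camera, and one the stage manager loads in an OBS Browser Source to pull that feed. A minimal sketch, with a hypothetical stream ID:

```
# Actor opens this on their phone to broadcast their camera:
https://obs.ninja/?push=actor1

# Stage manager adds this as an OBS Browser Source to receive the feed:
https://obs.ninja/?view=actor1
```

From there, each actor’s feed is just another source in an OBS scene, and the composited scene reaches Hubs by selecting the virtual camera as the webcam inside the room.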
Mozilla Hubs is built on JavaScript, which allowed us to get sneaky and “inject” scripts through the browser’s developer console to force interactions that are not part of the Hubs platform. Long term, we would be interested in exploring this further to trigger cues for the narrative storytelling and audience experience.
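As a flavor of what this looks like: Hubs renders its rooms with A-Frame, so objects in the scene can be reached from the developer console as ordinary DOM entities. A minimal sketch, not one of our actual show cues (the `media-loader` selector and the “levitating prop” idea are assumptions about a typical Hubs room):

```js
// Run in the browser's developer console inside a Hubs room.
// Hubs rooms are A-Frame scenes, so media objects appear as entities.
const scene = document.querySelector('a-scene');

// Grab the first media object in the room (assumption: at least one
// image/model has been placed, which Hubs tags with `media-loader`).
const prop = scene.querySelector('[media-loader]');

// "Cue": float the prop half a meter upward for everyone watching locally.
if (prop) prop.object3D.position.y += 0.5;
```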
Collaborators and audiences joined via their computer web browsers, smartphones, and VR headsets, and we successfully hosted over 30 synchronous attendees at each show. This makes a compelling case for ubiquitous access: anyone with a browser can watch, and create, live storytelling. We’ve presented our findings and held open rehearsals and performances “at”:
After the show, I asked each team member to record their experience working on the production and to share any clips or screen recordings from our rehearsal process. I curated these materials into the above docu-mercial for our show and workflow.
As we explored future projects on the platform, I was inspired by the Museum of Other Realities to use Mozilla Spoke to create a “living exhibit” where potential partners and collaborators could interact with the space and learn about the elements used in the live show: transparent PNGs give the illusion of engraved surfaces, GIFs loop moments of the show, a Spawner lets anyone grab the rabbit prop or a page of the script, a three.js point light creates dynamic lighting, and a model of the set design sits on the table. I re-designed the lobby area of #FutureStages with these interactive objects, and we curated our information on a dedicated website, Jettison.LIVE, as a simple point of entry for future rehearsals.