A disembodied eyeball stares at us from the center of a massive spiderweb. The web’s threads emanate from its lids like extreme lashes, shuddering with energy as the eye blinks open and shut. Its pupil dilates with fear and intensity as its focus pivots, following the movements of a small, spritely being made of rocks. We’re in a cave-like enclave, inside a tree trunk, amidst a storm-swept forest. 

Actually, we’re on the couch, and a handful of performers wearing unitards dotted with motion capture sensors are in Portsmouth, England, in a small, square space, gridded with tape markings and sprinkled with black cubes. The performers are helping one another move onto these cubes, and even onto each other’s shoulders, in a futuristic trust exercise, blinded to their actual surroundings by virtual reality headsets that place them in the fantastical forest their audience sees.

This is Dream, a research and development collaboration between the Royal Shakespeare Company, Marshmallow Laser Feast, the Philharmonia Orchestra, and Manchester International Festival, with funding from the Audience of the Future Consortium. Dream is a half-hour dance-theatre performance loosely inspired by A Midsummer Night’s Dream; it ran from March 12th through 20th. The performers appear not as themselves but as a variety of fantastical creatures in a beautifully designed virtual forest. Their movements are captured and rendered in real time in an evocative art style, then live-streamed to audiences around the world on any internet-connected device.

Dream plucks Puck and the four fairy servants from Shakespeare’s play and places them in the forest without the source material’s pesky human interlopers. There’s some sort of conflict with a growing storm, but it’s all a bit hard to follow. The visuals and performance methodologies were clearly more of a focus than the narrative, and that emphasis on visuals does pay off. The forest is an ever-transforming environment, transitioning from green-dappled to storm-ridden to sun-drenched. The character design is also striking, turning constraint into creativity. Abstracted beings made of stones, blossoms, or bark focus the audience’s attention on physical performance, and they avoid the heavy processing that capturing facial movement would require. Other creatures have no body at all: one is purely face, captured via an iPhone camera and instantly transformed into interwoven roots; another is made of fluctuating, colored leaves. Each is inventive, evocative, and unique.

Yet the piece still has room to grow in two respects. First, the story. From the beginning, Puck’s mission is neither clear nor driving enough, so a sense of emotional catharsis is missing from the audience’s experience, leaving us marveling at the show’s visual and technological prowess rather than forging deep connections. Second, the interactivity is good in concept but lacking in delivery. Paid interactive ticket holders drop lightning bugs to help guide Puck’s way, and seeds to regrow the storm-swept forest floor. The concept is lovely, but the UX design left participants unsure which lightning bug was their own, which limited any feeling of real contribution. A great participatory experience, whether live or digital, is defined by giving participants true agency and letting their actions change the outcome. This interactivity didn’t quite accomplish either, but it did prove that the technology could handle concurrent real-time input at a decently large scale without crashing, with audiences ranging from 3,000 to 8,000 people.

That said, this collaborative team created an extraordinary experiment in real-time visuals, evocative motion capture, and responsive audio. Philharmonia-recorded instrument samples, tied to performers’ physical movements via Gestrument Pro, ebbed and flowed in intensity in direct proportion to the actors’ physicality. And it’s wonderful that the creative team folded the performers’ experiences into their development process, such as taking input on how puppeteering Cobweb’s eyeball1 felt better when the actor used their full body to define dilation or blinking. Performers, naturally more in touch with their physical experience, have much to teach us about what will and won’t work in these new hybrid methodologies. It’s exciting to see an experiment of this kind accomplish so much in integrating immersive technology and performance, even if it fell a bit short on story and interactivity. Imagine how moving an experience like this could be once the story it tells carries real emotional weight! Until then, it’s worth watching what this creative team develops next, as they’re at the forefront of a very exciting space.

1Fun fact: Shakespeare actually made up the word eyeball—in A Midsummer Night’s Dream! Oberon says “Then crush this herb into Lysander’s eye; / Whose liquor hath this virtuous property, / To take from thence all error with his might, / And make his eyeballs roll with wonted sight.”