The technology behind 'The Tempest'

In the Royal Shakespeare Company’s latest production of The Tempest, the character of Ariel is partly recreated as a digital avatar while the actor performs live on stage (right of image). Motion-capture technology is used to turn the actor’s movements into an animation that flies, spins and whizzes around the stage © Intel

William Shakespeare’s The Tempest is a fantastical play that features illusion and otherworldly beings. Technology journalist Richard Gray spoke to Sarah Ellis, Director of Digital Development at the Royal Shakespeare Company, Ben Lumsden, Head of Studio at Imaginarium, and Tawny Schlieski, Director of Desktop Research at Intel, about how cutting-edge technology has brought the magic and spectacle to life on stage.

‘On a ship at sea: a tempestuous noise of thunder and lightning heard.’ So reads the opening stage direction of William Shakespeare’s The Tempest.

CAPTURING THE ACTION

Motion capture itself is not all that new. It has been 16 years since Andy Serkis first appeared on cinema screens as the snarling and snivelling animated character Gollum in The Lord of the Rings.

ILLUSIONS ON STAGE

Adapting a technique usually used in feature films, with their long production times, to a live stage performance was no easy task. The technology had to be robust enough for eight shows a week in a live environment and work with all of the other technologies used on stage. Initially, the team had imagined projecting the character of Ariel onto the stage while actor Mark Quartley, who plays the ‘airy sprite’ in the production, performed the role in a room backstage. It is something that had been done before.

UNREAL ENGINE

First created by video games developer Epic Games in 1998, the Unreal Engine technology has become one of the most popular tools for creating realistic three-dimensional graphics for games. It features a set of software tools written using the C++ programming language that allow users to animate digital characters, build virtual worlds and replicate realistic physical interactions between animated objects.
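As a rough illustration of what such an engine does on every frame, the toy sketch below advances a falling object by one tick of simple physics. It is generic Python written for illustration only, not Unreal Engine's actual C++ API:

```python
# Toy sketch of a real-time engine's per-frame update: advance physics,
# then hand the result to rendering. Generic illustration, not Unreal's API.

DT = 1 / 60.0          # one frame at an assumed 60 fps
GRAVITY = -9.81        # m/s^2

def tick(state):
    """Advance a falling object by one frame of simple physics."""
    state["vy"] += GRAVITY * DT
    state["y"] += state["vy"] * DT
    return state

state = {"y": 3.0, "vy": 0.0}   # an object dropped from 3 metres
for _ in range(5):
    state = tick(state)
print(f"after 5 frames: y = {state['y']:.2f} m")
```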

CHARACTER FROM A COMPUTER

Intel had to custom build four different computer systems to cope with the demands of animating Ariel in real time. The first purely receives the data from the sensors in Quartley’s suit and then passes it to another machine that maps the data onto a digital 3D model of Quartley’s body, built up using body scanning. This model is then superimposed onto the digital avatar using the Unreal Engine, which allowed designers to play with the look of the character so that Ariel can appear in different guises throughout the play. Another computer system drives the theatrical control system, ensuring that the graphics produced by the Unreal Engine appear on the right part of the stage at the right time. A final machine powers the video server that is connected to the 27 high-definition laser projectors that show the final images on the stage.
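The division of labour between the four machines might be sketched like this. It is a minimal illustration: the function names, data fields and values are assumptions, not Intel's actual software:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Illustrative sketch of the four-machine pipeline described in the text.

@dataclass
class MocapFrame:
    # orientation per sensor in the suit, as (roll, pitch, yaw) in degrees
    sensors: Dict[str, Tuple[float, float, float]]

def receive_sensor_data(packet: Dict) -> MocapFrame:
    """Machine 1: ingest raw sensor readings from the suit."""
    return MocapFrame(sensors=dict(packet))

def retarget_to_body_model(frame: MocapFrame) -> Dict[str, Tuple[float, float, float]]:
    """Machine 2: map sensor orientations onto the scanned 3D body model."""
    # A real system would solve for the full skeleton; here rotations pass through.
    return {f"joint_{name}": rot for name, rot in frame.sensors.items()}

def render_avatar(pose: Dict) -> str:
    """Machine 3 (game engine and show control): render Ariel's avatar for this pose."""
    return f"frame rendered with {len(pose)} animated joints"

def drive_projectors(rendered: str) -> None:
    """Machine 4: hand the rendered frame to the video server feeding 27 projectors."""
    print(f"projecting: {rendered}")

# One trip through the chain, as a single captured frame would make it:
packet = {"left_wrist": (0.0, 45.0, 10.0), "right_wrist": (5.0, -30.0, 0.0)}
drive_projectors(render_avatar(retarget_to_body_model(receive_sensor_data(packet))))
```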

Intel built two identical copies of each system, running in parallel, so that if there was a problem, such as a glitch in the Unreal Engine that needed a restart, the computer engineers could switch to the mirror machine, which would be in sync. This meant that there were eight computer systems whirring away in the theatre to produce the digital character of Ariel. The computers driving the projectors had 120 Intel i7 cores placed in server racks – a setup that staff nicknamed the ‘Big Beast’.
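The hot-standby arrangement could work along the lines of the sketch below, in which both machines process every frame so the mirror stays in sync, and only the active machine's output reaches the stage. The node names and switching logic are assumptions:

```python
# Sketch of a hot standby: two identical render nodes process every frame
# so the mirror stays in sync; a cut-over happens if the active one fails.

class RenderNode:
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def render(self, frame_id):
        if not self.healthy:
            raise RuntimeError(f"{self.name} needs a restart")
        return f"{self.name}: frame {frame_id}"

primary, mirror = RenderNode("unreal-A"), RenderNode("unreal-B")
active = primary

for frame_id in range(3):
    if frame_id == 1:
        primary.healthy = False          # simulate a mid-show glitch
    outputs = {}
    for node in (primary, mirror):
        try:
            outputs[node.name] = node.render(frame_id)
        except RuntimeError:
            pass
    if active.name not in outputs:       # the active machine failed this frame
        active = mirror if active is primary else primary
        print(f"cut over to {active.name}")
    print(outputs[active.name])          # only one output reaches the stage
```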

Mark Quartley

To capture actor Mark Quartley’s facial movements in real time, Imaginarium built a system to attach a motion-capture camera to his head. Special make-up is used to highlight key parts of his face and the movements are then fed into the computer system and reproduced on his digital avatar © Intel

One of the first challenges that the Intel team faced was making sure that the hardware did not melt, which meant keeping the computers away from the sweltering glare of the lights and using a lot of fans to keep them cool. In a theatre performance, which relies on keeping the attention of the audience on the stage, the buzz of these fans and the hum of the computers could have been a distraction, so each of the computer servers had to be packed with soundproofing and moved as far from the stage as possible. The team also faced another noise problem: the 27 projectors sitting over the heads of the audience would have created a cacophony that would be hard to ignore. However, just months before the performance opened, the team got hold of new laser projectors that could switch on and off silently.

Imaginarium also built its own system for capturing the facial movements of Quartley in real time for a scene during Act III of the play, when Ariel is sent as a harpy to terrify the shipwrecked lords. The team created a contraption similar to a cycle helmet for Quartley to wear on his head during the scene. A metal bar extends from either side of his head to hold a motion-capture camera 20 centimetres away from his face. The team developed make-up for Quartley that would highlight his lips, eyes and other key parts of his face to ensure that the camera could pick up his movements and feed them to the computer system. They also developed algorithms that learned to recognise human faces before training the software on Quartley’s own face over several months as he rehearsed the scene. This allowed the designers to work out how the facial expressions Quartley produced would look on the harpy.
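In spirit, the retargeting step might look something like the sketch below, which turns tracked 2D marker positions from the head-mounted camera into expression weights for the harpy. The marker names, neutral positions and the mapping itself are illustrative assumptions, not Imaginarium's algorithms:

```python
# Sketch of marker-based facial retargeting: make-up highlights key points,
# a head-mounted camera tracks them, and their displacement from a neutral
# pose drives expression weights on the digital harpy.

NEUTRAL = {"mouth_corner_l": (0.30, 0.00), "mouth_corner_r": (0.70, 0.00),
           "brow_l": (0.35, 0.80), "brow_r": (0.65, 0.80)}

def expression_weights(tracked):
    """Turn tracked 2D marker positions into 0..1 blendshape weights."""
    def lift(name):
        return tracked[name][1] - NEUTRAL[name][1]
    # Smile: both mouth corners rise; brow raise: both brows rise.
    smile = max(0.0, (lift("mouth_corner_l") + lift("mouth_corner_r")) * 5.0)
    brows = max(0.0, (lift("brow_l") + lift("brow_r")) * 2.5)
    return {"harpy_smile": min(1.0, smile), "harpy_brow_raise": min(1.0, brows)}

# One captured frame from the head-mounted camera (normalised coordinates):
frame = {"mouth_corner_l": (0.30, 0.06), "mouth_corner_r": (0.70, 0.05),
         "brow_l": (0.35, 0.90), "brow_r": (0.65, 0.88)}
print(expression_weights(frame))
```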

In a neat reversal, Imaginarium expects to use the real-time facial technology developed for The Tempest in future feature film productions to allow directors to see their actors’ performances rendered onto digital characters on set rather than months later.

Production

The production’s set features an elaborate gauze cylinder in the middle of the stage that uses the motion-capture technology to ensure that projections, lighting and spotlights are synchronised © Royal Shakespeare Company/Topher McGrillis

Ariel’s performance is not the only use of motion capture in the play. The digital avatar is projected onto 14 curved gauze screens that fly in and out of the stage, while in the centre there is a gauze cylinder filled with smoke known as the ‘cloud’. Each screen is tracked using optical motion-capture cameras and software: bespoke software created by Vicon, a motion-capture technology specialist, and D3, a video server firm, follows the screens as they move around. This data is superimposed onto a virtual map of the stage to help ensure that the images are projected onto the right spot, while also guaranteeing that the lighting software and spotlights work in sync with the projections. The same system is used to create other visual effects seen during the performance, including projecting images of dogs onto drums carried by spirits as they chase some of the characters.
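A much-simplified 2D version of that mapping is sketched below: the tracked stage position of a gauze screen is converted into the pixel window a projector should fill, so the image follows the screen as it flies. The stage dimensions and resolution are assumed values, and the real system tracks full 3D pose:

```python
# Sketch of the screen-tracking idea: place each tracked screen on a
# virtual map of the stage, then compute where the projector must draw.

STAGE_WIDTH_M = 12.0          # assumed stage width covered by one projector
PROJECTOR_PIXELS = 1920       # assumed horizontal resolution

def screen_to_pixels(screen_x_m, screen_width_m):
    """Map a tracked screen's stage position (metres) to a pixel window."""
    px_per_m = PROJECTOR_PIXELS / STAGE_WIDTH_M
    left = (screen_x_m - screen_width_m / 2) * px_per_m
    right = (screen_x_m + screen_width_m / 2) * px_per_m
    return round(left), round(right)

# As the screen flies from stage left towards centre, the window follows:
for x in (2.0, 4.5, 6.0):
    print(f"screen at {x} m -> project into pixels {screen_to_pixels(x, 1.5)}")
```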

To help the production team each night, engineers at the RSC also built controls into the theatre’s lighting control console so that they could change Ariel’s appearance using faders. What they created was a sort of ‘avatar mixing desk’: an analogue fader could make Ariel disappear or reappear, change his colour or appearance, or set him on fire and control the intensity of the flames. In the end, much of the appearance of Ariel’s digital avatar was automated with cues at Stratford, but the engineers hope to make more use of the mixing desk approach when the show transfers to London.
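A minimal sketch of how console faders might map onto avatar parameters is shown below. The fader assignments, value range and parameter names are assumptions, not the RSC's actual console configuration:

```python
# Sketch of the 'avatar mixing desk': analogue faders on the lighting
# console mapped to parameters of Ariel's digital avatar.

FADER_MAP = {1: "opacity", 2: "hue_shift", 3: "fire_amount"}

def faders_to_avatar_params(fader_levels):
    """Convert raw fader positions (0-255, as a console might send) to 0..1 parameters."""
    return {FADER_MAP[n]: level / 255 for n, level in fader_levels.items() if n in FADER_MAP}

# The operator pulls fader 1 down to fade Ariel out while raising fader 3:
print(faders_to_avatar_params({1: 64, 2: 0, 3: 200}))
# -> {'opacity': 0.25..., 'hue_shift': 0.0, 'fire_amount': 0.78...}
```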

IRONING OUT PROBLEMS

The final performance is not without its problems. Critics have commented on noticeable lip-synching issues when Ariel sings or speaks, and there are also difficulties with the delay that occurs as Quartley’s movements are converted into data and back into images: the games engine, each video server and the projectors all add frames of latency, covering the time from rendering to display.
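The arithmetic of that latency is easy to sketch. The per-stage frame counts below are illustrative assumptions rather than measured figures from the production:

```python
# Sketch of how latency accumulates along the capture-to-projection chain.

FPS = 60                                   # assumed system frame rate
stage_frames = {"motion capture": 2, "game engine": 3,
                "video server": 2, "projector": 1}

total_frames = sum(stage_frames.values())
print(f"total pipeline latency: {total_frames} frames "
      f"= {1000 * total_frames / FPS:.0f} ms at {FPS} fps")
# Even a handful of frames per stage adds up to a visible lag between
# the actor's movement and the avatar's response.
```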

Due to the RSC’s tight production schedule, the cast did not begin rehearsing on the final stage until a few weeks before the play opened, and there were still problems to be ironed out. The very large metal structures within the theatre building created dead spots where the suit would not track properly, as the gyroscopic sensors reacted to the metal, so Quartley had to remap his performance to avoid them.
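One way such dead spots might be charted during rehearsal is sketched below: log where on stage the suit's reported orientation diverges from a reference, and flag the spots for the actor to avoid. The threshold and sample data are invented for illustration:

```python
# Sketch of charting tracking dead spots: flag stage positions where the
# suit's orientation disagrees with a reference by more than a threshold.

DIVERGENCE_LIMIT_DEG = 15.0

samples = [  # (stage x in m, stage y in m, suit-vs-reference divergence in degrees)
    (1.0, 2.0, 3.2), (4.0, 2.0, 22.5), (4.5, 2.5, 31.0), (8.0, 1.0, 4.1),
]

dead_spots = [(x, y) for x, y, err in samples if err > DIVERGENCE_LIMIT_DEG]
print("avoid these stage positions:", dead_spots)
```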

A technical team also waits on standby in the wings during each performance, ready to recalibrate Quartley’s suit when he comes off stage between scenes. Quartley faced other problems in the final days of rehearsals: the graphics team were almost continually tweaking and updating the way the digital character moved, which meant that he had to change his onstage movements as well to get the avatar to do what he wanted.

BIOGRAPHY

Sarah Ellis is Director of Digital Development at the RSC. As a theatre and spoken word producer, she has worked with venues including the Old Vic Tunnels, Battersea Arts Centre, Southbank Centre, Soho Theatre and Shunt.

Ben Lumsden is Head of Studio at Imaginarium Studios. He started working on motion capture on the film District 9 and has worked on a number of feature films since.

Tawny Schlieski is Director of Desktop Research at Intel Corporation. She is a research scientist and media expert in the Intel Experience Group, and her work centres on new storytelling capabilities enabled by emergent technologies.
