Welcome back for the fourth of five Sundance 2018 installments in our VR Visionaries series! So far, we’ve taken you to the mean streets of 1980s East Los, into the attic and imagination of a clever girl named Lucy, and out of this world with Felix & Paul. Today, we’re stepping inside the mind of a small town police dispatcher with Dispatch—an emotionally charged episodic narrative out now on Rift and Gear VR.
When the first three episodes launched in November, we took you behind the scenes with Here Be Dragons to learn more about the project’s artistic direction. In advance of the finale’s release, we sat down with Fire Panda Ltd. Director and Dispatch Lead Developer Nick Pittom to dig deeper into the technical and creative chops that brought this narrative vision to life in VR.
What was it like working with Here Be Dragons? How did that collaborative transatlantic partnership inform the project’s outcome?
Nick Pittom: It’s been fantastic to work with Here Be Dragons. They’re an experienced and professional unit, but also very creatively and artistically aware. My team was able to be hands-off with much of the production side and focus on the technical and artistic visual elements, which are the most enjoyable to my mind.
Fire Panda is based in the UK, while Here Be Dragons is in LA, yet with the technology we have these days to screen share and video call, it’s really no problem to collaborate live. Indeed, most of the people I work with on projects are in different countries, whether clients or contractors. Oculus also has a platform perfectly set up to manage new builds across different channels and to test on different devices: With the time difference, we can deliver a new build at the end of our day and have it ready and waiting at the start of theirs—then we’re back in the office the following morning for the next round of feedback.
Of course, meeting in person is still invaluable—especially for a long project—and I flew across to the HBD offices in LA for the mo-cap session. Here Be Dragons organized the shoot with Ari Karczag of Perception Neuron at his LA mo-cap studio. Perception Neuron is a fantastic and very cost-effective kit, but it’s not without its interesting quirks, so having him and his team on-hand was exceptionally valuable. It was also vital in getting suitably clean results. Meanwhile, we worked with Richard Dorton, a mo-cap performer and expert with decades of stunt experience, along with actors Tyler King and Jessica Troiani. This was all made possible through Here Be Dragons, and to be able to rely on them for these aspects of the production was awesome.
Who did you work with on the music and overall sound design for the piece? How does their work contribute to the overall experience?
NP: As writer and director, Ed Robles worked closely with Matt Yocum, the sound designer. Conceptually, the narrative is driven by the audio, so it made sense for the project to be the same—with Ed and Matt delivering the episodes to us and crafting the experience around that. As with the visual style, this was an iterative process, with the result then informing how the audio might be altered to enhance it.
Certainly the sound design was vital to the experience—the tone and mood, the music, the incidental audio moments. When Franklin, the antagonist, is hammering on a door, it needs to sound as if we’re on the other side. The visuals need to react to that sound, with pulses on the door with each impact. When the door breaks, it’s not just opened, but it shatters into a hundred pieces. The visuals and the audio must work together, more so than on any other project I’ve worked on. Indeed, much of our work was designing how the visual elements would react to the audio. Characters are only truly present in a scene when they’re talking, appearing only as an abstracted form of lines and points.
These systems were created by Jason Storey, our lead software designer, and are driven by audio keyframes via a plugin called Koreographer, itself designed for music visualization. Each key can be individually identified for its effect, and the systems Jason created gave us a great deal of flexibility in the way these were expressed. As characters talk, an incomplete wireframe form appears, the form changing with each syllable spoken. The scenery itself brightens as the character talks. As the protagonist breaks into the house, the scenery flashes to aggressive colors when people talk. Everything in the experience is driven and triggered by these sorts of keyframes, from character animation to visual effect bursts—and Jason’s systems handled all these events.
The strength here comes to the fore in iterative projects such as this. If the timings on the audio change, we simply have to re-time the keyframes and all the other events fall into place. It’s always better to avoid that, of course, but it was useful for Matt and Ed to be able to change timings and for it to be straightforward to alter in the project.
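The pattern Nick describes—named events on an audio timeline, with the effect logic registered separately so keyframes can be re-timed without touching anything else—can be sketched in a few lines. This is a minimal, hypothetical Python illustration of the idea, not Koreographer’s actual API (all names here are illustrative):

```python
class AudioKeyframeTrack:
    """Minimal sketch of an audio-keyframe event track: each keyframe
    pairs a timestamp with an event id; handlers are registered per id,
    so re-timing the keys never touches the effect logic."""

    def __init__(self):
        self.keyframes = []   # list of (time_seconds, event_id)
        self.handlers = {}    # event_id -> list of callbacks

    def add_keyframe(self, time_s, event_id):
        self.keyframes.append((time_s, event_id))

    def on(self, event_id, callback):
        # Visual effects subscribe by event id, independent of timing.
        self.handlers.setdefault(event_id, []).append(callback)

    def retime(self, offset_s):
        # If the audio edit shifts, shift every keyframe;
        # the registered handlers all stay exactly as they were.
        self.keyframes = [(t + offset_s, e) for t, e in self.keyframes]

    def update(self, prev_time, now):
        # Called once per frame: fire every event whose timestamp
        # falls inside this frame's time window.
        for t, event_id in self.keyframes:
            if prev_time <= t < now:
                for cb in self.handlers.get(event_id, []):
                    cb(t)

# Usage: one door-impact key drives both a pulse and a flash effect.
track = AudioKeyframeTrack()
track.add_keyframe(1.5, "door_impact")
fired = []
track.on("door_impact", lambda t: fired.append(("pulse", t)))
track.on("door_impact", lambda t: fired.append(("flash", t)))
track.update(1.0, 2.0)  # fires both handlers at t = 1.5
```

Decoupling the “when” (keyframes) from the “what” (handlers) is what makes the re-timing Nick mentions cheap: a shifted audio mix only requires shifting the keys.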
Being in VR, the location of audio is also useful for driving where viewers look or drawing attention to important elements. Being on Gear VR in addition to Rift, we needed to keep things as optimized as possible—and we found that not every element required spatialization, so much could sit on the main stereo mix. Yet truly, the most important element in all this was the voice acting. Ed and Dragons were able to entice Silicon Valley’s Martin Starr (aka Gilfoyle) and character actor Beth Grant to Dispatch. Their involvement elevated the entire project.
VR’s still a fairly new field, and Fire Panda already has some impressive credits to its name. What lessons learned from Technolust, Ghost in the Shell, COLOSSE, and/or Apollo 11 VR did you incorporate during development of this new project?
NP: There’s a pull between the technical and artistic demands of real-time VR, and I’ve certainly learned a great deal. Each project brings new opportunities to try something new, from motion capture (mo-cap) on Apollo 11 to interaction on games like Technolust or exploring how people engage with cinematic details on Ghost in the Shell. Beyond that, Fire Panda developed a medical training simulation for the Royal College of Surgeons in Ireland, and we also developed an experience for an aircraft company looking to sell their planes to airlines. Each brings a new understanding of how people engage with VR and, of course, additional experience with the technical aspects required to produce VR content.
Each also has its own challenges, certainly, when creating something artistic within a technical environment, and of course Dispatch is no different. Yet it’s ambitious in ways I’ve not approached in previous VR work. It’s fundamentally audio-led, with the visual aspects driven by what Ted, the protagonist, hears. He’s the dispatcher, taking 911 calls and having to figure out what’s going on based on what the caller says and the events he’s hearing in the background. It leads to interesting opportunities. Ted may hear something and imagine something that’s very different from what’s actually happening. You can mislead the viewer, exaggerate certain aspects. Indeed, the very first scene takes place within a world that doesn’t actually exist the way Ted believes it does. That sort of challenge is exciting.
How does Dispatch differ from your earlier work in VR?
NP: Working with Ed Robles, the writer and director, was very collaborative in the way we developed the visual language. More so than with other projects, the visual style absolutely evolved as we progressed. With COLOSSE, for example, it was fundamentally visual in nature, going from a 2D illustrative style into a 3D world, and the concept stayed very much true to the initial vision. The iterative approach to Dispatch gave us the opportunity to really explore how each moment could best be visualized.
My earliest projects were fan creations based upon Studio Ghibli films, with the boiler room from Spirited Away and bus stop from My Neighbor Totoro. The process of taking those illustrated scenes into fully 3D worlds was an important lesson for me in what does and does not work in VR. With COLOSSE, we focused on how the scale of VR—of being in there—allows us to lead the viewer and direct their focus. The construction and layout of a scene and the way events lead the viewer are as important to VR as editing is to film.
Yet due to the nature of Dispatch being constructed in the mind of Ted, we can push this further. Scenes can melt away or shatter, giving way to new locations. Elements can fade in or out as Ted’s focus shifts. Scale is important, whether it’s giant creatures walking over you as in COLOSSE or a small diorama in front of you. In Dispatch, we used scale and distance from the camera to illustrate the intensity of Ted’s engagement within a moment. We really tried to explore every opportunity we could to push these elements as far as possible. We can also exploit that same focus, triggering events as viewers look around, and we do a little of that as well.
How does your background in animation impact your work in VR?
NP: My background is in visual effects (VFX), editing, and animation, which are all very technically driven skills. Yet it’s their use in storytelling that always made me most excited. Before VR, I wanted to be a feature film writer and director. I was excited about what VFX-focused directors like Neill Blomkamp were able to achieve and very much geeked out on the technical aspects behind the art. I’d made a short film called PROTO, about a cute, mute little robot. It was filmed in a genuine robotics lab, and the CGI robots we produced looked fantastic—even so, film felt limiting in a number of ways. The stories I wanted to tell were too ambitious. Games felt like a possibility, yet it was when VR came along that I finally felt there was a medium where I could tell the stories I wanted. Dispatch is the first project that I feel delivers on the ambition I set out with.
What about your work in the gaming space?
NP: I’ve always loved games, and having the opportunity to work on them in VR was still something that interested me. I co-founded Psytec Games with Jon Hibbins, and since then we’ve released two titles: Crystal Rift and Windlands. Both have some storytelling and experiments, but the main focus was to create gaming experiences that were engaging, fun, and worthwhile in VR. It’s about the end-user experience within VR, and we’ve always sought to ensure our games are as comfortable as possible.
Yet the tension of narrative in gaming is an interesting one. Typically, stories are there in service of the game. It’s rare that the game is there purely to tell the story. There’s some crossover there, and I’m interested in pushing how far each side can go. We’ve been working on Windlands 2 for over a year now, and we’ve seen it as an opportunity to bring in NPCs and tell a much more engaging story. In the first game, we flirted with cinematic moments, with Titan robots looming over the player. With Windlands 2, that sort of thing will be much more fundamental to the experience.
With narrative experiences, I want to push the other way and see where viewer interaction can create opportunities to engage with the story. If the story reacts to you and you have some shared authorship of it, then perhaps you’ll have more engagement within that story.
Why was an episodic format the best way to tell this particular story in VR?
NP: There’s a question of the perfect story length in VR. Will people sit for two hours like in a cinema? What’s too short? TV shows might typically be an hour. Games can be tens of hours. VR is still a new medium, and many of the conventions are yet to be established, so it’s valuable for experiences like Dispatch to experiment with what’s possible.
Being episodic allows the story to be split into smaller sections. I think our experience shows that people will indeed sit for much longer than our individual episodes, although much of that comes down to the pacing and storytelling. It’s also not purely about what viewers expect—but the ability to push the narrative form. The clear break allows each episode to be distinct in style, pacing, and visual language. Episode 2, for instance, is a tense home-invasion story, while Episode 3 is an extended car chase. Episode 4, meanwhile, begins completely differently from all the others and, for the most part, is no longer even set in “darkness” as the earlier episodes are. Perhaps it could have worked as a single piece, but I don’t think the story as we told it would have worked as well as it does without the use of episodic breaks.
Dispatch deals with some challenging subject matter. How did you navigate telling such an intense and, at times, distressing story, particularly at the intersection of art and technology—and in such an immersive way?
NP: It is indeed intense, and it covers some potentially disturbing issues. It was important to ensure that these were handled sensitively. The heavy lifting on that is via the dialogue and its delivery, yet we’re raising important questions, and it’s equally important to ensure they come across through the art. In this regard, the visual style allows the viewer to take from it what they need—to fill in those details.
There’s no glorification of violence, as the violence is obfuscated by the style. Blood appears vividly red, highlighting the horror in Ted’s mind, but it’s also abstracted to floating particles. There are no lingering or voyeuristic shots. Indeed, while bodies appear, they tend to be fleeting moments and then disappear from view, as if Ted is purposely avoiding thinking of them. The mo-cap was important in this—the viewer is able to recognize and identify genuine human emotion through those movements, to feel a person there. People should feel the tension as someone hides, horror as they’re attacked, and elation as they escape.
It’s up to us to ensure that the technical and artistic needs are married together.
How do you think VR and AR will affect the art of storytelling moving forward?
NP: I’ve always felt that VR is a new storytelling medium. It demands a new approach. You’re inside the story and there with it. It’s more akin to immersive theater in a way, yet with so much more potential as to what we can show, along with the extent to which viewers can become engaged. Dispatch as it’s told would simply not work as well on a flat screen. It would be confusing. You wouldn’t be able to visually parse the detail or forms in the same way. The story would’ve been told differently. The experience would be different. It’s much the same as how novels can never quite translate to a visual medium without losing something.
VR adds something meaningful to a narrative experience. I also find AR fascinating in its potential. It brings a story into your world, rather than you into another world as with VR—a subtle distinction, but one that raises interesting questions. If you choose the setting for the story, does that impact the tone? A murder mystery or dramatic story might be at odds with the wrong setting. Does the experience itself know the setting? Does it matter? A different challenge, in any case, but certainly an interesting one.
I'd love to see where further experimentation with interaction and narrative takes us. Interaction brings agency. How might that alter the story? How is that story told? Can you be a character within the story and also keep directorial intent? Where does it stop being a story and start being a game? Does that distinction even matter?
What Remains of Edith Finch is a perfect example of storytelling fitting its medium. It’s a game, but it’s not very “game-like.” It might be better to call it interactive fiction. VR might be perfect for this sort of experience.
What’s next for you? Any exciting projects in the works?
NP: As part of Psytec Games, I’m working on Windlands 2’s story, cinematic elements, game design, experience, and so on. I’m always eager to add as many moments of wonder as possible, and Windlands 2 is a great playground for pushing and experimenting with those. It looks fantastic, thanks to an awesome team of artists. We’ve also got plans for two other titles and are in the early stages of planning those.
Anything else you’d like to share with our readers?
NP: Dispatch has been an awesome project, and it’s great to have it finally complete for people to view the whole thing. Episode 4 is being released in conjunction with Sundance, so it’ll be great to see how people react to it.
It’s been gratifying seeing how VR has continued to grow. Since the consumer Rift and the HTC Vive arrived, I’ve felt that the hardware for VR is ready. It’s there—and it’s going to get better. The content is where it’s always been important to see the medium grow, and it’s awesome to see how it has evolved. I never felt VR needed to explode into everyday life for the majority immediately. It simply needed to consistently push in that direction, and so it seems to be doing. As long as I get to keep making projects that excite me, I’ll be happy.
It’s great to meet like-minded developers and artists—and to see their projects—so people should get in touch. This also goes for those with artistic or technical skills who might be looking to work on projects. Anyone with a project they’d like us to work on should absolutely get in touch as well.
Thanks for chatting with us, Nick. We can’t wait to see the response to Episode 4 of Dispatch.
Stay tuned for our fifth and final Sundance 2018 VR Visionaries post—and the release of Dispatch’s finale!
— The Oculus Team