VR Visionaries: Jérôme Blanquet
Oculus Blog
|
Posted by Oculus VR
|
July 26, 2017

Welcome to the second installment of VR Visionaries, where we shine a light on innovative immersive experiences and the artists behind them. Today, we take an in-depth look at Alteration—a short VR film that explores AI and digital vampirism, now available on Rift and Gear VR—with director Jérôme Blanquet.

What initially drew you to VR as a creative medium?

Jérôme Blanquet: The first time I tried VR, its power of immersion was a sort of revelation. VR literally allows us to be transported into another dimension. To me, it felt like a whole new way of telling a story. It has enormous potential. It’s more immersive and more individual.

On the other hand, VR is a technical setup that isn’t suitable for all storylines.

I was very quickly frustrated with VR demos. It bothered me not to be able to interact with the surrounding environment, and to have to follow a movement or stay rooted to the spot for no particular reason.

That’s why, in Alteration, I chose to place the storyline in the subconscious state of deep dream. I guess this is what sounded appealing at first to some of our early partners like ARTE, who really pushed us to make this project happen.

What was it about dreaming that you felt resonated with VR?

JB: Dreaming, by definition, implies that you become a simple observer of your own imagination and memories. I was also very interested in depicting the visual essence of a dream: elliptical visuals made up of blocks, broken up, disconnected.

If we draw a line from there, VR and dreams obviously have a lot more in common. Both feel real—they engage and move us. So we could say that in VR, as a director, I get to replace the audience’s subconscious with characters, settings, and lights. Pretty exciting, right?

Absolutely. Can you tell us a little about the film’s plot and how dreams come into play there?

JB: Alteration is a story about the possible future of interactions between humans and machines—and more precisely about how our dreams and emotions, which make us human, are assimilated and interpreted by an artificial intelligence. Through Elsa, the film’s AI, scientists are trying to map out patterns in human emotional behavior and the different stages of mental development in order to humanize her.

Is it a coincidence that the film features an AI antagonist and that you used artificial intelligence in post-production, or was the echo intentional?

JB: In the movie, you’re invited to follow Elsa as she gets into the dreams of a test subject, Alexandro. Every time Elsa appears, the dreams are globally altered, transformed. Now how, as a filmmaker and as a human, can you possibly imagine the types of changes an AI would make to some very picturesque dreams? The only solution that made sense for me was to use real AI and the tools of deep learning.

So we had to try and find the best team to help us. Okio Studio contacted the Facebook AI Research (FAIR) team at that point because they’re considered one of the most advanced groups in visual AI research. We consider ourselves lucky that Antoine Bordes from FAIR Paris was instantly seduced by the idea and dedicated part of his team to solving our problems.

What was it like to work with Facebook AI Research (FAIR)? How did that collaboration inform the project as a whole?

JB: Working with FAIR made us realize that AI is now mature enough to be used as a creative tool for filmmakers. We basically asked the neural network to interpret the images we shot and then let its imagination run free and even modify the visual treatment.

Prior to shooting, we discussed the different deep learning tools and the workflow we were going to follow. After filming, we started to experiment with the machines. After many tests, I realized that 360° and stereoscopic immersive effects tend to overwhelm viewers when combined with deep AI visual effects. I finally focused on one methodology called style transfer, which consists of recomposing images in the style of other images—something FAIR would explain better than me.
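For readers curious about the technique, here is a minimal sketch of style transfer in its classic form: an image is optimized so that its content matches one photo while its texture statistics match a reference painting. It uses PyTorch and a pretrained VGG19 network; the file names, layer choices, and hyperparameters are illustrative assumptions, not the actual pipeline FAIR and Okio Studio built for Alteration.

```python
# Minimal style transfer sketch (Gatys-style), for illustration only.
# It is NOT the production workflow used on Alteration; paths, sizes,
# and hyperparameters below are assumptions.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

def load_image(path, size=512):
    """Load an image and convert it to a float tensor batch in [0, 1]."""
    tfm = transforms.Compose([
        transforms.Resize((size, size)),
        transforms.ToTensor(),
    ])
    return tfm(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

def gram_matrix(feat):
    """Channel-wise correlations that summarize the 'style' of a feature map."""
    b, c, h, w = feat.shape
    flat = feat.view(b, c, h * w)
    return flat @ flat.transpose(1, 2) / (c * h * w)

# Frozen VGG19 feature extractor; we read activations at a few fixed layers.
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv layers whose Gram matrices define style
CONTENT_LAYER = 21                  # deeper conv layer that preserves content

def extract_features(x):
    """Run x through VGG19 and collect style and content activations."""
    style_feats, content_feat = [], None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            style_feats.append(x)
        if i == CONTENT_LAYER:
            content_feat = x
    return style_feats, content_feat

def stylize(content_path, style_path, steps=300, style_weight=1e6):
    """Optimize a copy of the content image toward the style reference."""
    content = load_image(content_path)   # e.g. a frame from the shoot (assumed name)
    style = load_image(style_path)        # e.g. the reference painting (assumed name)
    target = content.clone().requires_grad_(True)

    style_grams = [gram_matrix(f) for f in extract_features(style)[0]]
    _, content_ref = extract_features(content)
    optimizer = torch.optim.Adam([target], lr=0.02)

    for _ in range(steps):
        optimizer.zero_grad()
        style_feats, content_feat = extract_features(target)
        content_loss = F.mse_loss(content_feat, content_ref)
        style_loss = sum(
            F.mse_loss(gram_matrix(f), g)
            for f, g in zip(style_feats, style_grams)
        )
        (content_loss + style_weight * style_loss).backward()
        optimizer.step()
    return target.detach().clamp(0, 1)

# Hypothetical usage: restyle one frame with a painting as the style reference.
# result = stylize("frame_0001.png", "reference_painting.jpg")
```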

After going through the different possible rendering effects, we were finally ready to start the many iterations between FAIR’s Piotr Bojanowski, Saint George Studio, and myself. Saint George handled the compositing step with style transfer, and I worked with a fantastic team made up of Johann Roche, Geoffrey Pons, and James Senade.

Did your use of style transfer reinforce the film’s plot and themes?

JB: Yes. For the reference image, I chose the painting of Nadia, Alexandro’s girlfriend, that we can see on the set. I wanted to symbolize that Elsa, the AI, had replaced Nadia. To achieve this, Elsa digitized the environment through Nadia’s own emotions and perception (represented by the paintings).

As a work of art, the part of the film recreated by FAIR can be interpreted as a very visual and primitive form of dialogue between humans and machines—a dialogue about what it feels like to love and to suffer from a romantic relationship.

How has working with AI informed your thoughts on the subject?

JB: It’s fascinating to imagine machines reproducing human intelligence patterns. I’ve always been interested in neuroscience and how cognition works. Now, if a machine can think the way we do, does that make it intelligent, or does it mean that we’re predictable? What kind of mechanical patterns, what predetermined movements, are we actually made of?

To go further, if knowing is a mechanical process that humans and machines share, artificial intelligence may substantially improve our lives. It could answer a lot of our basic needs and desires—maybe even before we have them, which sounds both great and puzzling. But again, we have to think about what’s next. After intelligence, one day we’ll inevitably try to map emotions. How AI tries to understand emotions is one of the central themes of Alteration.

Where do you think VR will take us next?

JB: What I’ve learned working with Okio Studio is that you have to be technically knowledgeable and realistic with what you can achieve in a reasonable amount of time. VR is definitely a category of entertainment on its own that will probably take us on some incredible adventures thanks to technologies like EEG or volumetric capture. I also believe that a new form of immersive entertainment will emerge mixing VR, AR, actors, and force feedback. I would love to be there already.

I think the main barrier to having this already, aside from technology, is ourselves. We are unconsciously the architects of our own difficulties when it comes to writing stories in VR. We tend to think of VR experiences as narratives and create too many linear experiences. In other words, we build roads to bring people in, whereas future VR storytellers will be the architects of entire cities to explore—different universes full of life and stories that will develop in a thousand different ways.

What’s next for you? Any exciting projects in the works?

JB: We definitely want to see Alteration evolve towards an immersive entertainment installation. The experience will start with you seated in an armchair. Scientists will place a dermal bracelet on you and electrodes on your head, as if you were really undergoing the experiment carried out on Alexandro. So your own cerebral activity will affect the visual effects of the film in real time.

Release date: January 2018. Who’s in?

Thanks, Jérôme—we can’t wait to try the mixed reality installation!

In the meantime, step inside Alexandro and Elsa’s world on Rift and Gear VR today.

— The Oculus Team