Rendering Game of Thrones for the Oculus Rift

(Interview performed in January 2015 for Seekscale, a cloud rendering startup)

Game of Thrones is all the rage lately, and so is the Oculus Rift, so when Framestore and Relevent put up a show where fans could experience Westeros through the Oculus Rift, the buzz skyrocketed. As it turned out, a real-time render farm was involved in the system, so we reached out to Mike Woods from Framestore to understand what a real-time render farm for an Oculus Rift environment looks like, and how seamless rendering between Unity assets and TV show assets was achieved.

– First of all, we would like to confirm the setup: people could experience the Castle Black elevator ride up the Wall by wearing an Oculus Rift, in an environment re-created with Unity, with show footage and effects merged into the Unity-generated environment, and a stack of render nodes running in real time. Did we get that right?

Yes, pretty much. We took some distant environment assets from the actual show and used the show and set as reference, but we actually built Castle Black, the elevator, the Wall, the Wildlings, and the surrounding area entirely in Unity.

– When you went with the idea of merging some of the show assets and footage in the Unity-rendered universe, was it because no game engine is designed to leverage a real-time render farm to get a realism boost?

Kind of. We’ve found game engines to be way more powerful beasts than some may suspect. We didn’t have the limitations of having to target domestic hardware, or fit 40+ hours of gameplay onto a disc. We used giant custom-built hardware to allow us to pack the environment with detail, use dynamic lighting, write our own shaders, and render the whole thing at 60 fps and 4K.

– About this merge of the Unity universe and TV show assets and footage, how do you even approach such a requirement conceptually? Could we consider each piece of footage and its share of the render farm as a “mini game engine” for a particular portion of the universe? How do you make sure the transition with Unity-generated content is seamless?

Good question. We take it on a case-by-case, project-by-project basis. We have developed our own technique for importing both rendered and video assets into the engine, so that we can blur the line between what you see and what the engine actually has to render. Video/alembic playback is far cheaper in terms of performance for some assets, particularly moving ones.

– How do you choose what is rendered via the game engine and what is brought in as pre-rendered footage?

As above. Case by case, and the visual target required.

– Do you focus only on realism? Is it even possible to be artistic in real time?

Absolutely. But we have a mini obsession here at Framestore with absolute photorealism. We love stylised animation and look development, but the hunger to create a true holodeck experience is also massively exciting for us. The thought of transferring the skillset that allowed us to create a film like Gravity, which was 95% photorealistic CGI, to VR is a holy grail. We are ideally positioned to be the place that can take you somewhere you are not. And that dwarfs entertainment as a life-changing proposition.

– In a recent interview you mentioned custom servers. I guess low latency was a core requirement. How custom were those servers?

Very custom. Each experience/elevator had a bespoke machine build, with a power supply and graphics card combination that was actually quite tricky to get right (i.e., a reliable, never-breaking, never-overheating setup that could run 16 hours a day without a break). And then add to that the entire show-control openFrameworks setup to drive the rumble packs and wind machines in perfect sync with each user’s experience. It’s quite an unprecedented ask.
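Framestore hasn’t published its show-control code, but to illustrate the kind of synchronisation the setup implies, here is a minimal openFrameworks-style sketch that fires wind and rumble cues as the ride progresses. It assumes the effect hardware listens for OSC messages via the ofxOsc addon; the host, port, OSC addresses, and cue times are all invented for the example.

```cpp
// Hypothetical sketch, not Framestore's code: an openFrameworks app that
// fires wind-machine and rumble cues at fixed points in the elevator ride.
#include "ofMain.h"
#include "ofxOsc.h"
#include <string>
#include <vector>

struct Cue {
    float timeSec;        // seconds since the ride started
    std::string address;  // OSC address of the effect controller (assumed)
    float level;          // effect intensity, 0..1
    bool fired;           // has this cue already been sent?
};

class ShowControlApp : public ofBaseApp {
public:
    void setup() override {
        // Assumed address/port of the effects rig.
        sender.setup("192.168.1.50", 9000);
        cues = {
            {2.0f,  "/wind/level",   0.3f, false},  // gentle breeze as the lift starts
            {10.0f, "/rumble/level", 0.8f, false},  // cage judder partway up
            {18.0f, "/wind/level",   1.0f, false}   // full blast at the top of the Wall
        };
        rideStart = ofGetElapsedTimef();
    }

    void update() override {
        float t = ofGetElapsedTimef() - rideStart;
        for (auto& cue : cues) {
            if (!cue.fired && t >= cue.timeSec) {
                ofxOscMessage m;
                m.setAddress(cue.address);
                m.addFloatArg(cue.level);
                sender.sendMessage(m, false);  // send the cue to the effects rig
                cue.fired = true;
            }
        }
    }

private:
    ofxOscSender sender;
    std::vector<Cue> cues;
    float rideStart = 0;
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ShowControlApp());
}
```

In the real installation the cues would presumably be keyed to the rider’s actual position in the virtual elevator rather than a wall-clock timer, and the transport between the render PC and the effects rig is not something Framestore has described.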

– The render farm rendered frames at 60FPS, in 4K. Why those particular specs? Is there some kind of “realism threshold”, or is it simply the max quality currently achievable in real time?

Well, Unity and Titan Black graphics cards were our render weapons of choice. “Realism” for VR is ultimately low latency. You can make something utterly photoreal completely useless if it can’t perform at 60 fps+ with sub-20 ms latency. That’s a balance we always need to strike. It’s much better to have a smooth 360 experience than to have a fire effect look slightly more believable. We have to really concentrate on the balance between performance and visual optimization on every project.
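To put those numbers side by side: at 60 fps each frame gets roughly 16.7 ms of simulation and render time, which already consumes most of a 20 ms motion-to-photon budget. A back-of-the-envelope check (illustrative only; the scan-out allowance is an assumption, not a Framestore figure):

```cpp
// Rough VR frame-budget check (illustrative, not Framestore tooling).
#include <cstdio>

int main() {
    const double targetFps       = 60.0;  // minimum frame rate quoted in the interview
    const double latencyBudgetMs = 20.0;  // motion-to-photon ceiling quoted in the interview
    const double scanOutMs       = 3.0;   // assumed allowance for tracking + display scan-out

    const double frameTimeMs = 1000.0 / targetFps;                 // ~16.7 ms per frame
    const double headroomMs  = latencyBudgetMs - frameTimeMs - scanOutMs;

    std::printf("Frame time %.1f ms, headroom under the %.0f ms budget: %.1f ms\n",
                frameTimeMs, latencyBudgetMs, headroomMs);
    return 0;
}
```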

– How big was the render farm? A couple of years from now, could we see hardcore fans buying an Oculus and four or five PS4s to enjoy a similar setup?

“Merely” a very expensive, custom-built PC housing a Titan Black per experience/elevator. But it’s changed hugely already in just the last 6 months. We have NVidia 980s now, for instance.

– Do you see the Oculus Rift impacting the way TV shows are consumed in the near future?

Potentially, yes. It’s difficult to know what the market will be. Too many people are obsessed with shoehorning existing media ideologies into it. New genres and methodologies need time to develop. These will sit alongside TV, cinema, social media, gaming, and so on.

We’re not interested in shoehorning. This is year zero for us. We want to explore what people find most gratifying in 360. It might be that linear storytelling doesn’t work at all.
