RENZO REYES  
COMPOSITING + ANIMATION


Pokémon AR/VR



Role: Unity Artist, Compositor
Software: Unity, Nuke
Created at We Are Royale


In late 2017, Royale decided to take things beyond the usual :15 spot and explore the possibilities of AR and VR. The Pokémon brand was the perfect candidate, so we crafted a full digital experience based on one of the latest animated spots we had produced. That’s how the Pokémon Augmented Experience app was born.



This is the original :15 second animated spot that inspired the AR/VR experience.


---------------------------

AR Experience


Prior to this project, I had been playing around with and learning Unity on my own time, so when this came along I was ready to take on the challenge and gladly joined the team as a Unity Artist for the AR portion.


The AR experience took full advantage of the trading cards. We used the cards to bring Pokémon characters to life and have them battle each other right in front of you! Each card worked as a trigger detected by the Vuforia plugin inside Unity. We worked alongside a Unity developer who handled the app build and kept things moving smoothly behind the scenes.

My assignment was more focused on the artistic side. Since my background is in compositing + animation, this was the perfect task for me. I worked with our Maya animation team to bring the models and animations made in Maya into Unity and, of course, make them look good! That included textures, lighting, FX and arranging all the animation takes for the two character states: Idle, which plays when a Pokémon first appears on screen (its card is detected), and Activation (ready to battle), which kicks in if another card is detected nearby. We always tried to keep things optimized for mobile devices. It was definitely a learning experience!
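For the curious, here is a rough Python sketch of that state logic, purely to illustrate the idea. It is not the actual Unity/Vuforia code from the app; the character names, positions and distance threshold are placeholders.

    # Plain-Python illustration of the card behavior (not the real Unity/Vuforia
    # code): a detected card spawns its Pokémon in "idle", and two cards close
    # enough to each other flip both Pokémon to "activation" (ready to battle).
    import math

    ACTIVATION_DISTANCE = 0.3  # placeholder threshold, in tracker units

    class TrackedCard:
        def __init__(self, pokemon, position):
            self.pokemon = pokemon
            self.position = position  # (x, y, z) reported by the card tracker
            self.state = "idle"       # state the character starts in when detected

    def update_states(cards):
        """Switch any pair of nearby cards to the activation state."""
        for card in cards:
            card.state = "idle"
        for i, a in enumerate(cards):
            for b in cards[i + 1:]:
                if math.dist(a.position, b.position) < ACTIVATION_DISTANCE:
                    a.state = b.state = "activation"

    cards = [TrackedCard("Solgaleo", (0.0, 0.0, 0.0)),
             TrackedCard("Lunala", (0.2, 0.0, 0.1))]
    update_states(cards)
    print([(c.pokemon, c.state) for c in cards])  # both report "activation"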



Another aspect of the AR experience involved using the poster key art to bring the characters and worlds to life. Think of it as a window into a different dimension. The poster was based on the :15 spot and featured Solgaleo and Lunala. Knowing what was ahead, I created a few Unity scenes as proof of concept. These scenes allowed us to determine which assets could be repurposed from the original :15 spot and which would need to be optimized for the mobile app.



An early test showing how the poster artwork triggers the Lunala character along with a very rough environment around it. This was also used to judge the accuracy of the AR tracker.




Work-in-progress capture of the AR poster in action (2x speed). No character animation yet (just an idle-state loop), but it does show some texture, lighting and atmosphere work.



Close-to-final version. Idle and active-state animations for both characters have been added. Transition and idle FX around the poster frame are almost final. This version also shows a matte painting that was added as a background.



Quick snapshot of the Unity scene file.



Screen capture of the final Unity scene (looped). In the actual app, only a section of this was visible because of the picture frame. Since I was not working inside a typical compositing package like Nuke or AE, I had to come up with ways to make the scene look rich without hurting performance by adding post effects. My solution was to color correct the actual textures in Photoshop to give the illusion of a very rich and colorful world. Adding a few rim lights in the scene also helped.
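I did that grade by hand in Photoshop, but a scripted pass would look something like the Python sketch below (using Pillow). The folder names and grade values are placeholders, not settings from the actual project.

    # Batch version of the idea described above: bake a simple grade (saturation
    # and contrast lift) into the textures offline so the app needs no runtime
    # post effects. Alpha channels are ignored in this simplified sketch.
    import glob
    import os
    from PIL import Image, ImageEnhance

    def grade_textures(src_dir="textures_raw", dst_dir="textures_graded",
                       saturation=1.25, contrast=1.1):
        os.makedirs(dst_dir, exist_ok=True)
        for path in glob.glob(os.path.join(src_dir, "*.png")):
            img = Image.open(path).convert("RGB")
            img = ImageEnhance.Color(img).enhance(saturation)   # richer colors
            img = ImageEnhance.Contrast(img).enhance(contrast)  # deeper shadows
            img.save(os.path.join(dst_dir, os.path.basename(path)))

    grade_textures()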



This is how the final product looked through a mobile device. The poster would trigger the AR experience, letting the user look into another dimension. Even turning the camera a few degrees would reveal new parts of the world thanks to the parallax in the scene.
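That parallax is just basic perspective at work: for a small camera move, nearby objects shift on screen far more than distant ones, which is what makes the frame feel like a real window. A tiny Python illustration, with arbitrary numbers rather than values from the scene:

    # Screen-space shift is roughly focal_length * camera_offset / depth, so a
    # small sideways move displaces near objects a lot and far objects barely.
    def screen_shift(camera_offset, depth, focal_length=1.0):
        return focal_length * camera_offset / depth

    for depth in (0.5, 2.0, 10.0):  # meters from the camera (placeholders)
        print(f"depth {depth:>4} m -> shift {screen_shift(0.1, depth):.3f}")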


--------------------------

VR Experience

Last but not least, we decided to include a stereo 360 VR experience. The initial idea was to build it entirely in Unity, since having it play in real time inside a Unity player would allow us to craft a fully interactive experience.



Proof of concept showing a rough Pokémon scene in full 360 goodness. This was a quick screen capture of our Unity scene. Unfortunately this test does not show the camera walking around the scene, even though it was set up so the camera could walk (or run) around and collide with objects.


Unfortunately, due to time constraints we decided to create a prerendered stereo 360 VR movie instead. That meant using our usual tools: Maya for 3D and Nuke for compositing and finishing. We rendered full-blown EXR sequences for both eyes (L/R) and composited them in stereo inside Nuke. The good thing about doing a prerendered 360 movie is that we could use the techniques we would typically employ in a regular commercial pipeline.
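For a rough idea of what that setup looks like, here is a minimal Nuke Python sketch of a stereo branch. It assumes the project views are already configured as left/right; the file paths, frame range and node layout are placeholders rather than the actual comp.

    # Minimal stereo branch in Nuke: one Read per eye, joined into the project's
    # left/right views so everything downstream runs per view.
    import nuke

    def build_stereo_branch(left_seq, right_seq, first=1, last=450):
        left = nuke.nodes.Read(file=left_seq, first=first, last=last)
        right = nuke.nodes.Read(file=right_seq, first=first, last=last)

        # JoinViews feeds each eye's render into the matching view.
        join = nuke.nodes.JoinViews(inputs=[left, right])

        # %V in the path writes a separate sequence per view.
        return nuke.nodes.Write(file="comp/pokemon_vr_%V.####.exr", inputs=[join])

    build_stereo_branch("renders/pokemon_vr_left.####.exr",
                        "renders/pokemon_vr_right.####.exr")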


Final stereo 360 VR experience we created using prerendered content. This movie would be viewable inside the app using any Cardboard-style viewer. (Final movie coming soon!)
