
Client

Verizon

Year

2019

Verizon 5G Stadium

To showcase the future of 5G and its impact on the NFL, we partnered with Momentum Worldwide & Verizon to create a unique interactive activation in Miami's Bayfront Park for Super Bowl LIV. The activation synced dome-projected content with a Unity app running on fifty 5G phones within the space.

Take a look at the complete case study here.

<Responsibilities />

  • Prototyping for digital/real-world correlation

  • Off-site testing

  • Exploring multiple AR approaches and deciding on the best one for the project

  • Overall Unity app structure

  • Unity animations, interactions, and transitions

  • Unity optimization

<Tech />

  • Node.js

  • Unity3D

  • Git

  • C#

<Client />

Verizon

<Prototyping />

The challenging environment meant the AR tracking had to be as precise as possible, so we built multiple prototypes in the early stages of the project to make sure our final approach was bulletproof and the tracking stayed on point.

One of the early considerations was using AR markers embedded in the content, but after prototyping it, we realized that the warp and deformation of the projected content would interfere with the perspective of the AR content, breaking the experience for seats closer to the screen.

The main comparison to test was 3DOF vs. 6DOF tracking, and for the prototype we tried three different approaches. 6DOF + gyroscope data worked while the projection showed static content, but the motion of "pick up your phone" and the darkness of the dome caused the content to drift heavily. 3DOF + gyroscope seemed like the best approach, but the gyroscope data was too noisy and unstable, even after filtering. We ended up going a simpler route, using only 3DOF plus compass data to refine the AR tracking and give the app a true north that we could monitor and tweak from the backend.
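To illustrate that final approach, here is a minimal Unity sketch of combining 3DOF rotation with a compass-derived true-north offset. It is an assumption-laden sketch, not the production code: rotation is read from the device gyro attitude rather than ARCore's API, and `backendYawOffset` is a hypothetical name for the correction tweaked from the backend.

```csharp
using UnityEngine;

// Minimal sketch: orient the AR rig from 3DOF rotation plus a
// compass-derived "true north" offset that can be corrected remotely.
public class CompassAlignedRig : MonoBehaviour
{
    // Hypothetical extra yaw correction pushed from the backend.
    public float backendYawOffset = 0f;

    float northOffset;   // yaw needed to align the device frame with true north
    bool calibrated;

    void Start()
    {
        Input.gyro.enabled = true;      // 3DOF rotation source
        Input.compass.enabled = true;   // heading source
        Input.location.Start();         // needed for trueHeading on some devices
    }

    void Update()
    {
        // Convert the right-handed gyro attitude into Unity's left-handed space.
        Quaternion q = Input.gyro.attitude;
        Quaternion deviceRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);

        // Calibrate once a heading reading is available: the difference between
        // the device's yaw and the compass heading is the offset to true north.
        if (!calibrated && Input.compass.headingAccuracy >= 0f)
        {
            northOffset = Input.compass.trueHeading - deviceRotation.eulerAngles.y;
            calibrated = true;
        }

        transform.rotation =
            Quaternion.Euler(0f, northOffset + backendYawOffset, 0f) * deviceRotation;
    }
}
```

The appeal of this route is that the only per-venue calibration is a single yaw offset, which can be nudged from the backend instead of relying on drift-prone visual tracking inside a dark dome.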

<Tech Approach />

As an integral part of the experience, we synced fifty mobile devices to our dome content, so all users would get the same AR experience at the same time. To sync the devices, we took in an LTC-encoded audio feed from the AV system and decoded the signal with a custom openFrameworks application. The decoded signal was passed through a custom-built Node server that communicated with each device over web sockets, achieving an extremely precise and stable end output.
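For illustration, here is a minimal Unity-side sketch of the last hop in that chain: a device receiving the decoded timecode over a web socket and keeping a local clock snapped to it. The server URL, the plain-text "seconds" message format, and the drift threshold are assumptions, not the production protocol, and it presumes a .NET 4.x scripting runtime where System.Net.WebSockets is available.

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Minimal sketch of the device side of the sync chain (assumed message format).
public class TimecodeSync : MonoBehaviour
{
    public string serverUrl = "ws://10.0.0.2:8080"; // hypothetical sync server
    public double driftThreshold = 0.05;            // resync if off by > 50 ms

    ClientWebSocket socket;
    double serverTimecode;   // last timecode received, in seconds
    float receivedAt;        // local time when it arrived
    public double LocalClock { get; private set; }

    async void Start()
    {
        socket = new ClientWebSocket();
        await socket.ConnectAsync(new Uri(serverUrl), CancellationToken.None);

        var buffer = new byte[256];
        while (socket.State == WebSocketState.Open)
        {
            // Awaits resume on Unity's main thread (default sync context),
            // so touching Unity APIs here is safe.
            var result = await socket.ReceiveAsync(
                new ArraySegment<byte>(buffer), CancellationToken.None);
            string msg = Encoding.UTF8.GetString(buffer, 0, result.Count);
            if (double.TryParse(msg, out double seconds))
            {
                serverTimecode = seconds;
                receivedAt = Time.realtimeSinceStartup;
            }
        }
    }

    void Update()
    {
        // Advance the local clock and snap it back whenever it drifts
        // too far from the decoded LTC timecode.
        LocalClock += Time.deltaTime;
        double expected = serverTimecode + (Time.realtimeSinceStartup - receivedAt);
        if (Math.Abs(LocalClock - expected) > driftThreshold)
            LocalClock = expected;
    }
}
```

Animations and AR transitions in the app can then be driven off `LocalClock`, so every one of the fifty devices reacts to the same moment in the dome content.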

We developed an impressive approach to AR given an extremely challenging environment. Low light, moving images, and a warped surface are everything you don't want for AR. To compensate, we used multiple inputs to orient each device: compass data and rotation data from ARCore (3DOF), as well as a CAD model of our dome, matching each real-world seat position to its corresponding location in our virtual environment.
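A minimal sketch of the seat-matching idea follows: the AR rig is placed at a position pulled from a CAD-derived lookup, while rotation comes from the compass-aligned 3DOF tracking described above. The seat IDs and coordinates here are purely illustrative, not the real dome layout.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch: anchor the AR rig at the viewer's seat position,
// taken from a lookup exported from the dome CAD model.
public class SeatAnchoredRig : MonoBehaviour
{
    // Seat positions from the CAD model (illustrative values only).
    static readonly Dictionary<string, Vector3> seatPositions =
        new Dictionary<string, Vector3>
        {
            { "A-01", new Vector3(-3.2f, 1.1f, 4.5f) },
            { "A-02", new Vector3(-2.6f, 1.1f, 4.5f) },
            { "B-01", new Vector3(-3.2f, 1.4f, 5.3f) },
        };

    public string assignedSeat = "A-01";

    void Start()
    {
        if (seatPositions.TryGetValue(assignedSeat, out Vector3 position))
            transform.position = position;  // rotation is driven by the 3DOF/compass rig
        else
            Debug.LogWarning($"Unknown seat id: {assignedSeat}");
    }
}
```

Because position comes from the seat map rather than from visual tracking, the dome's low light and moving imagery never get a chance to corrupt where the virtual camera sits; only orientation has to be tracked live.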
