Wrigley: Experience 5Gum

Robert Bader


Experience 5Gum is a multi-sensory virtual reality installation that lets you fly through virtual worlds and shoot thunderbolts from your fingertips.

The experience combines the Oculus Rift, the Kinect, 3D graphics, custom 7.1 sound design, two different forms of scented air, a harness, and a shipping container to create a sensory overload that culminates in the sensation of flight.

Users touch, smell, see, and hear their way through futuristic, abstract, and atmospheric worlds. Each world is based on one of 5Gum’s flavours—cool mint, tingling spearmint, sour & sweet, and bursting purple fruit mix—and users explore and interact with frozen, electric, desert, and liquid landscapes.

The experience is a full technological ecosystem: one server drives the content, including the high-definition VR graphics and 3D spatialised sound, and triggers the scent release, switches on the wind, and boosts the sub-woofer; another server manages the cameras and regulates the flow of users.

The Oculus Rift and the Kinect

The visuals in the virtual experience are built on OpenFrameworks, which we heavily modified, creating our own custom rendering pipeline to handle the mix of particle systems, custom shaders, gesture detection, and live interaction through a pair of virtual arms.
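The production pipeline itself is not public, but the particle-system side of such a renderer boils down to a per-frame update step. A minimal sketch, with an assumed position/velocity/lifetime representation rather than the project's actual data layout:

```cpp
#include <algorithm>
#include <vector>

struct Particle {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
    float life;         // remaining lifetime in seconds
};

// Advance every particle by dt seconds, then cull the expired ones.
// In a real pipeline this step would feed vertex buffers for the shaders.
void updateParticles(std::vector<Particle>& particles, float dt) {
    for (auto& p : particles) {
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
        p.life -= dt;
    }
    particles.erase(
        std::remove_if(particles.begin(), particles.end(),
                       [](const Particle& p) { return p.life <= 0.0f; }),
        particles.end());
}
```

At the scale of thousands of particles per world, a step like this is typically moved into a GLSL shader; the CPU version shows the logic only.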


Combining the Kinect camera with the Oculus Rift allowed us to add responsive hands into the live-rendered 3D experience. This adds a natural layer of interactivity. When the user moves their hands, the hands within the experience move as well. Users feel their way around the worlds of 5Gum, altering their environments through movement.
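Keeping the virtual hands aligned with the user's gaze means reconciling the Kinect's sensor-space joints with the Rift's head tracking. The real calibration between the two devices is more involved, but the core of it is a rotation of the hand position by the tracked head yaw; a sketch under that simplifying assumption:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate a Kinect-space hand position about the vertical axis by the
// Rift's head yaw (radians), so the rendered hands stay consistent
// with the direction the user is looking.
Vec3 rotateByHeadYaw(const Vec3& handSensor, float headYaw) {
    float c = std::cos(headYaw), s = std::sin(headYaw);
    return { c * handSensor.x + s * handSensor.z,
             handSensor.y,
             -s * handSensor.x + c * handSensor.z };
}
```

A full solution also handles pitch and roll and the fixed offset between the Kinect and the user's head, but the yaw case captures the idea.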

The user floats in zero gravity in the final world. The liquid effect uses code from Brainflight, a project that mapped the brain of a mouse, which was also used as the tornado for Find Your Way to Oz.

On the technical side, we used the languages C++ and GLSL (OpenGL Shading Language), with the libraries and frameworks OpenGL, OpenAL, Vorbis/OGG, and OpenFrameworks.

Inside the Container

3D Sound

Audio is spatialised in three-dimensional space for an immersive experience. We use Razer Tiamat headphones to support 7.1 audio, and custom filters process the audio according to the user's head orientation.
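The project's 7.1 filters are custom and not public, but the principle of re-panning sound by head orientation can be illustrated with the simpler stereo constant-power case: as the head turns, each source's gain is recomputed from its angle relative to where the head now points. An illustrative sketch, not the production filter chain:

```cpp
#include <cmath>
#include <utility>

// Returns {leftGain, rightGain} for a source at sourceAzimuth radians
// (0 = straight ahead, positive = listener's right) when the head has
// turned by headYaw radians. Constant-power panning keeps perceived
// loudness steady as the source moves across the stereo field.
std::pair<float, float> panGains(float sourceAzimuth, float headYaw) {
    const float halfPi = 1.57079632679f;
    // Angle of the source relative to the current head direction.
    float rel = sourceAzimuth - headYaw;
    // Clamp to the frontal arc and map onto a pan position in [0, 1].
    float p = std::fmin(std::fmax(rel, -halfPi), halfPi) / halfPi;
    float t = 0.5f * (p + 1.0f);
    return { std::cos(t * halfPi), std::sin(t * halfPi) };
}
```

Turning the head toward a source recentres it: the same maths extended to eight channels is what makes a 7.1 mix track head movement.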

Sound designed specifically for each stage of the experience is accompanied by a 1,000-watt subwoofer that provides vibrational feedback. The sound system creates low-frequency beats that intensify the experience by sending vibrations through the floor…and the user.


The Helmet

The helmet—which was designed to house the audio and visual technologies as well as scent delivery—went through numerous prototypes and trials. Artem Creative Solutions custom-built a helmet based around the Oculus Rift headset and DTS headphones.

Building a custom helmet to house both technologies created an even more immersive 360° 3D experience for the user, engaging the senses of smell and hearing as well as sight.


The Harness

The harness is similar in style to a parachute harness and is attached to the user as they enter the container. They are then hooked to a wire rig—of the kind often used in film stunts—which is designed to programmatically hoist users into the air midway through the experience.

Scent Generator

Scent is dispensed from a custom module that consists of scent containers, fans, and an Arduino controller, all in a fabricated housing within the helmet. The module is controlled over a serial connection from a Raspberry Pi.
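The actual commands exchanged between the Pi and the Arduino are not documented, but a serial link like this usually carries a simple line-based protocol. A sketch assuming a made-up "SCENT &lt;fan&gt; &lt;milliseconds&gt;" format, with the Pi framing the command and the Arduino parsing it:

```cpp
#include <cstdio>
#include <string>

// Pi side: frame a command asking the controller to run fan `fanIndex`
// for `durationMs` milliseconds. Newline-terminated so the Arduino can
// read it with a standard line-based loop.
std::string makeScentCommand(int fanIndex, int durationMs) {
    char buf[32];
    std::snprintf(buf, sizeof(buf), "SCENT %d %d\n", fanIndex, durationMs);
    return std::string(buf);
}

// Controller side: parse one received line back into its parameters.
// Returns false if the line is not a well-formed SCENT command.
bool parseScentCommand(const std::string& line, int& fanIndex, int& durationMs) {
    return std::sscanf(line.c_str(), "SCENT %d %d", &fanIndex, &durationMs) == 2;
}
```

On the real hardware the Pi would write the string to the serial device and the Arduino would switch the matching fan on for the requested duration.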

Social Sharing

Webcams set up in and around the installation capture the full user experience. A custom video edit is automatically sent to each user that shows that user entering the experience, gearing up, exploring new worlds, and finally their reaction when leaving the shipping container. The link is optimized to share across all social channels.

The experience is housed in a 20 ft by 8 ft shipping container divided into three separate rooms. The first room acts as a holding area for visitors: here, users are briefed and fitted into a custom-built harness suited to their height and build. The next room holds the experience itself, and the third houses the server.

The serious hardware work came down to stitching the systems together. Raspberry Pi and Arduino boards were used to control doors, lights, scents, wind, bass vibrations, and LEDs, as well as to send timed signals to the stunt experts operating a modified winch. Keeping all the systems connected, and creating a viable process for recovering from the hiccups that inevitably happen in any real-world installation, was a complex job.
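Timed signals like these are naturally expressed as a cue list driven by the experience clock. A minimal sketch of that idea; the cue names and timings here are illustrative assumptions, not the real show sequence:

```cpp
#include <string>
#include <vector>

struct Cue {
    float atSeconds;     // when to fire, on the experience clock
    std::string action;  // e.g. "wind_on", "hoist", "scent_release"
};

// Return the actions whose timestamp falls in (previousTime, currentTime],
// so each cue fires exactly once as the clock ticks past it, even if the
// control loop's frames are uneven.
std::vector<std::string> dueCues(const std::vector<Cue>& cues,
                                 float previousTime, float currentTime) {
    std::vector<std::string> fired;
    for (const auto& c : cues)
        if (c.atSeconds > previousTime && c.atSeconds <= currentTime)
            fired.push_back(c.action);
    return fired;
}
```

Using a half-open interval keyed to the previous poll is what lets a setup like this recover cleanly: after a stall, every missed cue still fires on the next pass.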

“It looks like a wild ride indeed!”

Digital Pulse


Oculus Rift – The Oculus Rift is the latest in virtual reality technology, recently purchased by Facebook. It uses custom tracking technology to provide ultra-low latency 360° head tracking, allowing you to seamlessly look around the virtual world just as you would in real life.

Kinect – The Kinect is a sophisticated motion capture device created by Microsoft for the Xbox. We wrote specialized code to track the user’s movements with the Kinect and render 3D equivalents in real time.

Raspberry Pi – A credit card-sized single-board computer developed in the UK by the Raspberry Pi Foundation with the intention of promoting the teaching of basic computer science in schools.

Arduino boards – A single-board microcontroller, intended to make the application of interactive objects or environments more accessible.

Tech Used:

C++ and GLSL (OpenGL Shading Language) with libraries and frameworks OpenGL, OpenAL, Vorbis/OGG, OpenFrameworks, Oculus Rift, Kinect, Raspberry Pi, and Arduino boards.

Hardest challenge to overcome:

Bringing users back to reality.


Download the UNIT9 VR Player App
