RIDERS

Creator and Rider come together in a shared experience that uses asymmetric gameplay and scale to create a feeling of connectedness and co-presence.

Key Spatial Principles

CONNECTED CO-PRESENCE

Experiment Duration

3 Weeks

Reading Time

3-4 Mins

THE GOAL

THE 
BACKGROUND

How can we use spatial computing to bring people together in new ways? This was the key question driving our first experiment.

We set ourselves the goal of designing an experience focused on creating moments of meaningful connection between physically remote people, bringing them together in a shared space in a way that felt authentic yet unique to mixed reality.

THE IDEA

There are two types of shared experiences: symmetric, where people interact in parallel, and asymmetric, where each person engages in a distinct experience. In our experiment, we explored asymmetric mechanics: one person is in Mixed Reality (MR) while the other is in Virtual Reality (VR) or on another surface, operating at a different scale and participating in different interactions within the same environment, yet still connected.

Two core themes drove our ideation: the notion of playing god, and gaining new perspectives on the real world by experimenting with scale and viewpoints.

In Riders, one person acts as Creator while the other is a Rider. The Creator draws a custom path for the Rider to follow along. The scene is set in an asteroid belt, far out in space.

THE 
EXECUTION

One of the most compelling aspects of spatial computing is the ability to share a virtual space with a friend that represents your actual surroundings, even if you are miles apart. In our experiment, sharing the room geometry of one person’s environment with the other, and using microphone access so they could talk, meant they could feel like they really were present together rather than just playing in an environment totally out of context.

We played with elements that reinforced asymmetric gameplay, like varied interactions and scaled perspectives; each player has a different POV, so the Creator is supersized as they lay out the route upon which the miniature Rider cruises. 


To mitigate the motion sickness that some people experience in headset experiences, we leveraged the “cockpit effect”: we created a windowed boundary around the Rider’s viewpoint and balanced the speed and scale of their movement.
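
Comfort windows of this kind are often driven by speed, narrowing the visible aperture as the Rider accelerates to reduce peripheral optic flow. A minimal sketch of that mapping; the function name and threshold values are illustrative, not taken from the actual project:

```python
def vignette_aperture(speed, min_speed=2.0, max_speed=10.0,
                      open_aperture=1.0, closed_aperture=0.45):
    """Map the Rider's speed to a vignette aperture (1.0 = fully open).

    Below min_speed the window stays fully open; above max_speed it
    narrows to closed_aperture, limiting peripheral motion cues that
    commonly trigger motion sickness. In between, interpolate linearly.
    """
    if speed <= min_speed:
        return open_aperture
    if speed >= max_speed:
        return closed_aperture
    t = (speed - min_speed) / (max_speed - min_speed)
    return open_aperture + t * (closed_aperture - open_aperture)
```

In practice the aperture value would feed a vignette shader each frame, and the thresholds would be tuned per experience.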

We opted for a ‘build-first’ approach: everything is a hypothesis until it is validated on device. We built the multiplayer functionality with Photon Fusion, a networking framework that optimizes communication between players, to bring our dual-experience concept to life. We used full-body tracking to give the virtual avatars the most life-like movements possible, and we upped the level of immersion by casting real-time shadows of the voxels and VFX onto the user’s physical environment for a more realistic effect.

Our experiment focused on Riders as a two-headset experience, but it could also evolve into a cross-platform game between VR and mobile, where the Creator joins and draws the track on their smartphone for maximum accessibility and asymmetric play.

KEY LEARNINGS

Embrace limitations and exploit them

We quickly realized that the room mesh geometry had its limitations: the mesh is more like a rigid cloth laid over your environment than a perfect 1:1 replica. Depending on the environment, the mesh could be nice and clean or heavily filled with jagged shapes.

We used a layer of fog to prevent the user from seeing too much contrast in the room mesh geometry. This softened the scene just the right amount while still maintaining a recognizable space for the participants.
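
A common way to achieve this softening is standard exponential distance fog, blending each fragment of the mesh toward a fog colour as distance from the camera grows. A minimal sketch of the idea; the function name and density value are illustrative:

```python
import math

def apply_distance_fog(mesh_color, fog_color, distance, density=0.15):
    """Blend a room-mesh fragment toward the fog colour with distance.

    Uses the standard exponential fog factor f = exp(-density * distance):
    f = 1 at the camera (no fog), f -> 0 far away (full fog). Distant,
    jagged parts of the scanned mesh fade out while nearby geometry
    stays recognizable.
    """
    f = math.exp(-density * distance)
    return tuple(f * m + (1.0 - f) * g for m, g in zip(mesh_color, fog_color))
```

In a real renderer this would live in a shader, but the blend is the same: tuning the density trades contrast reduction against how much of the room stays visible.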

/01
Co-presence doesn’t have to follow the rules of realism

A simple scale change in this prototype makes a massive difference in perception and expectations. We put players in a surreal situation and still achieved a sense of closeness and connection between Rider and Creator in a shared space.


The fidelity and quality of the room meshes still have a little way to go before we can believably transport users to a fully realistic environment. But the abstract feeling of the mesh actually turned out to provide some cool creative opportunities for how we represent the room in shared spatial experiences.

/02
New user behaviours and interactions are emerging within spatial experiences

During testing, we noticed that players not only communicated via the open microphone, but also engaged in interactions that correspond with being physically present - such as waving and shaking hands.

The spatial nature of the experience allowed players to feel like they really were sharing an environment with each other, with the level of connection going beyond that conventionally associated with digital experiences.

/03
A differentiated workflow is needed to build effectively for users at different scales

Introducing players at two vastly different scales presents UX and technical challenges, particularly because Photon Fusion, which we used to develop the multiplayer aspect of the game, is designed to transfer 1:1 representations of objects. We developed a system that worked around this, scaling the player properties before they were sent to and from each headset in order to achieve the scales we wanted.
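
The write-up doesn’t detail the exact mechanism, but the core idea is to normalize each player’s transform to a shared 1:1 “network scale” before sending, then rescale on receipt. A sketch in Python (the project itself would be Unity/C#); the names and scale values are hypothetical:

```python
CREATOR_SCALE = 10.0   # Creator is supersized relative to the shared scene
RIDER_SCALE = 0.1      # Rider is miniature relative to the shared scene

def to_network(position, local_scale):
    """Convert a player's local-space position to the shared 1:1 network space."""
    return tuple(p / local_scale for p in position)

def from_network(position, local_scale):
    """Convert a network-space position into a player's local space."""
    return tuple(p * local_scale for p in position)

# Example: the Creator's hand position, sent over the network and
# reconstructed at the Rider's miniature scale.
creator_hand = (12.0, 3.0, -4.0)
net = to_network(creator_hand, CREATOR_SCALE)
rider_view = from_network(net, RIDER_SCALE)
```

Because both players agree on the network space, each headset only needs its own scale factor to place the remote player correctly.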

/04

WHY THIS
MATTERS

In an age dominated by technology, it's easy to get swept away by the allure of innovation and progress. Yet, amidst the whirlwind of advancements, it's crucial to remember a fundamental truth: technology is nothing without human experience.

We believe co-presence will become one of the key differentiating factors of technology in the next decade. Beyond gaming, co-presence will open up opportunities to elevate and improve shared experiences, be it entertainment or utility. There’s still a lot to discover, but it’s important to start understanding the foundations of this new language so we can unlock its potential and evolve with it.

LET’S CHAT