STRINGS

Users place virtual strings in their environment to generate procedural music that adapts to their space and their interactions.

Key Spatial Principle

CONTEXTUAL ENVIRONMENTS

Experiment Duration

8 Weeks

Reading Time

3-4 Mins

THE GOAL

THE BACKGROUND

How can we use spatial computing to transform our environment into a canvas for creation? Our curiosity drove us to conduct our next experiment.

We were keen to explore how to use the environment within spatial experiences for something other than gaming (which we had already touched upon in a previous experiment). This felt like the perfect opportunity to see how far we could push contextual spaces of a different kind.

THE IDEA

Instead of a visible reaction, what about an audible one? We set out to create a prototype that’s all about environmentally impacted audio. Strings acts as a mixed reality musical instrument that users can place and play wherever they like in their space. Dropped into the real-world environment, groups of virtual strings vary in tonal value depending on their physical placement - so users can position them in different areas of their room to generate different types of sound.

THE EXECUTION

We designed this experiment for Meta’s Quest headsets as well as Apple Vision Pro - not only were we keen to develop one of our first apps for the Vision Pro, but this approach also served as a useful exercise in understanding the capabilities of each headset and how we could approach parallel workstreams.

We used the room mesh to allow users to anchor the virtual strings to their environment, and hand gestures to let them pluck the strings and generate sounds. The audio is procedurally generated based on the length of the strings themselves, as well as how the user chooses to interact with them - a light pluck generates a short note, while pinching and pulling the string back further before letting go generates a larger audible reaction with a more varied note range. We also introduced an open palm gesture that lets the strings be played more like a theremin - controlled without physical contact.
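The mapping described above could be sketched roughly like this - a simplified, hypothetical model in Python, where the scaling constants and gesture thresholds are invented for illustration rather than taken from the prototype:

```python
import math

def string_frequency(length_m: float, base_length_m: float = 1.0,
                     base_freq_hz: float = 220.0) -> float:
    """Fundamental pitch scales inversely with string length,
    as with a real string at fixed tension: halve the length,
    double the frequency."""
    return base_freq_hz * (base_length_m / length_m)

def pluck_response(pull_distance_m: float, max_pull_m: float = 0.3):
    """Map how far the string was pulled back before release to a
    note duration and an overtone count (a proxy for note range):
    a light pluck gives a short, simple note; a hard pull gives a
    longer, richer one."""
    strength = min(pull_distance_m / max_pull_m, 1.0)
    duration_s = 0.2 + 1.8 * strength       # light pluck -> short note
    overtones = 1 + int(strength * 6)       # harder pull -> wider range
    return duration_s, overtones

# A half-length string sounds an octave above the base pitch:
print(string_frequency(0.5))   # 440.0
```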

It was crucial that our mixed reality instrument behaved as users would expect, to make the experience as realistic as possible - including making sure it sounded consistent and satisfying throughout. For instance, as with any stringed instrument, it only makes a sound once a string is plucked and released. Our team learnt a lot about music theory in the process. Once again, we used Bezi for early prototyping - even though it doesn’t have specific functionality for integrating sound, it did help us validate that we needed to keep our string interactions close to real-world expectations, e.g. pluck, pull, grab. We also created a separate application to test what kind of audio reactions we could generate from our input parameters, in order to get it just right.

Applying spatially mapped VFX to the experience allowed us to visualise the audio outputs as users interacted, with rippling currents generated from each interaction. These ripples are mapped to the room mesh, revealing a user’s room geometry in a fun and immersive way. It also worked as a nice counter to the physical environment impacting the audio - in turn, the mixed reality audio visualisation transformed the appearance of the physical environment.
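As an illustration, a ripple expanding across the room mesh from the pluck point might be modelled per vertex like this - a hypothetical sketch, where the speed, decay and wavelength values are assumptions rather than values from the prototype:

```python
import math

def ripple_amplitude(vertex, origin, t,
                     speed=2.0, decay=3.0, wavelength=0.25):
    """Displacement of one room-mesh vertex at time t, for a ring
    expanding outward from the pluck point and fading as it travels."""
    dist = math.dist(vertex, origin)   # metres from the pluck point
    ring = speed * t - dist            # how far past this vertex the wavefront is
    if ring < 0:
        return 0.0                     # the wavefront hasn't reached this vertex yet
    return math.exp(-decay * ring) * math.sin(2 * math.pi * ring / wavelength)
```

Evaluating this over every vertex each frame is what makes the ripple trace out the room geometry: vertices at equal distances from the pluck point light up together, so walls, furniture and corners become visible in the wave.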

We loved the idea that we could use the size and complexity of the room mesh to impact the resonance of the tones as well - for instance, a long hallway would generate more echo and reverb compared to a small, cube-shaped room like a living room. We were able to apply this to the Quest build, although the Apple Vision Pro (at the time of prototyping) did not support this feature.
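A standard acoustic rule of thumb that could drive this kind of behaviour is Sabine's equation, fed with the volume and surface area estimated from the room mesh. The absorption coefficient below is an assumed placeholder, not a value from the prototype:

```python
def reverb_time_rt60(volume_m3: float, surface_area_m2: float,
                     absorption_coeff: float = 0.2) -> float:
    """Sabine's equation: RT60 = 0.161 * V / (S * a), the time in
    seconds for a sound to decay by 60 dB. Larger, less absorbent
    spaces ring for longer."""
    return 0.161 * volume_m3 / (surface_area_m2 * absorption_coeff)

# A long hallway (2.5m x 2.5m x 20m) vs a small cube-shaped room (3m x 3m x 3m):
hallway = reverb_time_rt60(2.5 * 2.5 * 20, 2 * (2.5 * 2.5 + 2.5 * 20 + 2.5 * 20))
small_room = reverb_time_rt60(3 ** 3, 6 * 3 ** 2)
```

With these dimensions the hallway comes out with a noticeably longer decay than the cube-shaped room, matching the echo-and-reverb behaviour described above.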

KEY LEARNINGS

Understanding the nuances between headsets will lead to better design for cross-platform spatial experiences

As mentioned, we created this prototype for use across Meta Quest headsets and the Apple Vision Pro. While both are excellent conduits for spatial experiences, the headsets vary in functionality and tend to do similar things in different ways - for instance, each has its own way of tracking the room mesh, or handling particles and reflections. We made sure to spend time getting familiar with how each headset works. Having this knowledge will allow us to streamline the design process even further in future cross-platform projects.

Mastering how to use space is key to creating a satisfying user experience

There are many different ways to use a physical space within an experience when it comes to spatial computing, and this time we chose to explore how it can directly influence an audio output. As creators, we’re still in the early stages of understanding how physical space can influence digital interaction design and user experience, testing out ideas via the headset and analysing the outcome. But taking this approach - i.e. transforming a space into a canvas for creation, where the output differs depending on the context - is a powerful option. It allows users to explore their space in a new way, on their own terms, with satisfying interactions.

Considering both realities and how they affect each other opens up new ways to design

Using the physical environment to impact a virtual experience is not uncommon. But working with spatial computing, we now have the power to create dual impact - with both realities, physical and virtual, able to intertwine and influence each other. In Strings, the contextual placement of the strings impacts the audio, and that audio in turn generates mixed reality visuals that transform the user’s space. Theoretically, those visuals could trigger other effects and interactions as the experience progresses - and so the chain continues. From a design perspective, we can have a lot of fun with this. The possibilities for mutually impactful, contextual interplay between realities are nearly endless.


WHY THIS
MATTERS

Understanding how our experience of sound can be transformed through spatial computing is incredibly exciting. The new mechanisms we’ve developed could evolve into a whole new way of creating music or unlocking other forms of creativity, all linked to an individual’s space. This idea of physical space as a canvas for creation is powerful - we can’t wait to explore other opportunities across the full range of the creative spectrum.
