What’s Changed for Game Sound Design with VR?

Matthew Wearing the HTC Vive

Back in my game-startup days at a venture called "SolidState, Inc." in the mid-'90s, I composed the music and designed the sounds for a game called "Dark Fiber." (Oh, and I wrote a lot of the gameplay engine code, including a movie scene navigator, a 360-degree scene view, a Doom-like BSP engine, and a game level designer.) I had a blast wearing so many hats and reveled in the constant discovery of new abilities as a 20-something Gen X-er.

The culmination of the game was a 60-second sequence showing the destruction of the skyscraper that housed the antagonist's business. I pulled together a ton of sound effects from the General 6000 series and an early East West sound effects library, and I had also been recording my own sounds to use in the game. The backing music had been recorded the week before and was ready to go. I spent a Friday evening walking through all the cue points for the various sounds in the closing sequence and assembling a cue list, which two of my co-workers worked through over the course of the weekend. By Monday morning it had all come together, and the sequence was deemed "commercial quality" as we watched it.

It was awesome.

Sound Design for VR at A3E

So this year, I decided to attend an "Essential Game Audio Skills" class at NAMM, hosted by A3E, to check in on the latest developments. Interestingly, not a whole lot of the core process has changed. The tools have improved a lot, and you can now mix a practically unlimited number of tracks on a stock PC or Mac (in the '90s I was running 24 tracks of ADAT for the game, and 6 tracks in Deck II running on a $1,000 AudioMedia card housed in a Mac Quadra 660AV).

The biggest change I see is that there are now game engines where the software development is mainly scripting for gameplay, and the media management is extremely well done. That makes sense, because the biggest surprise to me back in the '90s was just how much effort had to go into the media. I have been building projects in Unity recently and am very impressed with how easy it is to set up a VR project and create scenes that come alive.

So that brings us to VR sound design. 

What is added in VR Sound Design?

The big thing in VR is that sounds aren't limited to a simple left/right pan. Each component of a sound can now have its own 3D location, with its own reverb and other effects. For example, in a scene with a car, you might create a separate sound source for each wheel and one for the engine. As you move around the car, the placement of these sources solidifies the illusion of the VR space you have created. If the car were in an enclosed space, like a garage, you would also want to apply some kind of spatializer filter to each component to add a realistic sense of being inside the garage.
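To make that concrete, here is a minimal sketch of how per-component sources might be set up in Unity. The script name, clip fields, and wheel layout are my own illustration, not from any particular project; the key idea is that each source gets spatialBlend set to 1 so the engine positions it in 3D rather than panning it left/right.

```csharp
using UnityEngine;

// Illustrative sketch: one 3D AudioSource for the engine and one per wheel,
// so the sound image shifts naturally as the listener walks around the car.
public class CarAudioRig : MonoBehaviour
{
    public AudioClip engineLoop;   // hypothetical looping engine recording
    public AudioClip tireLoop;     // hypothetical looping tire/road noise
    public Transform[] wheels;     // the wheel transforms on the car model

    void Start()
    {
        // One positional source for the engine...
        AddSource(transform, engineLoop);

        // ...and one per wheel.
        foreach (Transform wheel in wheels)
            AddSource(wheel, tireLoop);
    }

    AudioSource AddSource(Transform parent, AudioClip clip)
    {
        AudioSource src = parent.gameObject.AddComponent<AudioSource>();
        src.clip = clip;
        src.loop = true;
        src.spatialBlend = 1.0f;                        // fully 3D, not a stereo pan
        src.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance falloff
        src.Play();
        return src;
    }
}
```

For the garage, Unity's built-in AudioReverbZone component (or a dedicated spatializer plugin) can be dropped into the scene so that every source inside its radius picks up the character of the room.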

I will dig more into the process of VR sound design in a future blog post. For now I am just cracking open the Wwise audio tool, which integrates nicely into Unity. I am looking forward to seeing what possibilities are opened up by this tool.
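As a small taste of that integration, a Wwise-driven sound in Unity usually comes down to posting an event on a GameObject. The sketch below assumes the Wwise Unity integration package is installed and that an event named "Play_Engine" (a hypothetical name) exists in the loaded SoundBanks.

```csharp
using UnityEngine;

// Illustrative sketch of triggering a Wwise event from a Unity script.
public class EngineSound : MonoBehaviour
{
    void Start()
    {
        // Posting the event ties playback to this GameObject,
        // so Wwise positions the sound in 3D automatically.
        AkSoundEngine.PostEvent("Play_Engine", gameObject);
    }
}
```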

Until then, thanks for reading!
