MindsEye
MindsEye is a third-person cinematic action game that follows protagonist Jacob Diaz as he tries to figure out what happened to him on his last mission with his "MindsEye" implant.
This was my first role in the games industry, and during my time there I was promoted from Audio Programmer to Senior Audio Programmer.
Senior Audio Programmer
As an Audio Programmer, my job was to implement systems that pass information to the sound engine. That could be anything from triggering sounds to providing audio-specific data, or general game data, all of which can be used to manipulate the sounds set up by our audio designers.
In a team of five programmers, we each owned specific systems within the game, so most of the time was spent implementing, improving, and bug-fixing within those audio-specific systems; we also had to integrate with other systems from around the game. This means that an audio programmer has to be a well-rounded game developer, able to go into any system and adapt it to get the output that audio needs.
My Systems
The systems that I owned, or had a major stake in developing, grew over time as we moved away from outsource developers. The following are some of the big ones:
Audio Propagation System (APS)
The Audio Propagation System (APS) is the cornerstone of realistic audio effects at Build A Rocket Boy. By creating a voxelized representation of the game world, the APS can perform many thousands of raycasts per frame.
Using this optimized raycasting, the system builds a picture of the environment the player is in and sends data to the audio engine to modulate sounds based on that environment.
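The real implementation is proprietary, but the reason a voxelized world makes thousands of raycasts per frame feasible is that a grid ray march visits only the voxels along the ray, with no triangle tests at all. Below is a minimal, self-contained sketch of that kind of traversal (in the style of Amanatides & Woo); the names and data layout are illustrative assumptions, not the studio's code.

```cpp
#include <cfloat>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical voxel grid: 1 = solid, 0 = air.
struct VoxelGrid {
    int sizeX, sizeY, sizeZ;
    float voxelSize;                 // world-space edge length of one voxel
    std::vector<uint8_t> solid;      // sizeX * sizeY * sizeZ occupancy flags

    bool IsSolid(int x, int y, int z) const {
        if (x < 0 || y < 0 || z < 0 || x >= sizeX || y >= sizeY || z >= sizeZ)
            return false;
        return solid[(z * sizeY + y) * sizeX + x] != 0;
    }
};

// Grid traversal: step voxel-to-voxel along the ray, far cheaper than mesh
// raycasts. Returns true and the hit distance if a solid voxel is reached.
bool RaycastVoxels(const VoxelGrid& g, const float origin[3], const float dir[3],
                   float maxDist, float& hitDist) {
    int   cell[3];
    int   step[3];
    float tMax[3], tDelta[3];
    for (int i = 0; i < 3; ++i) {
        cell[i] = (int)std::floor(origin[i] / g.voxelSize);
        step[i] = dir[i] > 0.f ? 1 : -1;
        // Distance along the ray to the first boundary crossing on this axis,
        // and the distance between subsequent crossings.
        float boundary = (cell[i] + (dir[i] > 0.f ? 1 : 0)) * g.voxelSize;
        tMax[i]   = dir[i] != 0.f ? (boundary - origin[i]) / dir[i] : FLT_MAX;
        tDelta[i] = dir[i] != 0.f ? g.voxelSize / std::fabs(dir[i]) : FLT_MAX;
    }
    float t = 0.f;
    while (t <= maxDist) {
        if (g.IsSolid(cell[0], cell[1], cell[2])) { hitDist = t; return true; }
        // Advance along whichever axis crosses a voxel boundary first.
        int axis = (tMax[0] < tMax[1])
                     ? (tMax[0] < tMax[2] ? 0 : 2)
                     : (tMax[1] < tMax[2] ? 1 : 2);
        cell[axis] += step[axis];
        t = tMax[axis];
        tMax[axis] += tDelta[axis];
    }
    return false;
}
```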
Most of what I contributed to this system was low-level optimizations and improvements that I can't go into too much detail about, but the APS is by far the most interesting system I worked on and is truly a massive part of the audio in MindsEye.
Audio Reflection System (ARS)
The Audio Reflection System (ARS) is a system I created from scratch, originally to add more detail to vehicles, as an early-reflection system we could use for any sound.
Made as a counterpart to the APS, the ARS uses our optimized raycasting system to find reflection points in the world and calculate how much each point would reflect sound back at the listener. This outputs a range of values that can then be used to modulate the source audio, resulting in a sound that appears to reflect off nearby objects.
The ARS has two sound modes that affect the reflection. A low-powered mode represents low-volume sounds such as vehicles: these sounds have a limited range but a more detailed calculation phase that yields more accurate reflection data. A high-powered mode covers sounds that travel over long distances, such as explosions and weapon fire: these have a much longer attenuation and model the speed of sound for delay effects.
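I can't share the actual calculation, but the shape of the output is easy to sketch: each reflection point yields a gain driven by path distance and surface properties, plus, in the high-powered mode, a delay derived from the speed of sound. The falloff curve and all names below are assumptions for illustration, not the shipped maths.

```cpp
#include <algorithm>

// Hypothetical per-reflection-point output, in the spirit of the ARS description.
struct ReflectionParams {
    float gain;      // 0..1, how strongly this point reflects sound to the listener
    float delaySec;  // total path delay; only modelled in the high-powered mode
};

constexpr float kSpeedOfSound = 343.f; // m/s at ~20C

ReflectionParams ComputeReflection(float sourceToPoint, float pointToListener,
                                   float surfaceReflectivity, // 0..1, material-based
                                   float maxRange,            // mode-dependent
                                   bool highPoweredMode) {
    const float path = sourceToPoint + pointToListener;
    ReflectionParams out{};
    // Simple linear falloff over the mode's range: low-powered sounds would use
    // a short maxRange, high-powered sounds (explosions, gunfire) a far longer one.
    const float falloff = std::clamp(1.f - path / maxRange, 0.f, 1.f);
    out.gain = surfaceReflectivity * falloff;
    // Only the high-powered mode models propagation delay for echo-like effects.
    out.delaySec = highPoweredMode ? path / kSpeedOfSound : 0.f;
    return out;
}
```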
The base of this feature was the ability to dynamically route audio from an in-game object to one or more reflection points. I developed a plugin for Wwise that allows this to be controlled at runtime from code while still giving designers creative control over how those sounds are modulated. Making a plugin within Wwise was daunting at first, but once I grasped the concepts of linking plugins to the game, it became much easier to test and to get a good-sounding output.
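The plugin itself is proprietary, but a rough stand-in using only the stock Wwise SDK would be game-defined auxiliary sends: code routes an emitter's audio towards "reflection" busses at runtime, while designers keep control of what happens on those busses in the authoring tool. The bus names and the cap of four sends below are hypothetical.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <cstdio>

// Route an emitter's audio towards a few "reflection" aux busses, with send
// levels taken from the ARS gains. Not the actual plugin, just an analogue.
void RouteToReflectionBusses(AkGameObjectID emitter,
                             const float* reflectionGains, int numPoints)
{
    AkAuxSendValue sends[4];
    const int count = numPoints < 4 ? numPoints : 4;
    for (int i = 0; i < count; ++i)
    {
        char busName[32];
        std::snprintf(busName, sizeof(busName), "Reflection_Bus_%d", i);
        sends[i].listenerID    = AK_INVALID_GAME_OBJECT; // apply for all listeners
        sends[i].auxBusID      = AK::SoundEngine::GetIDFromString(busName);
        sends[i].fControlValue = reflectionGains[i];     // 0..1 send level
    }
    AK::SoundEngine::SetGameObjectAuxSendValues(emitter, sends, count);
}
```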
Vehicle Audio
Vehicle audio is one of the systems I took over from the outsource developers. One of the main challenges for this system was the sheer number of mass vehicles in the game: we needed a system that could efficiently switch between a regular player/AI-controlled vehicle and a mass vehicle, and that could change LOD levels at any point.
Our vehicles are highly detailed, and a lot of assets are loaded while a vehicle is active. With this in mind, I needed to keep vehicles within the memory budget, so I implemented a system to track vehicles and enable/disable them (loading and unloading their data) when certain conditions were met. This ensured we weren't wasting resources on vehicles making noise that realistically could not be heard.
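The exact conditions are the studio's, but the core pattern can be sketched simply: rank vehicles by distance to the listener and keep only the closest few audible, unloading audio data for the rest. The names, budget, and ranking heuristic below are illustrative assumptions.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical per-vehicle audio handle.
struct VehicleAudio {
    float distanceToListenerSq;
    bool  audioEnabled = false;
    void EnableAudio()  { /* load banks / start loops */  audioEnabled = true;  }
    void DisableAudio() { /* stop loops / unload data */  audioEnabled = false; }
};

// Keep at most maxAudible vehicles enabled, and only those within earshot.
void UpdateVehicleAudioBudget(std::vector<VehicleAudio*>& vehicles,
                              size_t maxAudible, float maxAudibleDistSq) {
    std::sort(vehicles.begin(), vehicles.end(),
              [](const VehicleAudio* a, const VehicleAudio* b) {
                  return a->distanceToListenerSq < b->distanceToListenerSq;
              });
    for (size_t i = 0; i < vehicles.size(); ++i) {
        const bool shouldBeOn =
            i < maxAudible && vehicles[i]->distanceToListenerSq < maxAudibleDistSq;
        if (shouldBeOn && !vehicles[i]->audioEnabled)
            vehicles[i]->EnableAudio();
        else if (!shouldBeOn && vehicles[i]->audioEnabled)
            vehicles[i]->DisableAudio();
    }
}
```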
The vehicles team did a great job building an extensive system that recreates how vehicles work. For me, though, that meant a lot of data needed passing through to the audio engine to get a good sound. Working directly with the sound designers was crucial during this phase, to understand what data they needed and what data was available. Understanding the design tool is very helpful both for communicating ideas and for finding the data that best fits what the designer is trying to achieve.
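In Wwise terms, this handoff usually means pushing simulation values to the sound engine as RTPCs each frame, which designers then map onto pitch, volume, and crossfades in the authoring tool. The parameter names here are hypothetical examples, not the game's actual RTPC list.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Push per-frame vehicle simulation data to Wwise as RTPCs on the vehicle's
// game object; designers decide what the values actually do to the sound.
void PushVehicleAudioData(AkGameObjectID vehicle,
                          float engineRpm, float throttle, float wheelSlip) {
    AK::SoundEngine::SetRTPCValue("Engine_RPM", engineRpm, vehicle);
    AK::SoundEngine::SetRTPCValue("Throttle",   throttle,  vehicle);
    AK::SoundEngine::SetRTPCValue("Wheel_Slip", wheelSlip, vehicle);
}
```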
Another addition I was able to make to the vehicle audio was the drones. In my previous job I gained a lot of experience with air vehicles and what makes them sound the way they do. With this knowledge, I could advise the designers on how the sounds could be elevated and provide them with extra data from the game to achieve those sounds. My hobbyist experience with drones helped a lot in adding those extra layers of realism.
Music System
The music system had been in the game since long before I joined the company and had been worked on by a handful of engineers. I was assigned the music system and asked to rewrite it: as the game gained new features, the system had grown and grown, and gained many issues as a result.
We decided on a state-driven design so the system could easily switch between game modes and environments. Unreal Engine's StateTree is used a lot with MASS, but it was untouched in the audio team, so it was a first for us.
Once the system was implemented, we found it extremely easy to maintain and debug. The state tree lets you clearly see what state the system is in, so you can easily reason about why it is stuck in a state or heading to the wrong one. There were a lot of small classes implementing the tree's states and evaluators, but keeping them small and simple was for the best, as it makes what each one does obvious.
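I can't reproduce the StateTree assets here, but the underlying pattern is worth sketching: game code resolves one music state at a time and mirrors changes into a Wwise State Group, where designers author the musical transitions. The state and group names below are hypothetical.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Hypothetical music states; in practice these would come from the state tree.
enum class EMusicState { Exploration, Combat, Chase, Menu };

const char* ToWwiseState(EMusicState s) {
    switch (s) {
        case EMusicState::Combat: return "Combat";
        case EMusicState::Chase:  return "Chase";
        case EMusicState::Menu:   return "Menu";
        default:                  return "Exploration";
    }
}

// Mirror the resolved state into a Wwise State Group; the actual musical
// transition rules live in the Wwise project, authored by designers.
void UpdateMusicState(EMusicState newState, EMusicState& currentState) {
    if (newState == currentState) return; // only notify the engine on change
    AK::SoundEngine::SetState("Music_State", ToWwiseState(newState));
    currentState = newState;
}
```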
Other Systems I Had a Part In
- Props (user interaction, destructible objects, and foliage)
  - Lots of object management with a focus on performance
  - Integrating with other teams on swapping assets and handling destruction
  - Handling how instancing (BPPs & PLAs) is constructed, and automatically setting up prop data
- Ambient Vehicles (unmanned drones, mass vehicles)
  - Cars in the game at a low LOD still needed to contribute to ambience, but could be promoted at any point to a vehicle with more detailed audio
  - Drones filling the airspace around the map are dynamically allocated positional audio when close, and send density data for ambience when far
- UI Audio
  - Created a simple data-driven system that links UI events to audio events (sketched below)
  - Used some of the Wwise plugins described above to provide audio data back to the game and stimulate the UI
- Ambience volumes that allow a designer to trigger one-shots randomly within the space
- Nodes for the User-Generated-Content tools, giving players access to some of the systems we have access to in the editor
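As a taste of the UI hookup mentioned in the list, here is a minimal sketch of a data-driven event table, assuming a dedicated Wwise game object for the UI. All names are hypothetical, and the real system would be driven by engine data rather than a hand-filled map.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <map>
#include <string>

// Map UI event names to Wwise event names, so adding a new UI sound is a data
// change rather than a code change.
class UiAudioTable {
public:
    explicit UiAudioTable(AkGameObjectID uiObject) : m_uiObject(uiObject) {}

    void Register(std::string uiEvent, std::string wwiseEvent) {
        m_table.emplace(std::move(uiEvent), std::move(wwiseEvent));
    }

    // Called by the UI layer, e.g. OnUiEvent("Menu_Confirm").
    void OnUiEvent(const std::string& uiEvent) const {
        auto it = m_table.find(uiEvent);
        if (it != m_table.end())
            AK::SoundEngine::PostEvent(it->second.c_str(), m_uiObject);
    }

private:
    AkGameObjectID m_uiObject; // registered once at startup as the UI emitter
    std::map<std::string, std::string> m_table; // filled from data in practice
};
```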
Release Date: 10th June 2025
Platforms: Windows, PS5, Xbox Series X/S
Languages/Technology: C++, Unreal Engine, Wwise, Wwise Plugins