The audio production is split into two main levels: Show Room and Main Stage. Each level features different spatial audio elements, as well as different visuals and audio compositions, to fit its scene.
Show Room is a house-in-the-hills level that demonstrates sound occlusion, localisation, and some reverb methods.
The audio was produced in the DAW Logic Pro X. Some DI recordings of acoustic guitar were captured through a Focusrite 18i20 interface, and MIDI drums were programmed using Kontakt libraries to form a simple, catchy musical loop for the 'house band'.
Some traditional mixing/production techniques were used throughout the project, such as FX (reverb, delay, chorus) and signal processing (EQ, compression), to create the cleanest possible sound for the house band.
The environment is fairly small and has a nicely dampened reverb with a low RT60 response. With this in mind, I was able to add the FX mentioned above without worrying about artefacts affecting the quality of the sound localisation. GRA settings such as the Doppler effect are not the best thing to add when sound sources have FX applied, as they seem to warp in a way that sounds unrealistic.
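For reference, an RT60 figure like the one mentioned above can be estimated from a room impulse response using Schroeder backward integration. The following is a minimal sketch with numpy, using synthetic decaying noise rather than a measurement from the actual Show Room:

```python
import numpy as np

def estimate_rt60(ir, sr):
    """Estimate RT60 from an impulse response via Schroeder backward integration.

    Fits the decay slope between -5 dB and -25 dB on the energy decay
    curve, then extrapolates to the time taken to fall by 60 dB.
    """
    energy = np.cumsum(ir[::-1] ** 2)[::-1]         # energy remaining at each sample
    edc_db = 10 * np.log10(energy / energy[0])      # energy decay curve in dB

    t = np.arange(len(ir)) / sr
    i5 = np.argmax(edc_db <= -5)
    i25 = np.argmax(edc_db <= -25)
    slope, _ = np.polyfit(t[i5:i25], edc_db[i5:i25], 1)  # decay rate in dB per second
    return -60.0 / slope

# Synthetic decaying noise with a known RT60 of 0.3 s (-60 dB at t = 0.3)
np.random.seed(0)
sr = 16000
t = np.arange(sr) / sr
ir = np.random.randn(sr) * 10 ** (-3 * t / 0.3)
print(round(estimate_rt60(ir, sr), 2))  # recovers roughly 0.3
```

A real measurement would substitute a recorded impulse response (balloon pop, sine sweep deconvolution) for the synthetic noise.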
Some channels are summed into broader tracks for use within Unity. For example, all guitar parts are summed into one, and the rest generally match what can be seen in the Logic project's Sum folders. These stereo channels sound really good in the Show Room house, even with the effects.
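Summing stems like this amounts to a weighted mix of equal-length channels. A minimal Python sketch with numpy follows; the takes and gain values are hypothetical, not the project's actual levels:

```python
import numpy as np

def sum_stems(stems, gains=None):
    """Sum equal-length stems into one bus, with optional per-stem gain."""
    if gains is None:
        gains = [1.0] * len(stems)
    bus = sum(g * s for g, s in zip(gains, stems))
    peak = np.max(np.abs(bus))
    return bus / peak if peak > 1.0 else bus    # normalise only if the bus clips

# Three hypothetical guitar takes summed into one 'Guitars' track
t = np.linspace(0, 1, 1000)
takes = [0.5 * np.sin(2 * np.pi * f * t) for f in (110, 220, 330)]
guitars = sum_stems(takes, gains=[1.0, 0.8, 0.6])
print(round(float(np.max(np.abs(guitars))), 2))
```

In the DAW this happens on the bus itself, but the arithmetic is the same: gains set the balance, and the summed peak decides whether the bus needs pulling down.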
Main Stage is a far more complex orchestral composition. The level is designed to demonstrate better reverb methods and animation driven by FFT analysis. This means the Logic project needs more structure so that the transfer over to Unity is a smooth process.
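FFT-driven animation generally means mapping spectral band energy to visual parameters each frame, much like feeding the output of Unity's AudioSource.GetSpectrumData to scene objects. A minimal numpy sketch of the analysis side; the band edges are assumptions, not values from the project:

```python
import numpy as np

def band_levels(samples, sr, bands=((20, 250), (250, 2000), (2000, 8000))):
    """Normalised magnitude per frequency band for one frame of mono audio.

    The returned 0..1 levels can drive animation parameters (e.g. scale
    or emission intensity) once per frame.
    """
    window = np.hanning(len(samples))            # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(samples * window))
    freqs = np.fft.rfftfreq(len(samples), d=1 / sr)

    levels = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        levels.append(spectrum[mask].mean())
    peak = max(levels)
    return [lvl / peak for lvl in levels] if peak > 0 else levels

# A 440 Hz test tone: the mid band (250-2000 Hz) should dominate
sr = 44100
t = np.arange(1024) / sr
frame = np.sin(2 * np.pi * 440 * t)
print([round(v, 2) for v in band_levels(frame, sr)])
```

In-engine the same idea applies per audio frame, with the band levels smoothed over time so the animation does not flicker.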
Another folder system is used, with a naming convention that transfers cleanly over to the objects that the sound sources are attached to within the engine. This convention makes things a lot easier when referring to specific objects within a scene.
I wanted the audio to be individual mono sources this time, since in the real world we hear sources from specific locations, or specific points, within the surrounding environment. The best way to simulate this is with a single point that the audio comes from. Additionally, the audio sources need to be as clean as possible, because the reverberation and room characteristics are a core target. Most mixing and balancing of the sound stage is done physically, by moving the sources in the 3D array, or with the Unity parameters. Logic is therefore used mainly as a MIDI compositional tool to create sounds for in-game purposes.
Because the parts are MIDI, the audio can be altered in Logic and easily re-exported to replace previous work, which makes for a good workflow when editing compositions. Native Instruments Kontakt libraries were the main samples used, giving reasonable realism to the compositions. All FX were removed, and some articulations were edited.
Each channel was exported by splitting the stereo output into mono outputs and bouncing each true mono channel, producing mono sources with their corresponding names.
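The same stereo-to-mono split can be sketched outside Logic with Python's standard wave module. A minimal example follows; the filenames are hypothetical, not the project's actual stem names:

```python
import wave
import numpy as np

def split_stereo_to_mono(stereo_path, left_path, right_path):
    """Split a 16-bit stereo WAV into two true mono WAV files."""
    with wave.open(stereo_path, "rb") as src:
        assert src.getnchannels() == 2, "expected a stereo file"
        sr = src.getframerate()
        frames = np.frombuffer(src.readframes(src.getnframes()), dtype=np.int16)

    left, right = frames[0::2], frames[1::2]    # interleaved L/R samples

    for path, channel in ((left_path, left), (right_path, right)):
        with wave.open(path, "wb") as dst:
            dst.setnchannels(1)                 # true mono output
            dst.setsampwidth(2)                 # 16-bit
            dst.setframerate(sr)
            dst.writeframes(channel.tobytes())

# Demo: write a tiny stereo file, then split it into two mono stems
sr = 8000
left = (np.sin(2 * np.pi * 440 * np.arange(sr) / sr) * 3000).astype(np.int16)
right = np.zeros(sr, dtype=np.int16)
interleaved = np.empty(2 * sr, dtype=np.int16)
interleaved[0::2], interleaved[1::2] = left, right

with wave.open("stereo_demo.wav", "wb") as f:
    f.setnchannels(2)
    f.setsampwidth(2)
    f.setframerate(sr)
    f.writeframes(interleaved.tobytes())

split_stereo_to_mono("stereo_demo.wav", "Violin1_L.wav", "Violin1_R.wav")
```

Each resulting mono file can then be attached to its matching GameObject as a single-point source, in line with the naming convention described above.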