Warframe: Sound Design

Over the past four or five days, I’ve been designing the audio assets for my Warframe Sound Replacement project. Alongside the challenges I faced, I also improved my Pro Tools sessions and workflow by utilising groups and sync points.


When I begin making a sound effect in Pro Tools, I initially create four creation tracks (three audio tracks and one auxiliary track for things like reverb, doubling and pitch shifting, or grouped automation) and one record track, to which my sound is recorded once I am satisfied with the effect. This is a workflow proven by audio professionals such as René Coronado from Dallas Audio Post and the Tonebenders podcast (Coronado & Muirhead, 2015). Of course, this is merely a starting point; as each sound effect develops, it may need more or fewer tracks depending on its complexity or the number of variations of the effect I need.

As you can see below, my missile sound effect has eleven audio tracks and five auxiliary tracks, which make up my creation tracks, plus two more audio tracks as record tracks so that I could record the impacts and the flight of the missiles separately.


This is when grouping your tracks becomes very important, as leaving all of these tracks visible, along with tracks for other sound effects, will become a mess very quickly, and a nightmare to navigate efficiently. By grouping all of my creation tracks together, I am just left with my record tracks at the top of my session, and my creation tracks for one sound effect at a time below them.

You can also see the markers that I have set up, which correspond to different points in the video at which certain sounds in the effect are required. While you COULD manually drag each audio clip one at a time to align with these markers, using sync points and the align shortcut (shift+windows key on PC or shift+option on Mac) is a lot more efficient. Sync points allow you to set a point within your audio clip that the alignment will go by. This is particularly useful for audio clips that have a wind-up or leading sound before a transient, such as a creaky door slamming shut. After creating the sync point in your audio clip (ctrl+,), you simply select the marker (.<marker #>.), then hold the align shortcut as you click the audio clip with the sync point in it. It’s that easy! This has dramatically increased the rate at which I can work, and I wish I had learnt it earlier.

The Assets

Now that you’ve read through the boring parts, let’s go through a few sound effects that I have made for the Warframe project. I will be covering my work on the gunshots, the missile sounds, and the vortex sound. The rest of the sounds that I worked on (including the Moa climbing the crate, the Corpus falling from the Bastille, and the crate being pushed back by the Moa landing) are mainly edited versions of samples that Max and I recorded in the C24 sessions, and as such, nothing hugely creative needed to be done with them.

Note that these are preliminary effects, and are highly subject to change as the project develops. Max and I have already discussed some changes to be made in our mixing session yesterday (which I will talk about in a future post).

Missiles. As I have already alluded to in my workflow section, the missiles took quite a few layers to get right, and are still in the “work in progress” stage, but here is what they sound like at the moment.

[Image: Missile Flight Noise EQ]

To get the Missile Flight sound, I generated a section of pink noise as a background layer, which I then EQ’d to get the sound I was after. The second layer of the flight was a recording of a sparkler that I captured at the side of my house late at night (so there wasn’t any obtrusive background noise) with my Zoom H5 and its built-in condenser microphones, of which I used just the left channel.
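I used Pro Tools’ signal generator for the pink noise itself, but if you’re curious what’s happening under the hood, here’s a minimal Python sketch of one common way to approximate pink (1/f) noise by IIR-filtering white noise. The filter coefficients are Paul Kellet’s well-known approximation, not anything taken from my session:

```python
import numpy as np

def pink_noise(n_samples, seed=0):
    """Approximate pink (1/f) noise by IIR-filtering white noise,
    using Paul Kellet's "economy" filter approximation."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_samples)
    b0 = b1 = b2 = 0.0
    out = np.empty(n_samples)
    for i, w in enumerate(white):
        # Three leaky integrators at different time constants
        b0 = 0.99765 * b0 + w * 0.0990460
        b1 = 0.96300 * b1 + w * 0.2965164
        b2 = 0.57000 * b2 + w * 1.0526913
        out[i] = b0 + b1 + b2 + w * 0.1848
    return out / np.max(np.abs(out))  # normalise to +/-1

noise = pink_noise(44100)  # one second's worth at 44.1 kHz
```

The result rolls off at roughly 3 dB per octave, which is why pink noise makes such a useful neutral bed to EQ into wind, rocket, or rumble textures.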

While I think I have the right idea to get started with the flight sound, I still plan to run it through the Waves Doppler plugin on an auxiliary track to get a true sense of motion from the sound, before automating the panning in the mix phase. The balance between the two layers is also a little off, and I think I will be reducing the volume of the noise layer in favour of the more natural sparkler sound. The impact sound, however, I am less pleased with; it will probably get a pretty hefty overhaul, as it needs some sort of shrapnel sound, a more clangy impact, and a more satisfying reverb.


Gunshots. They’re really hard to do correctly without some degree of experience. This is my second project where I have had to make gunshots on a budget (without going out to a gun range and recording gunshots directly), and while I have improved in some areas (particularly in clarity and giving the gunshot punch), I feel I missed the mark in others (giving the shots the sense of continuity and flow that my Starship Troopers gunshot effect had).



Daisy the killer dog

The sounds above are my first and second attempts, and to be frank, I wasn’t really comfortable with either. However, in my mix session with Max, he suggested using them in combination with one another, which I thought was passable even though it doesn’t completely resemble the sound in the original clip. For my gunshot sounds, I used a combination of recordings, including the broken kick pedal and party popper recordings from my Starship Troopers sessions, and a kick drum sample that I retrieved from a royalty-free site.

You may also notice a squeaky layer in the second attempt, which I added to give it the sound of a smaller weapon, such as a silenced pistol from a movie (silencers in real life sound nothing like that). I achieved that sound by recording my sister’s dog’s pink rabbit toy with a Rode M5 microphone, running it through an aggressive EQ filter, pitching it up, and then running it through the AIR Lo-Fi plugin to reduce the sample rate and give it a crunchy sound.
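The AIR Lo-Fi plugin handled the sample-rate reduction for me, but the underlying trick is simple enough to sketch. Here’s a rough Python version; the 4x reduction factor and the 1.2 kHz test tone are purely illustrative values, not my plugin settings:

```python
import numpy as np

def sample_rate_reduce(signal, factor):
    """Crude sample-rate reduction: keep every `factor`-th sample and
    hold it, with no anti-aliasing filter. The resulting aliasing is
    what gives lo-fi effects their crunchy, digital character."""
    held = np.repeat(signal[::factor], factor)
    return held[:len(signal)]  # trim back to the original length

sr = 44100
t = np.arange(sr) / sr
squeak = np.sin(2 * np.pi * 1200 * t)    # stand-in for the toy squeak
crunchy = sample_rate_reduce(squeak, 4)  # hold each kept sample 4x
```

Because the hold repeats each kept sample, the first `factor` output samples are all identical — the staircase shape you’d see if you zoomed in on a bit-crushed waveform.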

Vortex Sounds. I decided to build the vortex from two parts. The first was a recording of my friend’s fidget spinner, made at SAE with my Zoom H5. The second was itself two layers: one was a Moog Sub37 synthesizer that Max and I recorded in the MIDI studio at the start of the trimester, and the other was a recording of me blowing into a rubber tube in our C24 session while rapidly waving it side-to-side. I ran the tube recording through some EQ to get rid of some of the obnoxious highs and lows, effectively performing manual audio repair before turning to iZotope RX (a methodology recommended by Bob Hein from Harbor Picture Company (Coronado & Muirhead, 2017)), then ran it through a flanger at a rate of 1.77 Hz and a phaser at a rate of 4.7 Hz. I also recorded some samples of me flinging the lid of a braised steak & onion can against the lip of the can to get a sound for the orb opening. This is probably the sound effect that I am most happy with.
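For anyone wondering what those modulation rates actually control, a flanger is essentially a short delay line whose delay time is swept by an LFO. Here’s a bare-bones Python sketch; only the 1.77 Hz rate comes from my session — the delay range, mix amount, and test tone are illustrative assumptions:

```python
import numpy as np

def flanger(signal, sr, rate_hz=1.77, min_delay_ms=1.0,
            max_delay_ms=5.0, mix=0.5):
    """Basic flanger: mix the dry signal with a copy delayed by an
    LFO-swept amount (nearest-sample delay, for simplicity)."""
    n = np.arange(len(signal))
    lfo = (np.sin(2 * np.pi * rate_hz * n / sr) + 1) / 2   # 0..1 sweep
    delay_samples = (min_delay_ms
                     + lfo * (max_delay_ms - min_delay_ms)) * sr / 1000
    # Read positions for the delayed copy, clamped at the start
    idx = np.maximum(n - np.round(delay_samples).astype(int), 0)
    return (1 - mix) * signal + mix * signal[idx]

sr = 44100
tone = np.sin(2 * np.pi * 220 * np.arange(sr) / sr)  # 1 s test tone
swept = flanger(tone, sr)  # swept at 1.77 Hz, like my vortex layer
```

The moving delay creates a comb filter whose notches sweep up and down at the LFO rate, which is the whooshing, jet-like quality that suits a vortex so well; a phaser does something similar but with swept all-pass filters instead of a delay line.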


I hope you’ve enjoyed this insight into my sound design process, and I will be posting a blog on our first mixing session soon.


Coronado, R. & Muirhead, T. [Tonebenders Podcast] (Producers). (2015, August 4). Tonebenders Podcast: 037 – Matthew Marteinsson [Podcast]. Retrieved from https://soundcloud.com/tonebenders-podcast/tonebenders-037-matthew-martei

Coronado, R. & Muirhead, T. [Tonebenders Podcast] (Producers). (2017, July 3). Tonebenders Podcast: 059 – Bob Hein in NYC [Podcast]. Retrieved from https://soundcloud.com/tonebenders-podcast/059-bob-hein-in-nyc