John Rodd Interview. This month we spoke to John Rodd about his background and recording setup, as well as the process behind his work as a recording, mixing and mastering engineer.
John has worked on many top video games including EA Star Wars: Battlefront, Call of Duty: Black Ops II, Mass Effect 3, Assassin’s Creed Revelations, Assassin’s Creed II, and works extensively with Blizzard Entertainment on the World of Warcraft series.

RS: For those who don’t know, what is your background and what are some recent projects you’ve worked on?

[Photo: John Rodd recording ELYSIUM at Abbey Road]

JR: I record, mix, and master music for film scores, television, video games, film trailers, commercials and albums, working in just about every music genre there is. My role is to shape the music to match the composer’s vision, and what I bring to the table is a wide background in different styles of music, which helps a lot these days, as unusual music styles, traditional instruments and synths seem to meet more and more.

Listening for Loudness in Video Games. I had some time over the weekend for a marathon gaming session and decided to take full advantage.
After a couple of hours of play I started to realize how fatigued my ears were getting. “Oh no, not this loudness war stuff again,” you might be thinking to yourself. Don’t worry, I don’t plan on ranting about the upward trend to make things loud. Instead, I want to take a more objective look at the current state of loudness in video games, especially compared to broadcast standards.

9 Sound Design Tricks To Hack Your Listeners’ Ears.

Audio loudness for gaming: The battle against 'ear fatigue'. Crytek's audio whizz Simon Pressey discusses why it's important for devs to measure the loudness of their game. With the growth of global loudness standards and regulations, audio loudness has lately been a hot topic in the broader video and media arenas.
It might not be as self-evident, but loudness management is also a key factor for game developers such as my company, Crytek – especially as consumers adopt the new generation of consoles that integrate games with other media sources such as broadcast TV, Blu-ray discs, and programs streamed from the Internet. When viewers switch back and forth between a game they’re playing and another video program, they expect the game’s audio levels to be compatible with the other media types.
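As a rough illustration of the kind of measurement this compatibility depends on, here is a heavily simplified sketch of BS.1770-style integrated program loudness. This is an assumption-laden toy, not a compliant meter: it omits the K-weighting filter and the relative gate the real standard requires, so its numbers are only indicative.

```python
import math

def integrated_loudness(samples, rate, block_s=0.4, gate_lufs=-70.0):
    # Heavily simplified BS.1770-style measurement: the real standard
    # applies a K-weighting filter and a second, relative gate; both
    # are omitted here, so values are only indicative.
    block = int(rate * block_s)
    block_loudness = []
    for i in range(0, len(samples) - block + 1, block):
        chunk = samples[i:i + block]
        mean_sq = sum(s * s for s in chunk) / block
        block_loudness.append(-0.691 + 10 * math.log10(mean_sq + 1e-12))
    # Absolute gate: drop blocks quieter than the threshold (e.g. silence)
    kept = [l for l in block_loudness if l > gate_lufs]
    if not kept:
        return float("-inf")
    # Average the surviving blocks' power (not their dB values)
    avg_power = sum(10 ** ((l + 0.691) / 10) for l in kept) / len(kept)
    return -0.691 + 10 * math.log10(avg_power)
```

A full-scale constant test signal measures about −0.7 on this sketch, and halving the amplitude drops the reading by about 6 dB, which is the behaviour you would expect of an energy-based meter.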
Game Audio Loudness and Mastering. Mastering and loudness concerns are nothing new when we talk about music, films, TV shows, commercials, and so on.
But when it comes to game audio, it seems we are still far from a common understanding. The industry needs patterns and standards so we can avoid a loudness war, and also so things sound better depending on the platform the game is made for (we must better understand the differences between mastering a game for a tablet, for console or PC, for a smartphone, etc.). The good thing is: there are people talking, thinking and writing good material on the theme.
So I am sharing here two of those articles, which are really good: Audio loudness for gaming: The battle against 'ear fatigue', and IESD-Mix-Ref-Levels-v03.02.pdf.

Best practices in using the PS4 DualShock controller speaker.
The Mix in The Last of Us. KY: Ah, right.
73% of Mobile Gamers Play With the Sound On. One of the best things to come out of all the SDKs developers are using for various functions under the hood of their games is the neat statistics the companies providing those SDKs can produce.
For instance, adding Flurry helps developers figure out how people are using their apps, and as a result, both the Flurry Blog and the Flurry Tech Blog are fascinating reads which have served as the source of a bunch of interesting news posts here on TouchArcade over the years. Today’s source of “Huh, that’s neat” comes from Appington. They have a platform called Amplify which basically makes it easier for developers to add sound to their games.

Dennis Gustafsson's Blog - Hail to the hall - Environmental Acoustics. One of our early goals with Smash Hit was to combine audiovisual realism with highly abstract landscapes and environments. A lot of effort was put into making realistic shadows and visuals, and our sound designer spent long hours finding the perfect glass-breaking sound. However, without proper acoustics to back up the different environments, the sense of presence simply wouldn’t be there. To achieve full control over the audio processing and add environmental effects, I needed to do all the mixing myself.
GAMEAUDIOMIX (Top 11 Video Game Mixing Tips).

Creating the Spaces of Ambience. Guest contribution by Michael Theiler (Kpow Audio). Situating an Ambience: When creating ambiences for games (this applies equally to film), I am striving to make them blend into the background and not mask any important in-game sounds.
For most ambiences, these are the most important qualities that I am attempting to resolve. In order to achieve this, I first need to focus on the repetition and timing between audio occurrences in the sounds. This means spacing sounds out, and adding and removing sound occurrences in my audio sequence. Sprinkling Sounds.

Surround Music in video games - is it viable? Senior sound designer Rob Bridgett discusses the aesthetics and implementation. Surround Music: The Emperor's New Clothes?
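Theiler's point about the repetition and timing between ambience occurrences can be sketched as a simple randomized scheduler. This is an illustrative sketch, not code from any middleware; the function and parameter names are made up.

```python
import random

def schedule_one_shots(duration_s, min_gap_s, max_gap_s, seed=None):
    # Illustrative sketch (not from any engine): lay out trigger times
    # for ambience one-shots (bird calls, distant traffic, ...) with
    # randomized gaps, so no obvious repetition pattern emerges and
    # sounds stay spaced out instead of clumping together.
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        t += rng.uniform(min_gap_s, max_gap_s)
        if t >= duration_s:
            break
        times.append(t)
    return times
```

Widening the gap range makes the ambience sparser and less predictable; narrowing it makes the texture denser and more regular, which is exactly the "adding and removing occurrences" trade-off described above.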
There has been much talk recently, predominantly among composers, dedicated to the virtues and benefits of surround music in video games. Certainly with the increased memory capacity of next-generation platforms, and a greatly increased install base of surround sound systems, the prospect of surround score and licensed music seems more feasible than it was on previous-generation consoles. While this exciting expansion in the dimensions of the video game medium offers some fantastic opportunities and new horizons for music in video games, there are also pitfalls which should be carefully considered when designing a game's soundtrack with surround music in mind. Leaving aside any technical aspects of surround music for games (these have been discussed in depth and frequently elsewhere), there are some pertinent questions that need to be asked about when surround music is useful and when it is a needless distraction.
Planning Ahead for Surround Music.

An Introduction To 3D Mixing. Mixing is arguably the most important stage of the music production process.
When done properly, one can do everything from adding energy and warmth to a sound to completely altering the feel and flow of an entire piece of music. In my next few tutorials, I will be taking an in-depth look at the mixing process to help you gain a better understanding of mixing concepts and, in so doing, present one possible workflow scenario to help you streamline the process.

Good quality HRTF audio for next gen.

Wwise HDR best practices. Audiokinetic has released Wwise 2013.1 with many new features, among them PS4 support, ITU BS.1770-compliant loudness meters, and HDR audio. We worked with Audiokinetic to develop the HDR feature set over the past year, and now that it’s out, I’d like to share some of the best practices I’ve come up with (so far) in using it:

1) Keep it mellow: The first thing to be aware of is that the Wwise implementation of HDR audio is a relative volume scheme.
We initially played with using SPL, similar to DICE’s Frostbite engine, but abandoned that because a) we learned that even DICE didn’t use real-world SPL values, which sort of negates the whole reasoning behind using real-world values to set volume, and b) not everyone would use HDR, and introducing a second volume slider (Volume and SPL) in Wwise just confused and overcomplicated things.

Finding Your Way With High Dynamic Range Audio In Wwise. Guest contribution by Louis-Xavier Buffoni, software engineer at Audiokinetic. HDR in a Nutshell: HDR (“High Dynamic Range”) audio is a technique which draws its inspiration from the local adaptation method used in HDR imaging, which “attempts to maintain local contrast, while decreasing global contrast.” In audio, this local/global dichotomy applies to time, and contrast refers to loudness instead of brightness. The technique consists of using an automatic mixing system that maps virtual-world loudness to living-room loudness.
Clerwall’s phrase “every sound is important, but not at the same time” summarizes the essence of its algorithm: the mapping is adaptive to what is playing in the virtual world, and can be represented by a “sliding window”, as illustrated in the following figure. HDR audio has received a lot of attention since it was presented by DICE a few years ago, backed up by their astoundingly good-sounding games Battlefield: Bad Company and Battlefield 3.

The Dynamics of Mark of the Ninja. Guest contribution by Matthew Marteinsson. At first listen you can tell Mark of the Ninja is a game with a wide dynamic range.

Real-time audio mixing (Wwise+Unity). So much to do and so little time, but alas I swore to continue with this blog. The video below shows a Unity editor mix system for dynamically controlling the volume of grouped sounds via Wwise RTPCs. This builds from previous plug-in work, but now I have an official game project to use it on!
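The HDR windowing idea described in the Wwise pieces above can be illustrated with a toy snapshot model. This is my own simplification, not Audiokinetic's or DICE's actual algorithm: the loudest active sound pins the top of the window, everything is expressed relative to it, and sounds falling below the bottom of the window are culled as inaudible.

```python
def hdr_window_mix(active_sounds, window_db=50.0):
    # active_sounds: name -> virtual-world loudness in dB (illustrative
    # values, not real-world SPL). Returns output levels relative to the
    # window top; the real system also smooths the window over time.
    if not active_sounds:
        return {}
    top = max(active_sounds.values())   # loudest sound pins the window top
    out = {}
    for name, loudness in active_sounds.items():
        rel = loudness - top            # loudest sound maps to 0 dB output
        if rel >= -window_db:           # inside the window: audible
            out[name] = rel             # below the window: culled entirely
    return out
```

With a 50 dB window, an explosion at 120 maps to 0 and gunfire at 100 to −20, while a footstep at 60 is culled; once the explosion ends, the window slides down and the footstep becomes audible again — "every sound is important, but not at the same time."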
[sinister laugh] The game is still very much in development, but it’s coming along. Real-time audio mixing with Wwise and Unity3D from Roel_San on Vimeo. While the system provides the ability to easily adjust levels of sounds in the Unity editor, it also acts as an active mix system, receiving method calls in-code that change the levels of various sound groups depending on a defined game state.

The Game Audio Mixing Revolution.

Game Audio Podcast #10 – Reference & Mix Level Standards. Posted by lostchocolatelab on Monday, August 22, 2011 · 2 Comments. Reference and mix levels across different media can vary wildly.
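The game-state-driven group levels the Wwise+Unity post describes can be sketched as a table of per-group offsets pushed through a single setter. In the real project this would be an engine call such as Wwise's RTPC setter invoked from C#; the state names, group names and values below are entirely made up for illustration.

```python
# Hypothetical mix-state table: game state -> per-group dB offsets.
MIX_STATES = {
    "exploration": {"music": 0.0,  "sfx": 0.0,  "dialogue": 0.0},
    "combat":      {"music": 3.0,  "sfx": 2.0,  "dialogue": -6.0},
    "cutscene":    {"music": -4.0, "sfx": -9.0, "dialogue": 3.0},
}

def apply_mix_state(state, set_rtpc):
    # set_rtpc stands in for the middleware call (e.g. an RTPC setter);
    # it receives a parameter name and the offset to apply to that group.
    for group, offset_db in MIX_STATES[state].items():
        set_rtpc(group + "_volume", offset_db)
```

Driving the whole mix through one table like this is what makes the system "active": switching the game state re-sends every group's level in one call, instead of scattering volume tweaks through gameplay code.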
Reverb: The Science And The State-of-the-Art.

Aesthetic Mixing & Sound Lenses.

All In The Mix - The Importance Of Real-Time Mixing In Video Games.

Notes from the Mix: Prototype 2. [The audio director on open-world game Prototype 2 shares a crucial revelation about how to create a consistent soundscape for his game across all its different sections -- and explains in depth how he achieved that great mix.]

Crafting the sound of DUST 514 - DUST 514.

The Sound of Planetside 2.
Written by Gary Miranda & Rodney Gates (edited by George Hufnagl). Content.yudu.com/A1wd5u/DEVSep2012/resources/content/49.swf.

Notes from the Mix: Prototype 2. A large part of the consistency puzzle for the mix was on the voice content side, and this was solved by choosing a single dialogue director / studio for all our major story voice assets.

Video Games and Loudness Standards: Interview with Sony’s Garry Taylor.
The issues of loudness and dynamic range are common across all audio/visual media, but recently this conversation has been gaining traction within the game audio community.

Loudness Requirements Open The Door To New Creative Potential - News - Pro Tools Expert. Loudness requirements opening the door to new creative potential may seem like a very strong claim to make, but Jon Schorah, creative director at NUGEN Audio, recently wrote a piece which I feel helps to explain how loudness requirements can liberate us from the loudness wars, where we have to crush the life out of our audio to make it sound louder than everything else.
Jon writes…

Loudness in Interactive Sound at Sony.

Loudness In Game Audio: Designing Sound.

Different Loudness Ranges for Console and Mobile Games.

Blog: Loudness Targets for Mobile Audio, Podcasts, Radio and TV.