
How Game Audio Design Can Influence Film Sound Design


    Written by: Luca Spadavecchia

    The PlayStation press conference held in June 2020 previewed gameplay and cinematic scenes from several announced titles. And the impact was massive.

    Why am I bringing this up? For the answer, we need to take a step back. Earlier that year, Mark Cerny presented the technological progress of the PlayStation 5. On that occasion, the PS5 lead system architect revealed that the next console would have a custom hardware unit dedicated to audio: the TEMPEST 3D AudioTech (at 38:30 in the video).

    As an audio enthusiast and sound designer, I had to process this information, so I decided to make it the main topic of my Master of Arts dissertation. Let me explain how this is relevant to you.

    Fundamentals of Sound Design

    Sound design is a crucial element in filmmaking that enhances the emotional connection to the story. It involves the creation, curation, and manipulation of audio elements to complement and elevate the visual aspects of a film.

    Sound design includes dialogue, foley sound effects, original sounds, ambient sounds, music, and more. It can inform the listener of offscreen information, build anticipation, or create surprise. Sound design can transport us through time and space by recreating the ambiance of different eras and environments.

    The Role of the Sound Designer

    A sound designer is responsible for crafting a film’s sound design. On a small production, the sound designer may be the only person responsible for the film’s audio component.

    On larger productions, the sound designer leads an audio team consisting of various sound design jobs, including foley artists, audio engineers, re-recording mixers, and more. The sound designer’s work is done during post-production, but they may be involved in the film as early as the pre-production stage.

    A professional sound designer must have a deep understanding of the film’s narrative and tone, as well as the ability to create a balance between different audio elements.

    Sound Design Techniques

    Sound design involves using various techniques, including layering, processing, and sculpting sounds. Sound designers can use these techniques to create a wide range of effects, from subtle ambiance to dramatic sound effects.

    For example, a sound designer might use layering to create a complex sound effect, such as a spaceship taking off, by combining multiple individual sounds. They might use processing to manipulate the sound of a voice, making it sound like it’s coming from a different environment.

    And they might use sculpting to create a unique sound effect, such as the sound of a monster roaring.
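    To make layering and processing concrete, here is a minimal sketch in Python with NumPy. It is purely illustrative: synthetic tones and noise stand in for recorded layers, and a decaying noise burst stands in for a real impulse response captured in the target space.

    ```python
    import numpy as np

    SR = 48000  # sample rate in Hz

    def tone(freq_hz, dur_s, amp):
        """Sine tone standing in for one recorded layer."""
        t = np.arange(int(SR * dur_s)) / SR
        return amp * np.sin(2 * np.pi * freq_hz * t)

    # --- Layering: a 'spaceship take-off' built from three layers ---
    rumble = tone(40, 2.0, amp=0.8)              # low-end body
    whine = tone(1200, 2.0, amp=0.1)             # mechanical detail
    hiss = 0.2 * np.random.randn(int(SR * 2.0))  # thrust texture
    takeoff = rumble + whine + hiss
    takeoff /= np.max(np.abs(takeoff))           # normalize to avoid clipping

    # --- Processing: place a dry voice in a 'different environment'
    # by convolving it with an impulse response. Real IRs are recorded
    # in the target space; here a decaying noise burst stands in. ---
    voice = tone(220, 1.0, amp=0.5)              # placeholder for a dry voice
    ir = np.random.randn(SR // 2) * np.exp(-np.linspace(0, 8, SR // 2))
    voice_in_hall = np.convolve(voice, ir)[: len(voice)]
    voice_in_hall /= np.max(np.abs(voice_in_hall))
    ```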

    What Can a Sound Designer Add to the New Consoles' Game Audio?

    The creators of the new generation consoles, the PS5 and the Xbox Series X, announced that the new built-in audio hardware will let developers build more detailed sonic worlds. The consoles' engineers managed to solve most of the problems related to sound memory storage, which frees up space for sound designers' creativity.

    This information made me think about analyzing the crucial audio workflows of both film and game productions.

    A video game audio engineer plays a crucial role in leveraging the new audio capabilities of the PS5 and Xbox Series X, creating immersive soundscapes that enhance the gaming experience.

    The audio description of a space plays a pivotal role in the viewer's immersion. The more accurately sounds are placed in the space, the more easily the audience can pinpoint the position of their sources. But how can a sound designer achieve this?

    Object-based Mixing

    A sound designer aims to create soundscapes that allow the viewer to extend the screen boundaries. There are many techniques to achieve this goal, but the one I want to highlight is the object-based mixing technique.

    It consists of encoding each sound source that describes the landscape as a separate object. These sources carry parameters registered as metadata, indicating the specific coordinates that allow the objects to be located accurately.

    The audio format delivered in the aforementioned PS5 press conference was binaural. This format allows you to perceive the height of a soundscape. The object-based mixing technique therefore suits this format perfectly, since it allows the sound sources to be positioned accurately within the landscape.
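    As a minimal sketch of what such metadata can look like, here is a toy audio object in Python. The field names are illustrative, not taken from any real object-audio standard, and the renderer uses simple constant-power panning with distance attenuation; a real binaural renderer would instead select HRTFs from the same coordinates, which is precisely what makes height audible.

    ```python
    import math
    from dataclasses import dataclass

    @dataclass
    class AudioObject:
        """One encoded sound source; the metadata travels with the mix."""
        name: str
        azimuth_deg: float    # 0 = straight ahead, +90 = hard right
        elevation_deg: float  # 0 = ear level, +90 = directly overhead
        distance_m: float     # meters from the listener

    def stereo_gains(obj: AudioObject):
        """Toy renderer: constant-power pan plus 1/r attenuation.
        It ignores elevation; a binaural (HRTF) renderer would not."""
        pan = max(-1.0, min(1.0, obj.azimuth_deg / 90.0))  # -1 left .. +1 right
        angle = (pan + 1.0) * math.pi / 4.0                # 0 .. pi/2
        att = 1.0 / max(obj.distance_m, 1.0)
        return math.cos(angle) * att, math.sin(angle) * att

    # A spaceship passing overhead: the height lives in the metadata
    ship = AudioObject("spaceship_flyby", azimuth_deg=20,
                       elevation_deg=60, distance_m=12)
    print(stereo_gains(ship))
    ```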

    For the first time ever, PlayStation users could experience a taste of the TEMPEST 3D AudioTech.

    This made me wonder: if gamers are experiencing gameplay in binaural format, won't they notice the difference when they listen to the audio formats delivered by most of the streaming services on the market?

    I do, and that’s why I started comparing game scenarios with film sequences to understand why some of the software and workflows used in game studios are not yet integrated into the film industry. Can binaural audio become a standard in film production?


    Film and game audio listening scenarios

    Object-based mixing is the technique used in video games to create real-time changes within the sonic world. The game industry relies on game engines (e.g. Unity and Unreal) that communicate with middleware (Wwise, FMOD) in order to assign a sound to every moving or static source in the landscape.
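    The shape of that engine-to-middleware conversation is easy to sketch. The Python below is a conceptual stand-in, not the real Wwise or FMOD API: the engine posts an event to start a sound, then feeds the emitter's coordinates to the middleware every frame so the renderer can keep the source localized.

    ```python
    class ToyMiddleware:
        """Stand-in for Wwise/FMOD. Real middleware exposes calls of the
        same shape (post an event, set a 3D position, tick the engine),
        with its own names and signatures."""
        def __init__(self):
            self.active_events = {}
            self.positions = {}

        def post_event(self, emitter_id: int, event: str) -> None:
            # e.g. "Play_spaceship_flyby", authored by the sound designer
            self.active_events[emitter_id] = event

        def set_position(self, emitter_id: int, x: float, y: float, z: float) -> None:
            # the renderer spatializes the voice from these coordinates
            self.positions[emitter_id] = (x, y, z)

        def update(self) -> None:
            # one audio tick: voice management, mixing, output
            pass

    # --- Game-loop side (what Unity/Unreal would drive) ---
    mw = ToyMiddleware()
    mw.post_event(emitter_id=1, event="Play_spaceship_flyby")
    for frame in range(180):                  # three seconds at 60 fps
        t = frame / 60.0
        mw.set_position(1, x=2.0, y=5.0 + t, z=-10.0 + 40.0 * t)
        mw.update()
    ```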

    While this is standard in game production workflows, it is not in film: even though some directors are starting to work with game engines, the middleware is still not used. Yet in both films and games, a sound design team is responsible for crafting the immersive audio experience.

    I want to show you some similar scenarios in which, in my opinion, films were influenced by game mixing techniques. So grab your headphones, and let’s listen to some examples.

    Ratchet & Clank: Rift Apart

    During the “Sony PlayStation 5 Reveal Event,” Ratchet & Clank: Rift Apart’s Marcus Smith affirmed that the binaural audio supported by the new generation console represents a fundamental difference between listening through a television set and playing.

    He says that listening to this new gameplay is like going out into the middle of a forest: it brings you into these worlds in a way that you have never been able to experience before.

    In the Ratchet & Clank: Rift Apart gameplay, the use of the object-based mix is evident. The voices in the mix are perceived according to the rotation angle of the main character’s head. This doesn’t just enhance the gamer’s immersion: thanks to this audio format, we can also perceive the height of the spaceship passing above Ratchet.

    The sonic world is described as a three-dimensional space. How cool is that? The sound designer can let us hear exactly what the main character is hearing.
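    The math behind that head-tracked perception is compact enough to sketch. The Python below uses illustrative coordinates and sign conventions (assuming +x is right and +z is forward): it computes the angle of a voice relative to where the character's head points, which is what an object-based renderer feeds into its panning or HRTF stage every frame.

    ```python
    import math

    def relative_azimuth_deg(src_x, src_z, head_x, head_z, head_yaw_deg):
        """Angle of a source relative to the head's facing direction:
        0 = straight ahead, positive = to the right."""
        world = math.degrees(math.atan2(src_x - head_x, src_z - head_z))
        rel = world - head_yaw_deg
        return (rel + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)

    # A voice five meters straight ahead of the character...
    print(relative_azimuth_deg(0, 5, 0, 0, head_yaw_deg=0))   # 0.0
    # ...moves to the left ear the moment the head turns 90 degrees right
    print(relative_azimuth_deg(0, 5, 0, 0, head_yaw_deg=90))  # -90.0
    ```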

    Gravity Film Sound Design compared to Star Wars Battlefront II Gameplay

    Cuarón’s Gravity presents similar mixing techniques in its sound design. At minute 3:50, the viewer sees the astronaut crossing from the right of the scene to the left, following his path towards the back.

    Audio-wise, the scene is mixed as if the audience were listening from where the camera is positioned: the astronaut’s dialogue tracks his body throughout his movement in space. The listener’s point of audition is located in the POV, which theoretically means this is the same mixing technique used in the aforementioned game.

    Sound editing plays a crucial role in achieving these effects; it differs from sound mixing in that it focuses on creating and arranging the individual audio elements during post-production.

    In the Star Wars Battlefront II gameplay, the perception of the auditory space changes drastically between the first-person shooter mode and the third-person shooter mode, in which the camera moves behind the player for a wider view.

    When the player chooses the first mode, the sound is muffled because the character is inside the spaceship. In the second mode, the spectator is still immersed in the sonic world but watches the action from a different POV, and therefore hears the vastness of space from outside the spaceship.

    The video below shows the interior of the astronaut’s helmet; the spectator is therefore hearing and seeing what the character is experiencing. And we are not only perceiving the auditory space: the character’s emotional state is rendered by a low-pass filter that conveys confusion.

    Gradually, the filter disappears as the camera moves the frame outside of the helmet: here we perceive the scene as a second astronaut would, immersed in the three-dimensional space but farther from the main character.
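    That “filter opening up” effect is straightforward to prototype. Here is a sketch in Python with NumPy: a one-pole low-pass whose cutoff glides upward over the length of the clip, standing in for the move from inside the helmet to open space. The cutoff values are guesses for illustration, not taken from the film’s actual mix.

    ```python
    import numpy as np

    SR = 48000  # sample rate in Hz

    def sweep_lowpass(x, cutoff_start_hz, cutoff_end_hz):
        """One-pole low-pass whose cutoff glides from start to end,
        so the muffled 'inside the helmet' perspective gradually opens."""
        cutoffs = np.linspace(cutoff_start_hz, cutoff_end_hz, len(x))
        alphas = 1.0 - np.exp(-2.0 * np.pi * cutoffs / SR)
        y = np.empty(len(x))
        acc = 0.0
        for i in range(len(x)):
            acc += alphas[i] * (x[i] - acc)
            y[i] = acc
        return y

    # Four seconds of placeholder scene audio: muffled (500 Hz) to open (18 kHz)
    scene = 0.1 * np.random.randn(SR * 4)
    outside = sweep_lowpass(scene, 500.0, 18000.0)
    ```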

    Shadow of the Colossus Sound Design

    In Shadow of the Colossus, the player follows the action from a third-person camera angle. When the character goes underwater, the player understands it thanks to the splash sound effects. But if the camera doesn’t follow underwater, the gamer will still hear the scene from outside the water.

    Once the camera reaches the character underwater, the player perceives the ambiance as if they were immersed in the water. Hence, the point of audition is tied to the point of view.

    Blade Runner 2049 Film Sound Design

    Blade Runner 2049 has similar action scenes. The one shown below depicts a fight in which a character tries to drown his opponent. The way the sound mixer designed the audio follows the same concepts as Shadow of the Colossus.

    The Sound Design Process

    The sound design process involves five steps: pre-production, recording, editing, mixing, and mastering. During pre-production, the sound designer works with the director and other crew members to develop a plan for the film’s sound design.

    During recording, the sound designer captures the necessary audio elements, such as dialogue, foley sounds, and ambient noise. During editing, the sound designer assembles the audio elements into a cohesive soundtrack. During mixing, the sound designer balances the levels of the different audio elements to create a balanced audio mix.

    And during mastering, the sound designer prepares the final soundtrack for distribution. Throughout the process, the sound designer works closely with the film’s editor, composer, and other crew members to ensure that the sound design complements the visual elements of the film.

    Conclusion

    The film industry is already taking advantage of game software, mostly to reach outstanding visual results. The Mandalorian, Spielberg’s Ready Player One, and Villeneuve’s Blade Runner 2049 used Unreal Engine to view the VFX results in real time through the virtual camera plug-in directly on location.

    Why isn’t the sound considered yet in this process?

    We have seen how the game experience is not so far from the film one. Now that game audio is taking a step forward, I expect film audio to react by reaching another level.

