Inaudible High-Frequency Sounds Affect Brain Activity: Hypersonic Effect

Psychoacoustics: Where Sound Meets Your Brain
In this article, we're going to have a close look at the tool we all use every day: the ear. This small organ has quite a few surprises in store for us. We'll see that it is effectively crammed with equalizers and dynamic compressors, including a multi-band one. It even includes an extremely efficient filter bank, as well as a highly sophisticated analogue-to-digital converter. Armed with this knowledge, sometimes referred to as 'psychoacoustics', we'll discover numerous practical consequences for music production. Note that this article won't attempt to cover psychoacoustics in its entirety.
The Recording Studio In Your Head
We'll start our study of the ear by looking at Figure 1.
Figure 1: The morphology of the human ear (diagram derived from Chittka L, Brockmann A (2005): Perception Space — The Final Frontier, www.plosbiology.org).
What we call 'sound' is in fact a progressive acoustic wave — a series of variations in air pressure, spreading out from whatever source made the sound.

Measuring Sounds in Three Dimensions
To explain how we can locate sound sources fixed in space from any observing location, we must digress a little and consider how sound waves from live sources actually behave. Point sound sources make waves in the air that radiate spherically outward, much as ripples radiate outward from a stone thrown into a pool. Using this pool analogy, and ignoring its essentially two-dimensional nature and any reflections, if you only saw the waves in the pool some time after they had started out from the source, could you work out where the source was – where the stone hit the water? The answer is yes. At any location and at any observation time (before reflections), the direction of travel of the expanding wave front discloses the source direction. We also know that the height, or intensity, of the wave drops as it radiates outward. This simple analogy can be extended to sound waves radiating spherically from a source in air.
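As a rough numerical companion to the ripple analogy, here is a minimal Python sketch (my illustration, not from the original text) showing how intensity from a point source falls off with distance under idealised free-field, inverse-square conditions; the source power value is just an assumed example figure.

```python
import math

def intensity_at(distance_m, source_power_w=0.01):
    """Intensity (W/m^2) of a point source radiating uniformly over a
    sphere of radius distance_m (inverse-square law, free field)."""
    return source_power_w / (4 * math.pi * distance_m ** 2)

def level_db(intensity_w_m2, i_ref=1e-12):
    """Convert intensity to decibels relative to the nominal
    1e-12 W/m^2 threshold of hearing."""
    return 10 * math.log10(intensity_w_m2 / i_ref)

for d in (1, 2, 4, 8):
    print(f"{d} m: {level_db(intensity_at(d)):.1f} dB")
# Each doubling of distance costs about 6 dB, and the wavefront's
# direction of travel at any point still points back to the source.
```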

The Physical Principles of Sound
An introductory guide to the physical properties of sound and a basic introduction to the acoustics of enclosed spaces. To aid the understanding of any technical matter relating to sound, as is often the case with any discipline, it is crucial to understand the fundamental scientific principles of the subject and how they are commonly interpreted. This guide offers an introduction to the basic physics of sound, including the build-up of sound waves and their properties, the speed of sound, how sound is shaped in acoustical environments, and how treatment can be applied to listening rooms. This document does not, on the whole, provide advice, but presents reference information to aid an overall understanding of sound in a practical working environment.
The Physics of Sound: Overview
At its most stripped-back level, sound is the mechanical disturbance of a medium, whether gas, liquid or solid. Sound has three stages which affect how it is perceived by a listener.
Phase
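To put a number on one of these fundamentals, the short Python sketch below (an addition for illustration, not part of the guide) converts frequency to wavelength using an approximate room-temperature speed of sound of 343 m/s, the sort of quick calculation that underlies decisions about room treatment.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 degrees C

def wavelength_m(frequency_hz):
    """Wavelength in metres of a sound wave of the given frequency in air."""
    return SPEED_OF_SOUND_M_S / frequency_hz

for f in (40, 100, 1000, 10000):
    print(f"{f:>5} Hz -> {wavelength_m(f):6.2f} m")
# 40 Hz has a wavelength of roughly 8.6 m, comparable to room dimensions,
# which is why low-frequency build-up dominates small-room acoustics.
```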

Psychoacoustics: Faking the Space
When building the space that sounds occupy, it may sometimes seem like overkill to load up those DSP-hungry (though wonderful) reverb plug-ins. They can be a pretty big load when it comes to even the mildest amount of spatialization, and it can also be time-consuming to configure one to emulate an outdoor space. I thought it would be worth sharing a trick of mine for both situations. The laws that we're going to be using are a combination of biology and psychology, with a touch of physics. So let's talk about what happens when we hear a mixture of direct and reflected sounds. Let's begin with the numbers that are significant to the perception of reverb. An effect that occurs when we hear a mixture of direct and delayed sounds is comb filtering. So, what about that range between 10 and 35ms? Now let's get into the meat of it. Naturally, there are a few caveats that we'll need to note.
Audio example: Car A panned hard left in a stereo audio file with no spatialization (on SoundCloud).
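To make the direct-plus-delayed idea concrete, here is a minimal Python/NumPy sketch (my own illustration, not the author's actual trick) that mixes a signal with a single delayed copy of itself: with a delay of a couple of milliseconds the result is strongly comb-filtered, while delays in the 10-35 ms range tend to fuse with the direct sound and read as an early reflection.

```python
import numpy as np

def mix_with_delay(signal, delay_ms, sample_rate=48000, delayed_gain=0.7):
    """Mix a signal with one delayed copy of itself.

    Very short delays (under ~10 ms) audibly comb-filter the sum; delays in
    the 10-35 ms range are typically heard as an early reflection that fuses
    with the direct sound rather than as a separate echo.
    """
    delay_samples = int(sample_rate * delay_ms / 1000)
    direct = np.concatenate([signal, np.zeros(delay_samples)])
    delayed = np.concatenate([np.zeros(delay_samples), signal])
    return direct + delayed_gain * delayed

if __name__ == "__main__":
    sr = 48000
    noise = np.random.default_rng(0).standard_normal(sr) * 0.1  # 1 s of test noise
    combed = mix_with_delay(noise, delay_ms=2, sample_rate=sr)       # obvious comb filtering
    reflection = mix_with_delay(noise, delay_ms=25, sample_rate=sr)  # reads as a single "room" reflection
```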

Creating the Spaces of Ambience
Guest contribution by Michael Theiler (Kpow Audio)
Situating an Ambience
When creating ambiences for games (this applies equally to film), I am striving to make them blend into the background and not mask any important in-game sounds. For most ambiences, these are the most important qualities that I am attempting to resolve. In order to achieve this, I first need to focus on the repetition and timing between audio occurrences in the sounds.
Sprinkling Sounds
The first task in ensuring the ambience retreats into the background is to select the correct sounds. Removing anything that pops out is a balancing act.
Frequencies
The next consideration when building an ambience is the frequencies: their relationship, any build-ups of particular frequencies, and the overall mix (which actually comes last). The final element, often the most time-consuming and probably the most important, is getting the space right for these sounds.
Spatialisation
Setup Number One
And that's basically it.
Tools
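As one way to picture the 'repetition and timing between audio occurrences' point, here is a small, hypothetical Python sketch (not Theiler's actual workflow or tools) that scatters one-shot sounds at randomised intervals so the ambience avoids an obvious, attention-grabbing loop; the sound names and gap values are placeholders.

```python
import random

def sprinkle_schedule(one_shots, total_seconds, min_gap_s=4.0, max_gap_s=15.0, seed=None):
    """Return (time_in_seconds, sound_name) pairs that scatter one-shot
    sounds at randomised intervals, so no repeating pattern stands out."""
    rng = random.Random(seed)
    schedule = []
    t = rng.uniform(min_gap_s, max_gap_s)
    while t < total_seconds:
        schedule.append((round(t, 2), rng.choice(one_shots)))
        t += rng.uniform(min_gap_s, max_gap_s)
    return schedule

print(sprinkle_schedule(["bird_chirp", "distant_dog", "wind_gust"], total_seconds=60, seed=1))
```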

Auditory Perspective: Putting The Audience In The Scene
Image: OiMax (CC BY 2.0). By Karen Collins. Adapted from a forthcoming article in Animation: An Interdisciplinary Journal.
An often-overlooked aspect of sound design is the use of sound to create a sense of identification for the audience. Auditory perspective is constructed by a variety of techniques that create or reinforce the physical sense of space for the listener through the use of spatialized sound. A number of acoustic properties and effects determine the ways in which sound propagates in space.
Figure 1: By using reverberation effects on sounds, we can mimic a real, physical space, and put the listener in that space.
Microphone Placement
This sense of distance and space can be created in part through microphone placement. When the microphone is within about one foot of the emitter, it is considered close miking, which emphasises the direct signal over the reflected signals.
Digital Signal Processing Effects and the Mix
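As a generic illustration of using reverberation to place a listener at a distance (not a method described by Collins), the Python sketch below scales down the direct signal and raises the wet/dry ratio as the source moves away; apply_reverb is a hypothetical, externally supplied function, and the input is assumed to be a NumPy array of samples.

```python
def place_at_distance(dry, distance_m, apply_reverb, reference_m=1.0, max_m=30.0):
    """Crude distance cue: the farther the source, the quieter the direct
    sound and the higher the proportion of reverberant ('wet') signal.

    dry          -- NumPy array of samples (assumed)
    apply_reverb -- hypothetical function returning a reverberated copy of its input
    """
    distance_m = max(reference_m, min(distance_m, max_m))
    direct_gain = reference_m / distance_m                    # roughly 1/r level drop
    wet_ratio = (distance_m - reference_m) / (max_m - reference_m)
    wet = apply_reverb(dry)
    return direct_gain * ((1.0 - wet_ratio) * dry + wet_ratio * wet)
```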

9 Psychoacoustic Tricks
When talking about improving the perceived quality of our productions and mixes, it's easy to focus mainly on the technology we use, or the layout and acoustic treatment of our listening environment. It's easy to forget that just as important as our software, equipment and studio is what happens to the signal after all of that, once the sound has made it into our internal hearing system – our ears and brains – and how that sound is actually experienced. So that's what this article is about: some tips and tricks for operating and manipulating your listeners' built-in 'equipment'!
"If real is what you can feel, smell, taste and see, then 'real' is simply electrical signals interpreted by your brain…"
This basically falls under the heading of psychoacoustics, and with knowledge of a few psychoacoustic principles, there are ways you can essentially 'hack' the hearing system of your listeners to bring them a more powerful, clear and exciting, 'larger than life' experience of your music.

How We Hear Pitch
When two sounds happen very close together, we hear them as one. This surprising phenomenon is the basis of musical pitch — and there are lots of ways to exploit it in sound design.
Emmanuel Deruty
Films and television programmes consist of a series of individual still images, but we don't see them as such. Instead, we experience a continuous flow of visual information: a moving picture. A comparable phenomenon is fundamental to human hearing, and has huge consequences for how we perceive sound and music.
Perceptual Integration
The ear requires time to process information, and has trouble distinguishing audio events that are very close to one another. Take two short audio samples and play them one second apart. These two thresholds are no mere curiosities. Changing a single parameter can radically change the nature of sound(s).
Two Become One
In audio production, there are lots of situations where you can take advantage of this merging.
Creating Pitch
AM Synthesis: Tremolo Becomes Timbre
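The 'tremolo becomes timbre' heading can be demonstrated with a few lines of Python/NumPy (my sketch, not code from the article): amplitude-modulating a sine carrier at a few hertz is heard as tremolo, while modulating it at a rate well above roughly 20 Hz fuses into a single tone with a changed timbre, because the sidebands at carrier plus/minus modulator frequency become part of the spectrum.

```python
import numpy as np

SAMPLE_RATE = 44100

def am_tone(carrier_hz, mod_hz, seconds=2.0, depth=0.5):
    """Amplitude-modulate a sine carrier: slow modulation is heard as tremolo,
    fast modulation fuses into the tone's timbre (sidebands at carrier +/- mod)."""
    t = np.arange(int(seconds * SAMPLE_RATE)) / SAMPLE_RATE
    modulator = 1.0 + depth * np.sin(2 * np.pi * mod_hz * t)
    return modulator * np.sin(2 * np.pi * carrier_hz * t)

tremolo = am_tone(440.0, mod_hz=4.0)    # clearly audible as a pulsing 440 Hz tone
timbre  = am_tone(440.0, mod_hz=120.0)  # the pulsing fuses into a single, richer-sounding tone
```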

How The Ear Works
The most important pieces of hardware in any studio are the ones on the sides of your head. We look at the amazing journey your music makes on its way to the brain...
Emmanuel Deruty
In this article, we're going to have a close look at the tool we all use every day: the ear. Note that this article won't attempt to cover psychoacoustics in its entirety.
The Recording Studio In Your Head
We'll start our study of the ear by looking at Figure 1.
Figure 1: The morphology of the human ear (diagram derived from Chittka L, Brockmann A (2005): Perception Space — The Final Frontier, www.plosbiology.org).
What we call 'sound' is in fact a progressive acoustic wave — a series of variations in air pressure, spreading out from whatever source made the sound. This extraordinary signal path encompasses four distinct states of information: acoustic, mechanical (solid), mechanical (liquid), and electric (more specifically, electro-chemical).
Figure 2: A complex set of audio devices!
Conclusion

Ambience and Interactivity
'Ambience' is a word with a broad definition. It is perceived differently and can mean different things to different people. Continuing on that train of thought, I dug into the world of ambience in interactive mobile applications with:
Yann Seznec: Musician, sound designer, artist and founder of Lucky Frame – designers of the award-winning iPhone apps Pugs Luv Beats and Bad Hotel
Peter Chilvers: Musician and software designer, best known for the series of iPhone apps created with Brian Eno – Bloom, Trope, Air, Scape
Robert Thomas: Interactive composer and CCO at RjDj – The Inception App, The Dark Knight Rises Z+, Dimensions
Stephan Schütze: Composer and sound designer, Director of Sound Librarian and creator of the iPhone app Carmina Avium
DS: What does ambience mean to you?
Scape
DS: Ambience = Mood.
YS: I suppose it depends what you mean by the equals sign!
Bad Hotel

Psychoacoustics: Nine Components of Sound
Loudness
The loudness of a sound depends on the intensity of the sound stimulus. A dynamite explosion is louder than that of a cap pistol because of the greater number of air molecules the dynamite is capable of displacing. Loudness becomes meaningful only if we are able to compare it with something. The sound of a gunshot may be deafening in a small room, but go practically unnoticed if fired in a subway station while a train is roaring past. (A film that uses this in the narrative is Sleepers, when a man is shot at the same time as an aircraft is landing.)
Equal loudness
Humans are most sensitive to frequencies in the midrange (250 Hz - 5000 Hz). When two sounds, a bass sound and a midrange sound, are played at the same decibel level, the listener perceives the midrange sound to be louder. This is why a clap of thunder in a horror movie may contain something as un-weatherlike as a woman's scream.
Rhythm
Rhythm is a recurring sound that alternates between strong and weak elements.
Attack
Sustain
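To illustrate the equal-loudness point numerically, the Python sketch below (my addition) uses the standard A-weighting curve as a rough stand-in for the ear's reduced low-frequency sensitivity at moderate levels: tones played at the same SPL come out with very different A-weighted values depending on frequency.

```python
import math

def a_weighting_db(f_hz):
    """A-weighting in dB (IEC 61672 formula), a rough stand-in for the ear's
    reduced sensitivity away from the midrange at moderate listening levels."""
    f2 = f_hz ** 2
    ra = (12194.0 ** 2 * f2 ** 2) / (
        (f2 + 20.6 ** 2)
        * math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
        * (f2 + 12194.0 ** 2)
    )
    return 20.0 * math.log10(ra) + 2.0

for f in (60, 250, 1000, 4000):
    print(f"{f:>5} Hz tone at 70 dB SPL ~ {70 + a_weighting_db(f):.1f} dB(A)")
# The 60 Hz bass tone reads far lower on the A-weighted scale than the
# midrange tones, matching the 'bass sounds quieter at equal dB' observation.
```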

The Real Neuroscience of Creativity | Beautiful Minds So yea, you know how the left brain is really realistic, analytical, practical, organized, and logical, and the right brain is so darn creative, passionate, sensual, tasteful, colorful, vivid, and poetic? No. Just no. Stop it. Please. Thoughtful cognitive neuroscientists such as Anna Abraham, Mark Beeman, Adam Bristol, Kalina Christoff, Andreas Fink, Jeremy Gray, Adam Green, Rex Jung, John Kounios, Hikaru Takeuchi, Oshin Vartanian, Darya Zabelina and others are on the forefront of investigating what actually happens in the brain during the creative process. The latest findings from the real neuroscience of creativity suggest that the right brain/left brain distinction is not the right one when it comes to understanding how creativity is implemented in the brain.* Creativity does not involve a single brain region or single side of the brain. Importantly, many of these brain regions work as a team to get the job done, and many recruit structures from both the left and right side of the brain.
