Unconscious Communications: Connecting with your Audience on a Neurological Level

“The greater part of our intellectual activity goes on unconsciously and unfelt by us”

– Friedrich Nietzsche (1882)

Nietzsche’s comment still rings true today, especially in the realm of communications. As we mentioned in a previous blog post, audience impact depends heavily on the characteristics of a person’s voice, such as pitch, tone, and amplitude. Recent studies suggest we do much more than consciously tune in to these characteristics: when we communicate with someone, our brains become “coupled.” According to researchers at Princeton University, two people in dialogue go through a process of interactive alignment, converging on the same speech sounds, grammatical forms, words, and meanings. When one person speaks, a particular part of their brain is engaged in choosing their words and manner of speaking. That speech activates the corresponding part of the listener’s brain, prompting them to respond in kind.

But the coupling of brains is much more profound than simple speech imitation. Neurons in the brain oscillate at a rhythm similar to that of speech (between 3 and 8 Hz). When two people speak to each other, the rhythm of their speech aligns with the rhythm of these brain oscillations. This is partly why you are more likely to hear and understand a person when you can watch their face, especially in noisy environments: the movement of the mouth divides speech into syllables, emphasizing its rhythm, and this visual feedback helps the brain analyze what is being said. According to the same Princeton study, “In noisy situations such as a cocktail party, watching a speaker’s face is the equivalent of turning up the volume by 15 decibels”.
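To make the 3–8 Hz figure concrete, here is a toy sketch (not from the Princeton study) of how a syllable-rate rhythm can be recovered from an audio signal. The sample rate, carrier frequency, and 5 Hz “syllable rate” are all illustrative assumptions; real speech analysis would use recorded audio and more careful envelope extraction.

```python
import numpy as np

fs = 1000                      # sample rate in Hz (assumed for this sketch)
t = np.arange(0, 4.0, 1 / fs)  # 4 seconds of signal

# Toy "speech": a 200 Hz carrier amplitude-modulated at a 5 Hz
# syllable rate, inside the 3-8 Hz range mentioned above.
syllable_rate = 5.0
envelope = 0.5 * (1 + np.sin(2 * np.pi * syllable_rate * t))
signal = envelope * np.sin(2 * np.pi * 200.0 * t)

# Recover the amplitude envelope by rectifying the waveform and
# smoothing with a simple 50 ms moving average.
rectified = np.abs(signal)
window = np.ones(50) / 50
smooth = np.convolve(rectified, window, mode="same")

# The dominant frequency of the envelope sits at the syllable rate.
spectrum = np.abs(np.fft.rfft(smooth - smooth.mean()))
freqs = np.fft.rfftfreq(len(smooth), d=1 / fs)
peak = freqs[np.argmax(spectrum)]
print(round(peak, 2))  # ~5.0 Hz
```

The point of the sketch is that the slow rise and fall of speech volume, syllable by syllable, is itself a measurable oscillation in the same frequency band as the neural rhythms described above.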

This type of quantifiable communication research applies not just to dialogues, but also to situations in which we are giving a speech or presentation. Just as two brains couple in a dialogue, different parts of the same brain (such as the vocal and auditory regions) couple during a monologue. Our brains automatically adjust our manner of speaking based on environmental cues. By quantifying this non-verbal communication, we can not only understand it better through analytics, but also learn to shape our behaviors to connect with our audiences on a deep, neurological level.
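As a minimal sketch of what “quantifying” voice characteristics can look like, the snippet below measures the amplitude and dominant pitch of a toy signal, and converts the 15-decibel figure quoted above into plain ratios. The 220 Hz tone is a stand-in assumption, not real speech, and production tools would use far more sophisticated pitch tracking.

```python
import numpy as np

fs = 8000                      # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
# Toy voiced sound: a 220 Hz tone standing in for a speaker's voice.
voice = 0.3 * np.sin(2 * np.pi * 220.0 * t)

# Amplitude: root-mean-square level of the waveform.
rms = np.sqrt(np.mean(voice ** 2))

# Pitch: dominant frequency, taken here as the FFT magnitude peak.
spectrum = np.abs(np.fft.rfft(voice))
freqs = np.fft.rfftfreq(len(voice), d=1 / fs)
pitch = freqs[np.argmax(spectrum)]

# The "15 decibels" quoted above corresponds to these ratios:
power_ratio = 10 ** (15 / 10)      # ~31.6x more acoustic power
amplitude_ratio = 10 ** (15 / 20)  # ~5.6x more amplitude

print(round(pitch, 1), round(rms, 3), round(amplitude_ratio, 2))
```

In other words, the claimed benefit of watching a speaker’s face is roughly a thirtyfold boost in effective signal power, which is what makes features like pitch and amplitude worth measuring in the first place.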