Recently I attended a concert featuring the premiere of an up-and-coming composer's work. She gave a brief talk before her piece was played, during which she explained the complex symbolism of her work. The musical notes weren't just noises; they were intended to convey a meaning above and beyond a mere sequence of sounds. But if her music really did convey such deep meaning, why did she have to explain it to the audience beforehand? Can music ever express semantic meaning directly, without requiring a composer or someone else to "translate" for us?
Certainly not all music is as difficult to interpret as the piece I heard that night, which featured such innovations as playing every note on the scale simultaneously on different instruments (I've already forgotten what this was supposed to signify). The Flight of the Bumblebee, for example, really does sound sort of like a bumblebee. Do people who don't know that work's title think of bees the first time they hear it? Perhaps more importantly, if they do think of bees, are they thinking about them the same way as if they'd heard the word "bumblebee," or does musical "meaning" necessarily differ from meaning expressed in words?
A team led by Stefan Koelsch believes they have designed a set of experiments that can answer these questions. The experiments rely on a known brain response to "semantic priming." Priming occurs when exposure to a word makes related words and concepts easier to process. Continuing with the "bee" example: if we read the word "bee" and are then asked to perform some sort of task on a related word or concept, we work faster and more accurately. We might, for example, be faster at unscrambling the letters VEIH to form the word "hive." We have been primed to think about bees, and so we're better at dealing with bee-related concepts, from honey to stings. But what if we heard Flight of the Bumblebee for the first time, without being told what the song was about? Would we still be better at handling bee-related language? In other words, does bee music prime as effectively as the word bee?
Neuroscientists have known for decades that a specific pattern in brain activity is associated with semantic priming for words. When brain activity while reading is measured with an electroencephalograph (EEG), the results show a strikingly different pattern when a word has been primed compared to unprimed words. The pattern is revealed in a component of the EEG results known as N400.
Koelsch's team wanted to know if music that's associated with a specific word can prime as effectively as the word itself. They recorded dozens of different musical clips from commercial CDs, each associated with a word in one of several possible ways. For example, a musical clip might literally sound like a "bird," or it might sound "wide," in a much more abstract sense. In a preliminary experiment, they selected the 88 clips that listeners consistently associated with the "correct" choice.
In their first experiment, listeners heard a musical excerpt or a sentence (a prime), then were presented with a word on-screen that was either related or unrelated to the prime. They then indicated whether the prime and the word were related, while brain activity was measured with an EEG. The chart below shows typical results for four different primes, each paired with the same German word, Weite (wideness):
The blue and purple plots represent the language primes; the graph on the right shows the relevant portion of the EEG results. A clear difference is seen in the area labeled N400, which occurs just under a half-second after the word appears: when a word has been primed by related language, the N400 is significantly more negative than when the prime is unrelated. The red and orange plots show a similar pattern for related (Strauss's Salome) versus unrelated (Valpola's E-minor piece for accordion) musical primes.
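The logic of this comparison can be sketched numerically: the "N400 effect" is simply the difference in mean EEG amplitude (in a window around 400 ms after the target word) between unrelated-prime and related-prime trials. Here's a toy simulation of that analysis; all the numbers are invented for illustration and don't come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulation (invented numbers): EEG amplitude in the 300-500 ms
# window after the target word, in microvolts, for 40 trials each.
# Unrelated primes should produce a more negative N400 than related primes.
related = rng.normal(loc=-1.0, scale=2.0, size=40)
unrelated = rng.normal(loc=-4.0, scale=2.0, size=40)

# The N400 effect is the difference of the trial-averaged amplitudes.
n400_effect = unrelated.mean() - related.mean()
print(f"Mean related:   {related.mean():.2f} uV")
print(f"Mean unrelated: {unrelated.mean():.2f} uV")
print(f"N400 effect:    {n400_effect:.2f} uV")  # negative: unrelated dips lower
```

The study's key result is that this same difference shows up whether the prime is a sentence or a musical excerpt.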
You can listen to these musical excerpts on the Nature site and judge for yourself whether you think they convey meaning. Here's the Strauss, which the pre-testers found to be related to the concept "wide."
Now here's the Valpola, which was said to be unrelated to "wide."
Do you hear "meaning" here? Arguably, the meaning in these musical works was only clear when listeners were overtly asked whether the words were related to the music. To address this concern, the same stimuli were presented to a new set of listeners, but this time, instead of judging whether the words were related to the primes, listeners were simply told to pay close attention because they would be tested on the music and words later. The same N400 activity was observed as in the first experiment.
Koelsch et al. conclude that for these examples, at least, music conveys semantic meaning in the same manner as words. While in some cases listeners weren't as accurate at determining when musical primes were related to the target words (compared to linguistic primes), when the meaning was correctly established, it had the same priming effect as language.
Music, it appears, can convey much more meaning than we thought it did. Perhaps I didn't need that composer to tell me what her work "meant" -- her audience may have understood much more than she ever believed it could.
Koelsch, S., Kasper, E., Sammler, D., Schulze, K., Gunter, T., & Friederici, A. D. (2004). Music, language, and meaning: Brain signatures of semantic processing. Nature Neuroscience, 7(3), 302-307.
If music can usefully convey information, then why is the world not all whistling or yodeling?
I work with many foreign-born people who have no idea what our music is supposed to convey, and I'm at a loss trying to explain it to them because I don't know myself.
Examine the music put into televised advertising and see if you can figure out what messages are being conveyed. Generally, I think the point is to irritate the viewers to defeat their defenses against peddlers.
"For I consider that music is, by its very nature, essentially powerless to express anything at all, whether a feeling, an attitude of mind, a psychological mood, a phenomenon of nature, etc....Expression has never been an inherent property of music. That is by no means the purpose of its existence. If, as is nearly always the case, music appears to express something, this is only an illusion and not a reality. It is simply an additional attribute which, by tacit and inveterate agreement, we have lent it, thrust upon it, as a label, a convention - in short, an aspect which, unconsciously or by force of habit, we have come to confuse with its essential being."
- Igor Stravinsky
why is the world not all whistling or yodeling?
You imply that music isn't very important. Then why is the world spending a fortune on iPods and car stereos?
a label, a convention - in short, an aspect which, unconsciously or by force of habit, we have come to confuse with its essential being.
That's actually a reasonable objection to this study, and one which the authors acknowledge. Nonetheless, it's interesting that the brain, in some ways at least, appears to make no distinction between the label and the real thing.
This research (or your summary thereof) seems to imply that there is a meaningful and universal relationship between certain sounds and certain meanings. This would indicate that music is actually very different from language, in which (for the vast majority of words) the relation between sound and meaning is completely arbitrary and must be learned, quite like the up-and-coming composer was trying to teach you the 'language' of her music.
I think there is a difference between basic emotional or conceptual meanings that can be conveyed, and the complex symbologies that inspired the composer. She was explaining how she organized the music, how she was able to make sense of her emotional and artistic thoughts. But that organization does not have to be cognitively explicit for the audience. I wrote something about this here.
I think this experiment is intriguing for a number of reasons.
I think that there is already evidence to suggest that we derive meaning from music, even if you restrict most of it to a mood or emotion. In many movies the soundtrack is literally what makes it a great movie.
Furthermore, our method of communication originated from making systematic noises to each other. Animals communicate with noises, and some are more musical than others (birds). So I think deriving meaning from music is quite plausible.
Music doesn't convey as much information as speech, possibly because it appeared before humans evolved speech like they have today (See this Babel's Dawn post). We add language to music when we want to transmit more specific information; otherwise music seems to most listeners to just convey emotion.
I'm currently reading This is your Brain on Music -- so far, a fascinating read. I may review it here when I'm finished.
Wow, talk about coincidence: I just wrote about this study today on my blog. Oops!
Well, Chris, you should link to it, then! Two is better than one!
I like the way you describe the N400:
Normally, when people hear a sentence followed by an unexpected word (e.g., "The shackles allow only a little movement. ... Wideness.") their scalp electrical activity sharply dips around 400 milliseconds after the unexpected word. In contrast, no negative dip in electrical activity is seen if the word is more contextually appropriate - such as "restrained." Likewise, this wave does not appear for words that are only grammatically incorrect. Therefore, this wave (called the N400) seems to be a signature of semantic processing by the brain.
Much clearer than my explanation!
So did you hear about this study from the "This is your brain on music" book? I was assigned to read it for my "ERP of Language" class (which meets in about 20 minutes actually, so I need to run).
This and the other language ERP effects definitely convey a "domain general" view of language: N400's for music and pictures, and the similarities of the P600 (the equivalent of N400 for grammar) with oddball sounds etc. Fascinating stuff!
No, I didn't hear about it from the book. As is the case with most CogDaily articles, Greta found it. She might use it for her Psychology Goes to the Movies seminar, I'm not sure.
Assuming we're dealing with 'pure' music -- i.e., no words, no titles, etc., any of which can't help but contaminate the music with extra-musical associations -- I'd point out that when Gordon Worley says it "conveys emotion", or when 'TH' writes: "...there is already evidence to suggest that we derive meaning from music", although 'derive' might be pretty close, I think that 'recall' might be a more accurate word. Certainly as regards specific meanings, but also regarding less-specific emotional states, whatever music 'conveys' has to be already assigned by the culture (along with its accompanying myriad layers of subcultures). I've yet to see (hear?) even a single purely musical 'message' or state that is universally recognized by every human, no matter their time or culture.
I think there's a lot of confusion about what music and language are. They both consist of sounds: it's just that with language we've all agreed what each sound means, and we're "trained" in it from an extremely early age. There is much, much less consistency in music than language (in terms of style, association, construction, etc). As a music theorist, I teach musicians about the systems of music, and we get into a lot of issues of meaning - within each tradition of music, just as within each language, there are lots of accepted "meanings" (certain kinds of half step descents are sighs, etc.), and there are lots of structural and functional "meanings." The wonderful thing is that most such meanings are multivalent and often ambiguous, in some ways making music's "meaning" much richer (because less specific) than language's. I don't put a lot of stock in scientific studies of musical perception, since, again, "training" in it is so inconsistent (even among musicians).
And relating to just how difficult all this is in plain old Language, let alone Music, it's worth reading this small article by Colin Bower in The New English Review:
Question: why are we consistently looking for links between language and music?
Exactly when did we start to theorize music as a subclass of language, and under what circumstance? Why is this hypothetical relationship so much more compelling and durable than others (e.g. in comparison to a hypothetical link between dance and architecture)?
why are we consistently looking for links between language and music?
We look for links when they provide fruitful results, and the relationship between music and language has delivered.
Also, I think the music studies just tend to get more press because so many people like music. Greta published a very interesting paper on the relationship between mental rotation and representational momentum, but somehow that didn't "catch on" in the media.
I believe that music impacts every one of us differently. God created some of us to hate music, and others to love it. However, even for those that hate music, it still elicits some form of emotion -- just by saying and deciding that one likes or hates music. Emotion is the basis of our language. Every sentence we speak, or thought we think, is supported by emotion. Thus, I would argue that music is not a form of language; rather, it supports the way in which we essentially communicate, by eliciting emotion. For example, in Symphonie Fantastique, by Berlioz, one can only postulate why he wrote what he did, and the story that he is trying to communicate -- yet I believe that the only solid, concrete conclusion one can make is the emotion they felt when listening to that piece of music. The precise story conveyed can never be really known.
I think that the meaning that people "derive" from music is more about relation to other events in their life. Most people associate toned-down, low-sounding music with sadness, probably because in movies, that type of music is played when a sad event is going on. It would be very unlikely for Mozart's "Jupiter" to be played while the main character in a movie is dying of cancer. So, like others said, I think that association with certain emotions or words (for the listener, not the composer, as they have their own personal meaning behind any work) is due to media. If one had never viewed a movie or TV show that has music in it, they would never associate it with any emotion because they would never have seen it playing alongside an emotional event.
late question: have there been any studies done where persons who have not been socialized with western music have been asked to try and convey its meaning? Or vice versa? I'd like to know if an African bushman could label Barber's "Adagio for Strings" as (desperately) "sad" or "tragic" or whether it would have no meaning. Similarly, could a study be done with persons who have their hearing restored in adult life (immediately after the operation, and before socialization via TV etc.)?
American cognitive psychologist, neuroscientist, record producer, musician and writer, Daniel J. Levitin will be speaking at Chautauqua. He has worked as a producer and sound designer on albums by Blue Öyster Cult, Chris Isaak and Joe Satriani; as a consultant to Stevie Wonder and Steely Dan; and as a recording engineer for Santana and The Grateful Dead.
He is speaking at The Chautauqua Institution the week of August 13-17th. Chautauqua is a great place to vacation. I recommend you check out the schedule. For further information you can review the program brochure located at: http://www.ciweb.org/SUMMERatchautauqua_web.pdf
Plan a great summer full of intellectual and spiritual nourishment.
In Indian classical music we have something called a raga, which is a specific combination of notes. Each raga evokes a certain emotion like love, sorrow, hatred, peace, etc. You do not have to know classical music to feel it; just by listening to those notes you can feel the emotion. There must be something to the theory that music conveys meaning.
I'm NEW to this website, thus a bit LATE to comment, but I want to take #14's response a step further. I think listeners discover and/or assign meanings to the music they hear... even if there IS text! As a professional musician who looks for (even creates) DRAMA in the classical music I play, I find the audience picks up on that SENSE of drama, hopefully enjoying it more. For ME, music WITHOUT emotion, drama or direction is not worth playing.
In Western music, extra-musical conventions were DEVELOPED to IMITATE or express common human events/states (storms, horse galloping, sighs, grief, joy, pastorale). Clearly these are properties of language, albeit less efficient than words. But the great strength of music (and other fine arts) is that it ALSO expresses events/states in ways that words CANNOT. Like the end of a great symphonic movement where a profound modulation REDEEMS the tragic into triumph! With repeated experiences of the same music (years), MEANING, if sought, justifies listening! But it certainly won't be exactly the "meaning" the composer had in mind.
The philosopher Susanne Langer once argued that music, as a human activity, probably came along before languages were developed.
Personally I think it's safe to say that language contains and uses musical elements, but the reverse is unlikely. Music is not a language. It's something else altogether.
Birds do not toss and turn sleeplessly at night wishing they could sing Mozart's "Batti batti o bel Masetto."