A global study comparing the melodic and rhythmic abilities of almost half a million people has found subtle differences between speakers of “tonal” versus “non-tonal” languages.
The massive investigation assessed the musical abilities of people from 203 different countries who are native speakers of 54 different languages.
The research was conducted by The Music Lab, a citizen science collaboration between the University of Auckland, NZ, and Yale University that seeks to learn how the human mind creates and perceives music.
A tonal mother-tongue was found to improve a person’s ability to identify slightly different melodies, but at the expense of rhythm. On the other hand, non-tonal speakers are better able to tell when a rhythm is beating in time with music.
The advantages in melodic perception for tonal speakers and in rhythmic ability for non-tonal speakers were each about half the size of the improvement gained from taking music lessons, the researchers found.
Languages are considered tonal if the pitch of the word changes not only the nuance of the word, but its core meaning. In the Cambridge Handbook of Phonology (2007) edited by Paul de Lacy, linguist Professor Moira Yip, now at University College London, gives an example of this: “In Cantonese, for example, the syllable [jau] can be said with one of six different pitches, and has six different meanings” from ‘worry’ to ‘again’ to ‘paint’.
In contrast, non-tonal languages like English might use pitch to express emotion or signify a question, but not the core meaning of the word.
Tonal languages account for as much as 70 percent of the world’s languages by some estimates. They include languages with enormous numbers of speakers, such as Mandarin.
“We grow up speaking and hearing one or more languages, and we think that experience not only tunes our mind into hearing the sounds of those languages but might also influence how we perceive musical sounds like melodies and rhythms,” says Dr Courtney Hilton, a cognitive scientist at Waipapa Taumata Rau (University of Auckland) and Yale, who co-led the study.
A total of 493,100 participants were involved in the citizen science project. Of them, 34,034 were native speakers of tonal languages, 16,868 spoke one of six pitch-accented languages, and 442,198 had one of 29 non-tonal languages as their mother tongue.
Research participants were given three musical tasks. One test assessed melody by asking the question: is this melody the same as the others? Another tested rhythm by asking: is the drum beating in time with the song? The final test assessed pitch perception by questioning whether a vocalist is singing in tune.
The participants were given increasingly difficult tests.
Overall, the type of language spoken was found to impact melodic and rhythmic ability, but not pitch perception.
The researchers believe the slight rhythmic disadvantage among tonal speakers may be due to a trade-off in attention. “It’s potentially the case that tonal speakers pay less attention to rhythm and more to pitch, because pitch patterns are more important to communication when you speak a tonal language,” says Hilton.
The researchers were also careful to take cultural upbringing into account by assessing languages that have a wide geographic distribution.
“Prior studies mostly just compared speakers of one language to another, usually English versus Mandarin or Cantonese,” says research co-leader Jingxuan Liu, a native Mandarin speaker who started working on the project as an undergraduate student at Duke University in the US. “English and Chinese speakers also differ in their cultural background, and possibly their music exposure and training in school, so it’s very difficult to rule out those cultural factors if you’re just comparing those two groups.”
“We still find this effect even with a wide range of different languages and with speakers who vary a lot in their culture and background, which really supports the idea that the difference in musical processing in tonal language speakers is driven by their common tonal language experience rather than cultural differences,” says Liu.
Variation within both the tonal and non-tonal language groups was observed in the study, but more research is needed to determine smaller-scale patterns, the researchers say.
Studies such as these, according to the researchers, help to tease out the relationships between music, language and culture in the mind.
The research is published in the journal Current Biology.
Originally published by Cosmos as Does your native tongue give you rhythm or melody?
Evrim Yazgin has a Bachelor of Science majoring in mathematical physics and a Master of Science in physics, both from the University of Melbourne.