This is Scientific American 60-Second Science. I'm Cynthia Graber. Got a minute?
Language can take many forms: spoken, written, even gesticulated, as with American Sign Language.
Regardless of the form a language takes, the left hemisphere of the brain dominates the processing of linguistic information.
But the right hemisphere plays a greater role in processing acoustic features such as pitch and melody.
Which is why researchers were curious about how the brain processes whistled Turkish.
Yes, that's Turkish, but being whistled rather than spoken.
Before the advent of telephones, mountain communities separated by valleys adapted the spoken language into whistled sounds that could be heard up to a couple of kilometers away.
For instance, one whistled phrase means, "Do you have fresh bread?"
Another translates to: "Who won the game?"
One member of the research team said that at first he could not recognize the whistles as Turkish, but within a week he was able to pick out words.
So what's going on in the brains of people who hear and understand these sounds?
When researchers played spoken Turkish syllables through headphones, the subjects' right ears did the most work.
Again, the right ear links to the brain's left hemisphere, the usual primary site for spoken information processing.
But when whistled Turkish syllables were played into the headphones, the subjects' left and right ears shared the task equally, indicating that the two brain hemispheres are both heavily involved in working out the whistles.
The findings are in the journal Current Biology.
The researchers write that "a natural but acoustically different language can create a radical change in the organization dynamics of language asymmetries."
In other words, the brain will adjust quickly to make sense of incoming information.
Oh, and that first whistle you heard?
It means "I speak whistled Turkish."
Thanks for the minute, for Scientific American 60-Second Science. I'm Cynthia Graber.