Researchers map individual words to specific brain regions

Researchers at the University of California, Berkeley, have detailed how the so-called "semantic system" in the human brain works, and their work could one day help shape treatments for injuries and diseases that affect a person's ability to speak. The study's lead author, Alex Huth, was one of several volunteers who listened to more than two hours' worth of radio shows while lying inside an fMRI machine, shedding light on how the brain reacts to words and, eventually, helping create a map of sorts.

The brain's outer layer of tissue, the cerebral cortex, is known to play a role in some of humanity's higher functions, including language. The fMRI recordings let researchers see how this part of the brain reacted to the audio the volunteers were hearing. Rather than simply noting which regions lit up, they matched each active region to the particular words that were playing at the time.

By doing so, the researchers were able to create what is essentially a map of word clusters associated with activity in various parts of the brain. Both halves of the brain are involved in the semantic system, with more than 100 areas of the cerebral cortex showing activation. Words, at least in some cases, activate parts of the brain related to their meaning; if a word has multiple meanings, more than one region can light up.
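To make the general idea concrete, here is a minimal, purely illustrative sketch of this kind of mapping: a regression model predicts each measurement point's ("voxel's") activity from features of the words being heard, and each voxel is then labeled with the word category that best predicts it. The categories, data, and model choices below are invented assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

categories = ["clothing", "measurement", "social", "number"]
n_words, n_voxels = 200, 50

# Toy word features: each presented word belongs to one semantic category (one-hot).
word_category = rng.integers(len(categories), size=n_words)
X = np.eye(len(categories))[word_category]            # shape: (words, categories)

# Simulated fMRI responses: each voxel "prefers" one category, plus noise.
voxel_pref = rng.integers(len(categories), size=n_voxels)
true_weights = np.eye(len(categories))[voxel_pref].T  # shape: (categories, voxels)
Y = X @ true_weights + 0.3 * rng.standard_normal((n_words, n_voxels))

# Fit one ridge regression per voxel: which semantic features predict its activity?
model = Ridge(alpha=1.0).fit(X, Y)

# Label each voxel with its best-predicting category -- a crude "semantic map".
recovered = model.coef_.argmax(axis=1)                # shape: (voxels,)
print("fraction of voxels labeled correctly:", (recovered == voxel_pref).mean())
```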

One example is the word 'top,' which can refer to an article of clothing, such as a shirt, or to a position, such as the top of a bookcase. It isn't surprising, then, that researchers observed that both the areas of the brain that handle measurements and the areas associated with clothing and appearance-related words showed activity upon hearing the word.

The map shows both large regions and the parts of language that fall within them, but it also allows researchers to narrow their focus to a specific area or a specific category of words and see how they're processed by the brain. Among other things, such detailed maps could one day allow mute patients to 'speak' by monitoring their brain activity and translating it into words.

VIA: BBC