
For the first time, neural activation of language has been localized to the cellular level

WBOY (Original)
2024-07-16 12:12:59

The highest-resolution neuron map encoding word meaning to date is here.


Humans derive rich and subtle meaning from language, an ability that is crucial to human communication. Yet despite a growing understanding of the brain regions that support language and semantic processing, much remains unknown about how meaning is represented at the cellular level.

Recently, a research paper published in Nature revealed the fine-grained cortical representation of semantic information by single neurons, obtained by tracking neuronal activity during natural speech processing. The paper is titled "Semantic encoding for language understanding at single-cell resolution".

Paper address: https://www.nature.com/articles/s41586-024-07643-2

The study produced the highest-resolution map to date of the neurons responsible for encoding the meanings of different words.

The researchers recorded single-cell activity in the language-dominant left prefrontal cortex while participants listened to semantically diverse sentences and stories. It turns out that, across individuals, the brain uses the same standard categories to classify words, helping us translate sounds into meaning.

These neurons respond selectively to specific word meanings and reliably distinguish words from non-words. Moreover, rather than treating words as fixed memory representations, their activity is highly dynamic: it reflects a word's meaning according to its specific sentence context, independent of the word's phonetic form.

Overall, the study shows how these cell populations can accurately predict, in real time, the broad semantic categories of words heard during speech, and how they track the sentences in which those words appear. It also shows how hierarchical structures of meaning are encoded and how those representations map onto groups of cells. These findings reveal, at the neuronal scale, the fine cortical organization of human semantic representations and begin to elucidate cellular-level meaning processing during language comprehension.

One neuron is responsible for everything

The same groups of neurons respond to words of similar categories (such as actions, or people-related words). The research found that the brain associates certain words with each other (such as "duck" and "egg"), which trigger the same neurons. Words with similar meanings (such as "mouse" and "rat") trigger patterns that are more alike than those of words with different meanings (such as "mouse" and "carrot"). Other neurons respond to abstract concepts, such as the relational words "above" and "behind".
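One way to make "more similar patterns" concrete is to compare population firing-rate vectors with cosine similarity. The sketch below is purely illustrative: the neuron counts and firing rates are invented, not taken from the paper, and the paper's actual analysis is not reproduced here.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two firing-rate vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical mean firing rates (spikes/s) of five recorded neurons per word.
rates = {
    "mouse":  [12.0, 3.0, 8.5, 0.5, 6.0],
    "rat":    [11.0, 2.5, 9.0, 1.0, 5.5],
    "carrot": [1.0, 10.0, 0.5, 7.5, 2.0],
}

# Similar meanings -> similar population patterns -> higher cosine similarity.
print(cosine_similarity(rates["mouse"], rates["rat"]))     # high
print(cosine_similarity(rates["mouse"], rates["carrot"]))  # low
```

With these toy vectors, "mouse" vs "rat" scores far higher than "mouse" vs "carrot", mirroring the qualitative finding described above.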

The categories assigned to words were similar across participants, suggesting that human brains group meanings in the same way.

Prefrontal cortex neurons differentiate words by their meaning, not their sound. For example, when participants heard "son", neurons associated with family members fired; when they heard the homophone "sun", those same neurons did not respond.


Mind reading

Building on these findings, the researchers could, to some extent, determine what people were hearing by observing their neurons' firing. Although they could not reproduce exact sentences, they could tell, for example, that a sentence contained an animal, an action, and a food, and that they appeared in that order: animal, action, food.
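The kind of coarse decoding described above can be sketched as a nearest-template classifier: assign each word's population response to the closest category. This is a toy stand-in, not the authors' method; the category templates, neuron count, and simulated responses below are all invented for illustration.

```python
import math

# Hypothetical per-category "templates": mean population firing rates
# (spikes/s across four neurons) for each broad semantic category.
TEMPLATES = {
    "animal": [9.0, 1.0, 2.0, 0.5],
    "action": [1.0, 8.5, 1.5, 2.0],
    "food":   [0.5, 1.5, 9.5, 1.0],
}

def decode_category(population_rates):
    """Return the category whose template is nearest (Euclidean distance)."""
    return min(TEMPLATES, key=lambda c: math.dist(TEMPLATES[c], population_rates))

# Simulated responses, one per content word, in the order they were heard.
responses = [
    [8.5, 1.2, 1.8, 0.4],  # animal-like response
    [1.3, 8.0, 1.0, 2.2],  # action-like response
    [0.7, 1.8, 9.0, 0.9],  # food-like response
]
print([decode_category(r) for r in responses])  # ['animal', 'action', 'food']
```

Note that the decoder recovers only the category sequence (animal, action, food), not the exact words, which matches the level of detail the researchers report being able to read out.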

“Getting this level of detail and getting a glimpse of what’s happening at the level of individual neurons is very exciting,” said Vikash Gilja, an engineer at the University of California, San Diego and chief scientific officer of brain-computer interface company Paradromics. He was impressed that the researchers could identify not only the neurons that corresponded to words and their categories, but also the order in which they were spoken.

Gilja says recording information directly from neurons is much faster than previous imaging methods. Capturing speech at its natural speed will be important for future efforts to develop brain-computer interfaces: devices that could restore the ability to speak to people who have lost it.

Reference link: https://www.nature.com/articles/d41586-024-02146-6

