13 volunteers and a new technology for mapping neural activity: this is what happens in our brains when we speak and listen
The human ability to express ourselves through language remains mysterious and fascinating.
We take it for granted that each of us can grasp the meanings of tens of thousands of words, referring to objects, ideas, topics and concepts that are extraordinarily different from one another.
But we do not know how this is possible. Thanks to a mapping of the neural activity of 13 people at various stages of listening to sentences and stories, carried out by a team of neuroscientists at Massachusetts General Hospital led by Ziv Williams, we now have an idea of how it happens.
Innovative technique
By testing an innovative technique, the researchers showed that neurons use what is, in effect, a kind of "dictionary of synonyms and antonyms" to infer the meaning of the words we hear.
Perhaps even more importantly, they demonstrated that by analysing the activity of a relatively small number of neurons it is also possible to understand what a person is hearing, or thinking, at that moment.
The discovery could therefore prove very useful in countering one of the most disabling consequences of stroke: aphasia.
Lesions in the regions of the cerebral cortex dedicated to language functions leave these patients unable to understand language or to express themselves verbally.
Results of the study
The new technique used by the American researchers made it possible to record the activity of about a hundred neurons simultaneously. In this way, the scientists discovered how the brain matches words with their specific meanings and how it distinguishes certain meanings from others.
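To make the decoding idea concrete, here is a minimal, hypothetical sketch in Python: it simulates firing-rate vectors from about a hundred neurons and asks whether a simple linear classifier can predict the broad semantic category of each heard word above chance. This is not the researchers' actual analysis; the simulated firing rates, the five category labels and the choice of a plain logistic-regression decoder from scikit-learn are all illustrative assumptions.

# Minimal, hypothetical sketch (not the study's pipeline): can the semantic
# category of a heard word be decoded from ~100 simulated firing rates?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_neurons = 100                              # cells recorded at the same time
n_words = 600                                # heard words in the session (assumed)
categories = ["food", "animals", "actions", "people", "places"]  # assumed labels

# Simulated data: each heard word gets a firing-rate vector (one value per
# neuron) built from a per-category tuning profile plus noise.
y = rng.integers(0, len(categories), size=n_words)
tuning = rng.normal(0.0, 1.0, size=(len(categories), n_neurons))
X = tuning[y] + rng.normal(0.0, 1.5, size=(n_words, n_neurons))

# If the neurons carry semantic information, a linear decoder should predict
# the category of each word well above the chance level of 1 / len(categories).
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance ~ {1 / len(categories):.2f})")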
The primary processing of language input is perceptual and occurs in the auditory cortex for speech or the visual cortex for reading.
From there, the information flows to a language-selective, amodal network located in the left frontal and temporal regions, which maps word forms onto their meanings and combines them into unified representations.
This is how a sentence is constructed, both conceptually and verbally.
The processing of meaning extracted from language also involves widespread areas outside this selective network.
In fact, semantic processing may be distributed across the cortex, or it may instead converge on a number of semantic hubs that integrate meaning from language and from other modalities.
It is against this background that the new technique has gone beyond what traditional neuroscience approaches could achieve.
How neurons behave
The recordings were performed on participants undergoing planned intraoperative neurophysiological monitoring.
The 13 volunteers were awake during the procedures, which offered a unique opportunity to study the action potentials of single neurons during language comprehension tasks.
By monitoring the activity of neurons during natural speech processing, the researchers found that the cells responded selectively to the meanings of specific words and reliably distinguished words from non-words. Furthermore, rather than responding to words as fixed memory representations, their activity was highly dynamic, reflecting the meaning of each word in the context of the specific sentence and independently of its phonetic form, that is, of how it sounds.
Anticipating the most likely meaning
Collectively, the research shows how precisely these sets of cells predict the broad semantic categories of words as they are heard in real time, and how they track the sentences in which the words appear.
Some neurons can distinguish different words that sound alike and continuously anticipate their most likely meaning based on the context of the sentence.
Another important observation is that neuronal activity is highly context-dependent, reflecting the meanings of words in the specific sentences the participants heard, even when those words were phonetically indistinguishable.
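The role of sentence context in resolving words that sound identical can be illustrated with another small, hypothetical Python sketch: a prior over meanings derived from the sentence heard so far is combined with deliberately flat evidence from the sound itself, so the same acoustic input ends up with different most-likely meanings in different sentences. The homophones, contexts and probabilities below are invented for illustration and are not taken from the study.

# Minimal, hypothetical sketch of context-dependent disambiguation: the sound
# alone cannot separate the two meanings (flat likelihood), so the sentence
# context does all the work. Meanings, contexts and probabilities are invented.
import numpy as np

meanings = ["sun (the star)", "son (the child)"]  # phonetically identical words

# Assumed prior over the two meanings given the sentence heard so far.
prior_given_context = {
    "The sky cleared and the ... came out": np.array([0.9, 0.1]),
    "She hugged her ... at the station": np.array([0.1, 0.9]),
}

# Simulated evidence from the acoustic form alone: indistinguishable by design.
sound_likelihood = np.array([0.5, 0.5])

for context, prior in prior_given_context.items():
    posterior = prior * sound_likelihood
    posterior /= posterior.sum()
    best = meanings[int(np.argmax(posterior))]
    print(f"{context!r} -> most likely meaning: {best} (p = {posterior.max():.2f})")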