
Syllables that oscillate in neuronal circuits

What neuroscience can say about speech processing in the brain

Date:
June 10, 2015
Source:
University of Geneva
Summary:
Speech, whether produced or heard, generates electrical activity in neurons that neuroscientists measure in the form of "cortical oscillations." To understand speech, as with other cognitive or sensory processes, the brain breaks down the information it receives in order to integrate it and give it a coherent meaning. Until now, however, researchers could not confirm whether these oscillations were merely a by-product of neuronal activity or whether they played an active role in speech processing. Researchers have now created a computerized model of neuronal microcircuits that highlights the crucial role of neuronal oscillations in decoding spoken language, independently of the speaker's pace or accent.

Speech, whether produced or heard, generates electrical activity in neurons that neuroscientists measure in the form of "cortical oscillations." To understand speech, as with other cognitive or sensory processes, the brain breaks down the information it receives in order to integrate it and give it a coherent meaning. Until now, however, researchers could not confirm whether these oscillations were merely a by-product of neuronal activity or whether they played an active role in speech processing. Two recent publications -- in eLife and in Frontiers in Human Neuroscience -- shed light on the importance of these oscillations, which, when they are not produced as they should be, can be associated with significant language disorders. Professor Anne-Lise Giraud and her team at the Faculty of Medicine of the University of Geneva (UNIGE) reached these conclusions after creating a computerized model of neuronal microcircuits, which highlights the crucial role of neuronal oscillations in decoding spoken language, independently of the speaker's pace or accent.

Although neuroscientists have long suspected that cortical oscillations contribute to the interpretation of sensory stimuli in the brain, their exact role had never been demonstrated. Brain activity is indeed rhythmic and appears as periodic electrical variations, classified according to their frequency. The best known -- alpha, beta, gamma, delta, theta and mu -- are detected in association with different types of cognitive activity or state. Alpha waves, for example, are associated with an awake and relaxed state, beta waves with intense concentration, and so on. Speech mobilises gamma and theta waves synergistically.

Oscillations for syllables

To identify precisely the neurobiological processes at work when the human brain hears speech, Anne-Lise Giraud's team and colleagues at the Ecole Normale Supérieure (Paris) built a computerized model of neuronal microcircuits that replicates these cerebral waves. Their objective was to discover whether the coupled theta and gamma oscillations observed in the auditory cortex are key to understanding and producing speech, or whether they are only its consequence, that is, the mere expression of the electrical activity of the neurons mobilised at that moment. The researchers therefore modelled the two types of oscillations involved in speech processing, theta and gamma, and observed how this dual network behaved. Using a large corpus of sentences spoken by English speakers with a wide variety of paces and accents, they observed that the coupled oscillations segmented speech in an intelligent way: they adapted to the speaker's pace and correctly detected not only syllable boundaries but also syllable identity. The theta oscillation flexibly followed the syllabic rate and synchronised the activity of the gamma waves, which encode phonemes.

A phoneme is the smallest unit of spoken language; it helps to build a word and to distinguish it from other words. Synchronising these two oscillations is therefore crucial to understanding speech correctly. These results, published in eLife, confirm the significance of cortical oscillations in deciphering spoken language.
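To make this division of labour concrete, here is a minimal Python sketch -- a toy illustration under invented parameters, not the spiking-network model of the eLife study. A slow envelope-tracking step marks syllable-sized chunks (the role the article assigns to theta oscillations), and each chunk is then sampled in short ~25 ms frames (standing in for the faster gamma-rate sampling of phonemic detail). The sample rate, syllable rate and frame length are assumptions chosen only to keep the example self-contained.

```python
# Conceptual sketch only -- NOT the spiking-network model from the eLife study.
# A slow "theta-like" step tracks the syllabic amplitude envelope and marks
# chunk boundaries; a fast "gamma-like" step samples short frames inside each chunk.
import numpy as np

fs = 1000                       # sample rate in Hz (assumed)
t = np.arange(0, 3.0, 1 / fs)   # 3 seconds of toy "speech"

# Toy speech-like signal: a ~4 Hz syllabic envelope modulating a fast carrier.
syllable_rate = 4.0
envelope = 0.5 * (1 + np.sin(2 * np.pi * syllable_rate * t - np.pi / 2))
carrier = np.sin(2 * np.pi * 80 * t)     # stands in for fine acoustic detail
signal = envelope * carrier

# "Theta tracking": rectify and smooth to approximate the envelope, then mark
# one boundary each time the envelope rises out of a dip (one chunk per syllable).
rectified = np.abs(signal)
smooth_env = np.convolve(rectified, np.ones(50) / 50, mode="same")
is_low = smooth_env < 0.2 * smooth_env.max()
boundaries = np.flatnonzero(np.diff(is_low.astype(int)) == -1)

# "Gamma sampling": within each theta-defined chunk, count short ~25 ms frames,
# standing in for the faster oscillation that encodes phonemic detail.
frame = int(0.025 * fs)
for i, chunk in enumerate(np.split(signal, boundaries)):
    n_frames = max(1, len(chunk) // frame)
    print(f"chunk {i}: {len(chunk) / fs * 1000:.0f} ms, {n_frames} gamma-rate frames")
```

Running the script prints one syllable-sized chunk per envelope dip, each covered by a handful of gamma-rate frames -- the same nesting of slow and fast timescales that the modelled microcircuits exploit.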

The pathologies of a desynchronised pace

But what happens when the system malfunctions, particularly in the case of dyslexia and autism? Scientists observed that people with dyslexia show an anomaly in the activity of the gamma waves, which carry out phonemic segmentation. Since syllabic segmentation is unaffected, people with dyslexia may have no trouble understanding speech. However, because the format of their mental representations does not match the universal phonemic format, learning written language, which consists of matching phonemes with letters, becomes difficult.

In people with autism, on the other hand, researchers found that speech information is not segmented at the right places, which blocks speech decoding. After comparing functional MRI and electroencephalographic recordings from thirteen people with autism and thirteen people with no specific disorder, the researchers noticed that theta and gamma wave activity did not engage synergistically in the autism group: theta activity fails to track speech modulations, and the regulation of gamma oscillations, essential for decoding the detailed spoken content of words, does not occur. The language disorders from which most autistic people suffer could therefore be explained by an imbalance between slow and fast auditory oscillations, an anomaly that would prevent sensory information from being interpreted and would compromise the ability to form coherent conceptual representations.

In addition, the study shows that the greater the desynchronisation, the more severe the verbal disorder, and likewise the autistic symptoms as a whole. "Of course, autistic disorders cannot be reduced to an inability to decipher language," underlines Professor Anne-Lise Giraud. "But this strong correlation between oscillatory anomalies in the auditory cortex and the severity of autism points to a malfunction of cortical microcircuits that is certainly present elsewhere in the brain. The phenomenon is most probably symptomatic of a more general problem of segmenting and coding sensory information." These results are the subject of a recent publication in Frontiers in Human Neuroscience.
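How tightly the slow and fast oscillations engage together can be quantified from an electrophysiological trace. The Python sketch below shows one widely used measure, a phase-amplitude "modulation index" in the spirit of Tort and colleagues (2010): it extracts theta phase and gamma amplitude with band-pass filters and a Hilbert transform, then measures how far the gamma amplitude, sorted by theta phase, departs from a uniform distribution. It is offered as an illustration of the general idea, not as the analysis pipeline of the Frontiers study; the frequency bands and synthetic test signals are assumptions.

```python
# Illustrative sketch: a phase-amplitude modulation index (after Tort et al., 2010),
# not necessarily the analysis used in the Frontiers in Human Neuroscience study.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter between lo and hi Hz."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def modulation_index(x, fs, theta=(4, 8), gamma=(30, 60), n_bins=18):
    """How strongly gamma amplitude is modulated by theta phase (0 = no coupling)."""
    phase = np.angle(hilbert(bandpass(x, *theta, fs)))   # instantaneous theta phase
    amp = np.abs(hilbert(bandpass(x, *gamma, fs)))       # instantaneous gamma amplitude
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    which = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
    mean_amp = np.array([amp[which == k].mean() for k in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    # Normalised Kullback-Leibler distance from a uniform amplitude distribution.
    return (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)

# Synthetic check: gamma bursts locked to theta phase score higher than a signal
# in which the two rhythms coexist but are not coupled.
fs = 500
t = np.arange(0, 20, 1 / fs)
theta_wave = np.sin(2 * np.pi * 6 * t)
coupled = theta_wave + (0.5 + 0.5 * theta_wave) * np.sin(2 * np.pi * 40 * t)
uncoupled = theta_wave + np.sin(2 * np.pi * 40 * t)
print(modulation_index(coupled, fs), modulation_index(uncoupled, fs))
```

On the synthetic signals, the coupled case yields a clearly higher index than the uncoupled control, mirroring the kind of theta-gamma coordination the study reports as weakened in autism.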

The neuroscientists are now preparing their next experiment: attempting to alter the rhythm of the abnormal oscillations and, if they succeed, observing the consequences of this intervention on speech and other cognitive functions, in both the short and the long term.


Story Source:

Materials provided by University of Geneva. Note: Content may be edited for style and length.


Journal References:

  1. Delphine Jochaut, Katia Lehongre, Ana Saitovitch, Anne-Dominique Devauchelle, Itsaso Olasagasti, Nadia Chabane, Monica Zilbovicius, Anne-Lise Giraud. Atypical coordination of cortical oscillations in response to speech in autism. Frontiers in Human Neuroscience, 2015; 9 DOI: 10.3389/fnhum.2015.00171
  2. Alexandre Hyafil, Lorenzo Fontolan, Claire Kabdebon, Boris Gutkin, Anne-Lise Giraud. Speech encoding by coupled cortical theta and gamma oscillations. eLife, 2015; 4 DOI: 10.7554/eLife.06213

