Peer review process
Revised: This Reviewed Preprint has been revised by the authors in response to the previous round of peer review; the eLife assessment and the public reviews have been updated where necessary by the editors and peer reviewers.
Editors
- Reviewing Editor: Jonas Obleser, University of Lübeck, Lübeck, Germany
- Senior Editor: Barbara Shinn-Cunningham, Carnegie Mellon University, Pittsburgh, United States of America
Reviewer #1 (Public review):
Summary:
This paper reports an intracranial SEEG study of speech coordination in which participants synchronized their speech output with a virtual partner designed to vary its synchronization behavior. This allows the authors to identify electrodes throughout the left hemisphere whose activity (both power and phase) correlates with the degree of synchronization behavior. They find that high-frequency activity in secondary auditory cortex (superior temporal gyrus) correlates with synchronization, in contrast to primary auditory regions. Furthermore, activity in the inferior frontal gyrus shows a significant phase-amplitude coupling relationship that is interpreted as compensation for deviation from synchronized behavior with the virtual partner.
Strengths:
(1) The development of a virtual partner model trained for each individual participant, which can dynamically vary its synchronization to the participant's behavior in real time, is novel and exciting.
(2) Understanding real-time temporal coordination for behaviors like speech is a critical and understudied area.
(3) The use of SEEG provides the spatial and temporal resolution necessary to address the complex dynamics associated with the behavior.
(4) The paper provides some results that suggest a role for regions like IFG and STG in the dynamic temporal coordination of behavior both within an individual speaker and across speakers performing a coordination task.
Reviewer #2 (Public review):
Summary:
This paper investigates the neural underpinnings of an interactive speech task requiring verbal coordination with another speaker. To achieve this, the authors recorded intracranial brain activity from the left (and to a lesser extent, the right) hemisphere in a group of drug-resistant epilepsy patients while they synchronised their speech with a 'virtual partner'. Crucially, the authors were able to manipulate the degree of success of this synchronisation by programming the virtual partner to either actively synchronise or desynchronise their speech with the participant, or else to not vary its speech in response to the participant (making the synchronisation task purely one-way). Using such a paradigm, the authors identified different brain regions that were either more sensitive to the speech of the virtual partner (primary auditory cortex), or more sensitive to the degree of verbal coordination (i.e. synchronisation success) with the virtual partner (left secondary auditory cortex and bilateral IFG). Such sensitivity was measured by (1) calculating the correlation between the index of verbal coordination and mean power within a range of frequency bands across trials, and (2) calculating the phase-amplitude coupling between the behavioural and brain signals within single trials (using the power of high-frequency neural activity only). Overall, the findings help to elucidate some of the brain areas involved in interactive speaking behaviours, particularly highlighting high-frequency activity of the bilateral IFG as a potential candidate supporting verbal coordination.
Strengths:
This study provides the field with a convincing demonstration of how to investigate speaking behaviours in more complex situations that share many features with real-world speaking contexts, e.g., simultaneous engagement of speech perception and production processes, the presence of an interlocutor, and the need for inter-speaker coordination. The findings thus go beyond previous work that has typically studied solo speech production in isolation, and represent a significant advance in our understanding of speech as a social and communicative behaviour. It is further an impressive feat to develop a paradigm in which the degree of cooperativity of the synchronisation partner can be so tightly controlled; in this way, the study combines the benefits of using pre-recorded stimuli (namely, the high degree of experimental control) with the benefits of using a live synchronisation partner (allowing the task to be truly two-way interactive, addressing an important criticism of other work using pre-recorded stimuli). A further key strength of the study lies in its use of stereotactic EEG to measure brain responses with both high temporal and spatial resolution, an ideal method for studying the unfolding relationship between neural processing and this dynamic coordination behaviour.
Weaknesses:
One limitation of the current study is the relatively sparse coverage of the right hemisphere by the implanted electrodes (91 electrodes in the right compared to 145 in the left). Of course, electrode location is solely clinically motivated, and so the authors did not have control over this. In a previous version of this article, the authors therefore chose not to include data from the right hemisphere in the reported analyses. However, after highlighting previous literature suggesting that the right hemisphere is likely highly relevant to verbal coordination behaviours such as those under investigation here, the authors have now added analyses of the right hemisphere data to the results. These confirm an involvement of the right hemisphere in this task, largely replicating the left hemisphere results. Some hemispheric differences were found in responses within the STG; however, interpretation of these should be tempered by the relatively sparse coverage of the right hemisphere, which leaves some regions with very few electrodes and hence reduced statistical power.