Maastricht Brain Imaging Center, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, 6229 EV, Maastricht, the Netherlands
Faculty of Health Medicine and Life Sciences, Maastricht University, 6229 EV, Maastricht, the Netherlands
Corresponding author. Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, P.O. Box 616, 6200 MD, Maastricht, the Netherlands.
Amplitude-rise time (ART), the duration from the onset of an acoustic signal to its maximum amplitude, is a major constituent of the amplitude envelope of auditory speech. ARTs spanning approximately one theta cycle (4–8 Hz, corresponding to a period of approximately 200 ms) are thought to play a key role in speech encoding and comprehension [
]. The amplitude envelope of speech conveys information about phrasal structures, word boundaries, speech prosody, and the identity of syllables and phonemes [
]. It has been suggested that low-frequency (theta) cortical oscillations may subserve the segmentation and identification of syllabic information by synchronizing their phases to amplitude-envelope peaks that mark the timing of syllables [
]. However, it is still unclear whether theta cortical oscillations contribute functionally to ART perception.
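To make the ART definition above concrete, the following minimal sketch estimates ART as the time from sound onset to the peak of the amplitude envelope. The Hilbert-transform envelope, the tone frequency and duration, and the function name are illustrative assumptions rather than the study's actual stimuli or analysis code; only the 62.5-ms reference ART is taken from the text.

```python
import numpy as np
from scipy.signal import hilbert

def amplitude_rise_time(signal: np.ndarray, fs: float) -> float:
    """Estimate ART: time (s) from sound onset to the (first) maximum of the
    amplitude envelope. Hilbert-transform envelope extraction is one common
    choice, not necessarily the one used in the study."""
    envelope = np.abs(hilbert(signal))
    peak = envelope.max()
    # First sample at which the envelope reaches (near-)maximum amplitude;
    # the 0.99 tolerance guards against small ripple in the analytic signal.
    rise_idx = int(np.argmax(envelope >= 0.99 * peak))
    return rise_idx / fs

# Example: a 1-kHz tone (hypothetical frequency) with a 62.5-ms linear
# on-ramp, matching the study's reference ART.
fs = 44100
t = np.arange(int(0.3 * fs)) / fs                # 300-ms tone
ramp_on = np.minimum(t / 0.0625, 1.0)            # linear rise over 62.5 ms
ramp_off = np.minimum((t[-1] - t) / 0.010, 1.0)  # 10-ms offset taper
tone = np.sin(2 * np.pi * 1000.0 * t) * ramp_on * ramp_off
print(f"estimated ART ≈ {amplitude_rise_time(tone, fs) * 1000:.1f} ms")
```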
Transcranial alternating current stimulation (tACS) is a non-invasive technique that makes it possible to modulate the excitability of neuronal ensembles by temporally aligning brain oscillations to the alternating current [
]. Theta-range tACS has been reported to modulate speech perception, but whether such effects are mediated by linguistic processes or by lower-level, ART-related auditory processing is still unclear. To test the latter possibility, we investigated whether tACS-modulated slow cortical oscillations can influence the perception of ART in non-speech sounds.
We applied 4-Hz tACS over the auditory cortices and assessed ART perception with a two-interval forced-choice task that required the 23 participants to identify which of two randomly ordered tones in noise had the longer ART (Fig. 1a). The ART of the target tone was variable and always longer than that of the reference tone (62.5 ms). The relative timing of the tACS and the tone on-ramps was varied across six phase lags spanning one tACS cycle (Fig. 1b; see Supplemental Material). Hypothesizing that slow cortical oscillations contribute functionally to ART processing, we predicted that these phase-lag changes would induce cyclical changes in ART-discrimination performance. We additionally included a sham-stimulation condition to test whether tACS, irrespective of its phase, influences ART perception.
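As a concrete illustration of the phase-lag manipulation, the sketch below maps six phase lags onto on-ramp delays within one 4-Hz tACS cycle. The 60° spacing follows from six lags spanning one cycle; the variable and function names are our own.

```python
import numpy as np

TACS_FREQ = 4.0                      # Hz; one cycle = 250 ms
CYCLE_MS = 1000.0 / TACS_FREQ        # 250 ms
PHASE_LAGS_DEG = np.arange(6) * 60   # six lags spanning one cycle: 0..300 deg

def onset_delay_ms(phase_lag_deg: float) -> float:
    """Delay of the tone on-ramp relative to a tACS-cycle reference, in ms."""
    return (phase_lag_deg / 360.0) * CYCLE_MS

for lag in PHASE_LAGS_DEG:
    print(f"{lag:3d} deg -> on-ramp {onset_delay_ms(lag):6.2f} ms into the tACS cycle")
```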
Fig. 1(a) The time course of a single trial. Each trial involved four intervals: tone presentation, response period, feedback period, and inter-trial interval. The width of the columns is proportional to the duration of each interval. The values in the third row represent durations in ms. The two digits “1” and “2” were shown continuously on the screen. During presentation of the first tone, the size of digit “1” was slightly increased, and analogously for the second tone/digit. The first tone was presented 125 ms after trial onset. The interval between the two tones was fixed at 325 ms. Approximately 250 ms (jittered across trials) after the second tone was presented, both digits turned white, prompting participants to respond. Participants received feedback after each response, coded by a change of color of the digits (green = correct, red = incorrect, pink = miss). The digits turned grey during the inter-trial interval, which varied in length. The background noise and the electric stimulation were presented continuously. (b) The relative timing between the tACS (i.e., sinusoidal curves) and the tones. The different sinusoidal curves represent the six phase-lag conditions. In panel (c), ART-discrimination performance is shown as a function of distance from the best lag during the tACS (black) and sham (grey) stimulation, and in panel (d) it is averaged across the best-lag distances presumed to correspond to a positive (i.e., −60° and 60°; dark bar) and a negative half-cycle (i.e., 120° and 240°; light bar). The grey dashed line corresponds to the average performance in the sham condition. Error bars represent the standard errors of the mean. n.s. = non-significant. (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)
ART-discrimination performance scores in the six phase-lag conditions were calculated and concatenated to construct a behavioral time series for each stimulation condition. To compensate for potentially confounding inter-individual brain-anatomy differences, the maximum of the time series (the ‘best’ lag, Fig. S2) was aligned across participants and excluded from subsequent analyses [
A two-way repeated-measures ANOVA including the Type of stimulation (tACS or sham) and the five Phase lags revealed no significant interaction (F(4,88) = 0.84, p = .503) or main effects (Phase lag: F(4,88) = 1.6, p = .181; Type of stimulation: F(1,22) = 3.3, p = .083) (Fig. 1c), suggesting that neither tACS nor its phase affected ART perception. To further test for a phase effect, we compared the average performance in the phase-lag conditions around the best lag (−60° and 60°) vs. the opposite phase-lag conditions (120° and 240°), which revealed no significant difference either (t(22) = 0.20, p = .422; Fig. 1d). Similarly, regressing single-trial responses onto phase lags [
] revealed no reliable deviation of participants’ regression coefficients from zero (average beta value = 0.16, Fisher’s p = .449). Similar results were obtained when applying the same analyses to data stratified according to different ARTs (see Fig. S4).
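For illustration, the following sketch reconstructs one plausible form of the single-trial regression, assuming a per-participant table of phase lags (degrees) and response correctness (0/1). The sine/cosine phase encoding and the scikit-learn logistic model are our assumptions, not necessarily the authors’ exact pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def phase_modulation_beta(phase_deg: np.ndarray, correct: np.ndarray) -> float:
    """Regress single-trial accuracy onto tACS phase.

    Phase is encoded with sine/cosine regressors so the fit is insensitive to
    the (arbitrary) zero phase; the amplitude of the two coefficients indexes
    how strongly performance is modulated by phase."""
    phi = np.deg2rad(phase_deg)
    X = np.column_stack([np.sin(phi), np.cos(phi)])
    model = LogisticRegression().fit(X, correct)
    b_sin, b_cos = model.coef_[0]
    return float(np.hypot(b_sin, b_cos))   # phase-modulation strength

# Hypothetical usage: compute one beta per participant, then compare the betas
# against zero (or a phase-shuffled null distribution) across participants.
rng = np.random.default_rng(0)
phase = rng.choice(np.arange(6) * 60, size=600)   # six phase-lag conditions
correct = rng.integers(0, 2, size=600)            # placeholder responses
print(phase_modulation_beta(phase, correct))
```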
These results provide no evidence that slow cortical oscillations play a functional role in ART perception. One interpretation is that these oscillations do affect ART perception as originally hypothesized, but that we failed to detect this because of methodological limitations. The sensitivity of our measure of ART perception was perhaps suboptimal, as the onset of the first tone in a given trial might have phase-reset brain oscillations and consequently distorted any tACS-induced brain phase at the onset of the second tone. In anticipation of this risk, and to circumvent the use of a more criterion-dependent single-interval yes/no task, we presented the tones at a low sound level (44 dB SPL) in continuous noise. However, whether these measures sufficed to prevent tone-induced phase resets remains unclear and would have required directly measuring brain oscillations. The strength of our tACS and experimental manipulation was likely sufficient to modulate brain oscillations, as similar protocols have proven effective in some, although not all, speech-perception studies with similar statistical power [
]. A perhaps more exciting interpretation is that slow cortical oscillations contribute less to the perception of ART itself and more to its linguistic interpretation. The short tone stimuli in our experiment provided a continuum of basic acoustic differences without any linguistic information. When linguistic stimuli are used, subtle changes in ART or cortical phase have been observed to affect the categorical perception of phonemes (e.g., /d/ vs. /t/, short /a/ vs. long /a:/) [
]. Although slow cortical oscillations were not found to systematically affect ART perception here, it may be interesting to verify this in the future in a population with chronic ART-processing deficits, e.g., dyslexic participants with phonological impairment, who may be more susceptible to cortical phase modulations [
In sum, the current study provides no evidence for a causal contribution of slow cortical oscillations to the perception of auditory ARTs. Together with positive findings from related speech-perception studies, our null finding suggests that slow cortical oscillations may contribute to linguistic categorization, rather than lower-level auditory processing of ART.
Declaration of competing interest
The authors declare no competing interests.
Author contributions
L.R., A.A., and M.B. designed research. L.R. contributed materials/analytic tools. A.A., M.Z., and M.W. performed research. M.Z. and S.H. analyzed the data. M.Z., S.H., and L.R. wrote the paper and made the figures. M.B., A.A., S.H., and M.W. commented on drafts.
Acknowledgment
This work was supported by Maastricht University and the China Scholarship Council (CSC 201706010369 to M.Z.; CSC 201906320078 to M.W.).
Appendix A. Supplementary data
Supplementary data to this article are available online.