Theta and Gamma Bands Encode Acoustic Dynamics over Wide-Ranging Timescales

  • Xiangbin Teng
    Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, 60322 Frankfurt, Germany
  • David Poeppel
    Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, 60322 Frankfurt, Germany

Abstract

Natural sounds contain acoustic dynamics ranging from tens to hundreds of milliseconds. How does the human auditory system encode acoustic information over wide-ranging timescales to achieve sound recognition? Previous work (Teng et al. 2017) demonstrated a temporal coding preference for the theta and gamma ranges, but it remains unclear how acoustic dynamics between these two ranges are coded. Here, we generated artificial sounds with temporal structures over timescales from ~200 to ~30 ms and investigated temporal coding on different timescales. Participants discriminated sounds with temporal structures at different timescales while undergoing magnetoencephalography recording. Although acoustic dynamics at all timescales induce considerable intertrial phase coherence, classification analyses reveal that acoustic information at all timescales is preferentially differentiated through the theta and gamma bands, but not through the alpha and beta bands; stimulus reconstruction shows that the acoustic dynamics in the theta and gamma ranges are preferentially coded. We demonstrate that temporal coding in the theta and gamma bands generalizes across timescales with comparable capacity. Our findings provide a novel perspective: acoustic information at all timescales is discretised into two temporal chunks, corresponding to the theta and gamma ranges, for further perceptual analysis.
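
The abstract names standard spectral measures (intertrial phase coherence, band-limited classification, stimulus reconstruction). As a minimal illustrative sketch only, and not the authors' analysis pipeline, the Python snippet below computes intertrial phase coherence per canonical frequency band from simulated single-sensor trials; the band edges, filter order, and simulated data are assumptions for illustration.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def itpc(trials, fs, band):
        """Intertrial phase coherence (ITPC) for one frequency band.

        trials : array, shape (n_trials, n_times), single-sensor responses
        fs     : sampling rate in Hz
        band   : (low, high) band edges in Hz, e.g. (4, 7) for theta
        """
        # Zero-phase band-pass filter each trial, then take the analytic phase.
        b, a = butter(4, band, btype="bandpass", fs=fs)
        filtered = filtfilt(b, a, trials, axis=-1)
        phase = np.angle(hilbert(filtered, axis=-1))
        # ITPC at each time point: length of the mean phase vector across trials.
        return np.abs(np.mean(np.exp(1j * phase), axis=0))

    # Example: compare phase locking across bands (band edges are illustrative).
    fs = 1000
    rng = np.random.default_rng(0)
    trials = rng.standard_normal((60, 2 * fs))  # 60 simulated trials, 2 s each
    bands = {"theta": (4, 7), "alpha": (8, 12), "beta": (13, 30), "gamma": (30, 45)}
    for name, edges in bands.items():
        print(name, itpc(trials, fs, edges).mean().round(3))

In the study's logic, such band-wise measures are compared across theta, alpha, beta, and gamma to ask which bands carry stimulus-discriminative phase information; the snippet only shows the computation, not that comparison.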

Journal

  • Cerebral Cortex 30 (4), 2600-2614, 2019-11-25

    Oxford University Press (OUP)
