Delay activity dynamics: task-dependent time encoding and low-dimensional trajectories

Christopher J. Cueva, Encarni Marcos, A. Saez, A. Genovesio, Mehrdad Jazayeri, R. Romo, C. Salzman, M. Shadlen, Stefano Fusi

Published 2018 in bioRxiv

ABSTRACT

Our decisions often depend on multiple sensory experiences separated by time delays. The brain can remember these experiences (working memory) and, at the same time, easily estimate the timing between events, which plays a fundamental role in anticipating stimuli and planning future actions. To better understand the neural mechanisms underlying working memory and time encoding, we analyze neural activity recorded during delays in four different experiments on non-human primates and consider three classes of neural network models to explain the data: attractor neural networks, chaotic reservoir networks, and recurrent neural networks trained with backpropagation through time. To disambiguate these models we propose two analyses: 1) decoding the passage of time from neural data, and 2) computing the cumulative dimensionality of the neural trajectory as it evolves over time. Our analyses reveal that time can be decoded with high precision in tasks where timing information is relevant and with lower precision in tasks where it is irrelevant, suggesting that working memory need not rely on constant rates around a fixed activity pattern. In addition, our results further constrain the mechanisms underlying time encoding: we show that the dimensionality of the trajectories is low for all datasets. Consistent with this, we find that the linear “ramping” component of each neuron’s firing rate strongly contributes to the slow timescale variations that make decoding time possible. We show that these low-dimensional ramping trajectories are beneficial because they allow computations learned at one point in time to generalize across time. Our observations constrain the possible models that explain the data, ruling out simple attractor models and randomly connected recurrent networks (chaotic reservoir networks) that vary on relatively fast timescales, but they agree with recurrent neural network models trained with backpropagation through time.
Our results demonstrate a powerful new tool for studying the interplay of temporal processing and working memory by objective classification of electrophysiological activity.
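The two analyses named in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the population here is synthetic (each neuron ramps linearly along a shared random direction, plus trial noise), and all sizes, the noise level, the nearest-template decoder, and the 90% variance threshold are assumptions chosen only to make the two analyses concrete.

```python
# Illustrative sketch of the paper's two proposed analyses on synthetic data:
# 1) decoding elapsed time from population activity, and
# 2) the cumulative dimensionality of the neural trajectory over time.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_timebins, n_trials = 50, 20, 40

# Low-dimensional "ramping" trajectory: every neuron's mean rate drifts
# linearly along one shared random direction; trials add Gaussian noise.
ramp_dir = rng.normal(size=n_neurons)
t = np.linspace(0.0, 1.0, n_timebins)
mean_rates = np.outer(t, ramp_dir)                     # (time, neurons)
trials = mean_rates[None] + 0.2 * rng.normal(size=(n_trials, n_timebins, n_neurons))

# --- Analysis 1: decode the passage of time ---------------------------------
# Nearest-mean decoder: label each held-out population vector with the time
# bin whose trial-averaged training pattern it is closest to.
train, test = trials[: n_trials // 2], trials[n_trials // 2:]
templates = train.mean(axis=0)                         # (time, neurons)

def decode_time(pop_vec):
    """Index of the nearest time-bin template (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(templates - pop_vec, axis=1)))

pred = np.array([[decode_time(v) for v in trial] for trial in test])
accuracy = (pred == np.arange(n_timebins)).mean()

# --- Analysis 2: cumulative dimensionality of the trajectory ----------------
# Number of principal components needed to capture 90% of the variance of
# the trial-averaged trajectory up to each time point.
def cumulative_dim(traj, frac=0.9):
    dims = []
    for k in range(2, len(traj) + 1):
        x = traj[:k] - traj[:k].mean(axis=0)
        var = np.linalg.svd(x, compute_uv=False) ** 2  # PC variances
        cum = np.cumsum(var) / var.sum()
        dims.append(int(np.searchsorted(cum, frac) + 1))
    return dims

dims = cumulative_dim(trials.mean(axis=0))

print(f"time-decoding accuracy: {accuracy:.2f}")
print(f"cumulative dimensionality at final time: {dims[-1]}")
```

Because the synthetic trajectory is a linear ramp, time is decodable well above the 1/20 chance level while the cumulative dimensionality stays near one, mirroring the low-dimensional ramping regime the paper reports in the data.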

PUBLICATION RECORD

  • Publication year

    2018

  • Venue

    bioRxiv

  • Publication date

    2018-12-29

  • Fields of study

    Biology, Physics, Computer Science

  • Source metadata

    Semantic Scholar
