Mapping the Computational Similarity of Individual Neurons within Large-scale Ensemble Recordings using the SIMNETS analysis framework
Jacqueline B. Hynes, D. M. Brandman, Jonas B. Zimmermann, J. Donoghue, C. Vargas-Irwin
Published 2018 in bioRxiv
ABSTRACT
The expansion of large-scale neural recording capabilities has provided new opportunities to examine multi-scale cortical network activity at single-neuron resolution. At the same time, the growing scale and complexity of these datasets introduce new conceptual and technical challenges beyond what can be addressed using traditional analysis techniques. Here, we present SIMNETS, a mathematically rigorous and efficient unsupervised relational analysis framework designed to generate intuitive, low-dimensional neuron maps that support a multi-scale view of the computational similarity (CS) relations among individual neurons. The critical innovation is the use of a novel measure of computational similarity that is based on comparing the intrinsic structure of latent spaces representing the spiking output of individual neurons. We use three publicly available neural population test datasets from the visual, motor, and hippocampal CA1 brain regions to validate the SIMNETS framework and demonstrate how it can be used to identify putative subnetworks (i.e., clusters of neurons with similar computational properties). Our analysis pipeline includes a novel statistical test designed to evaluate the likelihood of detecting spurious neuron clusters, which is used to validate network structure results. The SIMNETS framework can facilitate linking computational geometry representations across scales, from single neurons to subnetworks, within large-scale neural recording data.
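The pipeline the abstract describes can be illustrated with a minimal sketch: characterize each neuron by the pairwise similarity of its responses across trials, then score two neurons as computationally similar when their trial-similarity matrices agree. This is a hedged toy illustration only, not the paper's implementation; the distance function (binned Euclidean here), the comparison (Pearson correlation), and all variable names are placeholder assumptions, and the actual SIMNETS method, embedding step, and cluster-validation statistics are detailed in the paper itself.

```python
import numpy as np

def trial_similarity_matrix(binned_spikes):
    # binned_spikes: (n_trials, n_bins) spike counts for one neuron.
    # Pairwise negative Euclidean distance between trials serves as a
    # stand-in for a proper spike-train metric (placeholder assumption).
    diffs = binned_spikes[:, None, :] - binned_spikes[None, :, :]
    return -np.linalg.norm(diffs, axis=-1)

def computational_similarity(sim_a, sim_b):
    # Compare two neurons' trial-similarity matrices by correlating
    # their upper triangles (diagonal excluded).
    iu = np.triu_indices_from(sim_a, k=1)
    return float(np.corrcoef(sim_a[iu], sim_b[iu])[0, 1])

rng = np.random.default_rng(0)
n_trials, n_bins = 20, 50

# Synthetic data: neurons 1 and 2 share a latent drive; neuron 3 is independent.
latent = rng.poisson(3.0, size=(n_trials, n_bins)).astype(float)
neuron1 = latent + rng.normal(0.0, 0.5, size=latent.shape)
neuron2 = latent + rng.normal(0.0, 0.5, size=latent.shape)
neuron3 = rng.poisson(3.0, size=(n_trials, n_bins)).astype(float)

sims = [trial_similarity_matrix(x) for x in (neuron1, neuron2, neuron3)]
cs_12 = computational_similarity(sims[0], sims[1])
cs_13 = computational_similarity(sims[0], sims[2])
# Neurons sharing a latent signal should score higher than unrelated ones.
print(round(cs_12, 2), round(cs_13, 2))
```

In the full framework, the resulting neuron-by-neuron similarity matrix would then be embedded into a low-dimensional map and clustered to reveal putative subnetworks, with the paper's statistical test guarding against spurious clusters.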
PUBLICATION RECORD
- Publication year
2018
- Venue
bioRxiv
- Publication date
2018-11-08
- Fields of study
Biology, Computer Science
- Source metadata
Semantic Scholar