Neuro-Inspired Ensemble-to-Ensemble Communication Primitives for Sparse & Efficient ANNs
Orestis Konstantaropoulos, S. Smirnakis, Maria Papadopouli
Published 2025 in International Conferences on Biological Information and Biomedical Engineering
ABSTRACT
The structure of biological neural circuits—modular, hierarchical, and sparse—reflects an efficient trade-off between wiring cost, functional specialization, energy efficiency, and robustness. These principles offer valuable insights for artificial neural network (ANN) design, especially as networks grow in depth and scale. Sparsity, in particular, has been widely explored for reducing memory and computation, improving speed, and enhancing generalization. Motivated by systems neuroscience findings, we explore how patterns of functional connectivity in the mouse visual cortex—specifically, ensemble-to-ensemble communication—can inform ANN design. We introduce G2GNet, a novel architecture that imposes sparse, modular connectivity across feedforward layers. To our knowledge, this is the first architecture to incorporate biologically observed functional-connectivity patterns as a structural bias in ANN design. We complement this static bias with a dynamic sparse training (DST) mechanism that prunes and regrows edges during training, and we propose a Hebbian-inspired rewiring rule based on activation correlations, drawing on principles of biological plasticity. G2GNet attains up to 75% sparsity while improving accuracy by up to 4.3% on benchmarks including Fashion-MNIST, CIFAR-10, and CIFAR-100, outperforming dense baselines with far fewer computations.
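The dynamic sparse training mechanism the abstract describes—prune weak edges, then regrow edges guided by a Hebbian-style activation-correlation signal—can be illustrated with a minimal sketch. This is a hedged reconstruction of the general idea, not the paper's exact rule: the function name `dst_step`, the magnitude-based prune criterion, the correlation-based regrowth score, and the small re-initialization value are all assumptions for illustration.

```python
import numpy as np

def dst_step(W, mask, pre_acts, post_acts, frac=0.1):
    """One prune-and-regrow update (illustrative sketch, not the paper's rule).

    W: (n_in, n_out) weight matrix; mask: same-shape 0/1 connectivity mask;
    pre_acts: (batch, n_in) and post_acts: (batch, n_out) layer activations.
    """
    active = np.flatnonzero(mask)
    k = max(1, int(frac * active.size))

    # Prune: remove the k active edges with the smallest weight magnitude.
    mags = np.abs(W).ravel()[active]
    drop = active[np.argsort(mags)[:k]]
    mask.ravel()[drop] = 0
    W.ravel()[drop] = 0.0

    # Regrow (Hebbian-inspired): among currently inactive edges, revive those
    # whose pre/post activations co-vary most strongly across the batch.
    corr = pre_acts.T @ post_acts / len(pre_acts)       # (n_in, n_out)
    inactive = np.flatnonzero(mask == 0)
    grow = inactive[np.argsort(-np.abs(corr).ravel()[inactive])[:k]]
    mask.ravel()[grow] = 1
    W.ravel()[grow] = 1e-3 * np.sign(corr.ravel()[grow])  # small re-init
    return W, mask
```

The prune/regrow budget is balanced (k edges out, k edges in), so overall sparsity stays fixed while connectivity is redistributed toward correlated unit pairs.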
PUBLICATION RECORD
- Publication year: 2025
- Publication date: 2025-08-19
- Venue: International Conferences on Biological Information and Biomedical Engineering
- Fields of study: Computer Science, Engineering
- Source metadata: Semantic Scholar