Explainable AI for analyzing the decision of GNNs at predicting dynamic stability of complex oscillator networks.
H. Raum, T. Schnake, F. Hellmann, J. Kurths, Christian Nauck
Published 2025 in Chaos
ABSTRACT
Understanding the synchronization of complex oscillator networks is a central question in complex systems research. Recent studies have shown that graph neural networks (GNNs) outperform a wide range of traditional network measures in predicting probabilistic stability in synthetic power grids based on the Kuramoto model. This suggests that analyzing GNN decisions could enhance our understanding of synchronization patterns. We use explainable artificial intelligence (XAI), specifically Layer-wise Relevance Propagation (LRP) and its adaptation for GNNs (GNN-LRP), to analyze these decision processes. Our results indicate that larger neighborhoods beyond direct nodes strongly influence the dynamic behavior, with slightly different patterns for nodes with low stability compared to stable nodes. Aggregating LRP scores provides a nodal measure of stability contribution, which correlates with some network measures and suggests pathways to more stable power grids. However, GNN decision processes appear to be more complex and not solely influenced by established node metrics. Our study highlights the potential of GNNs and XAI in understanding synchronization patterns of oscillator networks and emphasizes the need for new XAI methods tailored to this domain.
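The abstract describes aggregating GNN-LRP relevance scores into a nodal measure of stability contribution. A minimal sketch of such an aggregation is given below, assuming per-walk relevances as produced by GNN-LRP (which attributes relevance to walks through the message-passing layers); the walk data, the function name, and the sum-over-visited-nodes rule are illustrative assumptions, not the authors' exact procedure.

```python
# Hypothetical sketch: turn per-walk GNN-LRP relevances into nodal scores.
from collections import defaultdict

def aggregate_nodal_relevance(walk_relevances):
    """Sum each walk's relevance onto every node the walk visits.

    walk_relevances: dict mapping a walk (tuple of node ids) to its
    GNN-LRP relevance score. Returns a dict: node id -> nodal score.
    """
    nodal = defaultdict(float)
    for walk, relevance in walk_relevances.items():
        for node in set(walk):  # each node counted once per walk
            nodal[node] += relevance
    return dict(nodal)

# Example with made-up relevances on a small graph: negative relevance
# marks a walk that the GNN associates with reduced stability.
walks = {
    (0, 1, 2): 0.5,
    (0, 1): 0.3,
    (2, 3, 2): -0.1,
}
scores = aggregate_nodal_relevance(walks)
# node 0 collects 0.5 + 0.3 = 0.8; node 2 collects 0.5 - 0.1 = 0.4
```

Under this toy rule, nodes traversed by many high-relevance walks (i.e., influential larger neighborhoods, as the abstract emphasizes) accumulate the largest nodal scores.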
PUBLICATION RECORD
- Publication year: 2025
- Venue: Chaos
- Publication date: 2025-11-01
- Fields of study: Medicine, Computer Science, Engineering
- Source metadata: Semantic Scholar, PubMed