TSAD: Architecture Design for Time Series Forecasting

Chenyu Jiang

Published 2025 in 2025 International Conference on Big Data Applications, Mechatronics Engineering and Automation (BDAMEA)

ABSTRACT

While Transformers have revolutionized time series forecasting, they still largely depend on manually designed components: the same attention, normalization, and activation functions are applied uniformly across diverse contexts. What if, instead, the most suitable architecture could be determined automatically for each dataset? This paper introduces TSAD (Time Series Architecture Design), an end-to-end neural architecture search (NAS) framework tailored to time series Transformers. TSAD jointly optimizes multiple architectural dimensions, including attention mechanisms, normalization layers, activation functions, and embedding operations, through a differentiable architecture search strategy, enabling the discovery of customized model configurations that consistently outperform handcrafted baselines. Our empirical analysis also reveals a counterintuitive phenomenon: NAS-discovered architectures of high complexity often suffer significant performance deterioration, whereas simpler and more structurally transparent designs tend to deliver stronger and more stable generalization. This work not only extends the frontier of automated model design for sequential data but also provides insights that may reshape how neural architecture search is approached in time series modeling.
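The abstract does not spell out how the differentiable search is implemented. A common realization of this idea (in the style of DARTS) relaxes each categorical architectural choice, such as which activation function to use, into a softmax-weighted mixture of all candidates, so the choice can be optimized by gradient descent and then discretized to the highest-weighted option. The following minimal pure-Python sketch illustrates that mechanism under this assumption; the class name `MixedActivation` and the candidate set are illustrative, not taken from the paper:

```python
import math

# Hypothetical candidate set for one searched dimension
# (activation functions) -- analogous candidates would exist
# for attention, normalization, and embedding choices.
CANDIDATES = {
    "relu": lambda x: max(0.0, x),
    "gelu": lambda x: 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0))),
    "tanh": math.tanh,
}

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

class MixedActivation:
    """Continuous relaxation of a categorical architecture choice.

    Each candidate op gets an architecture logit alpha; the output is
    the softmax(alpha)-weighted sum of all candidates.  During search
    the alphas are trained by gradient descent; afterwards the argmax
    candidate is kept (discretization).
    """

    def __init__(self):
        self.alphas = [0.0] * len(CANDIDATES)  # architecture parameters

    def forward(self, x):
        weights = softmax(self.alphas)
        return sum(w * op(x) for w, op in zip(weights, CANDIDATES.values()))

    def discretize(self):
        names = list(CANDIDATES)
        best = max(range(len(self.alphas)), key=lambda i: self.alphas[i])
        return names[best]

mixed = MixedActivation()
y = mixed.forward(1.0)      # uniform alphas: average of all candidates
mixed.alphas = [0.1, 3.0, -1.0]  # suppose search favored the second op
chosen = mixed.discretize()      # -> "gelu"
```

The key property is that `forward` is differentiable with respect to the alphas, so architecture and model weights can be optimized jointly end to end, which is what makes this family of methods "differentiable" architecture search.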

PUBLICATION RECORD

  • Publication year: 2025
  • Venue: 2025 International Conference on Big Data Applications, Mechatronics Engineering and Automation (BDAMEA)
  • Publication date: 2025-12-12
  • Fields of study: not labeled


  • Source metadata: Semantic Scholar


