It Takes Two: A Dual Stage Approach for Terminology-Aware Translation
Published 2025 in Proceedings of the Tenth Conference on Machine Translation
ABSTRACT
This paper introduces DuTerm, a novel two-stage architecture for terminology-constrained machine translation. Our system combines a terminology-aware NMT model, adapted via fine-tuning on large-scale synthetic data, with a prompt-based LLM for post-editing. The LLM stage refines NMT output and enforces terminology adherence. We evaluate DuTerm on English-to-German, English-to-Spanish, and English-to-Russian using the WMT 2025 Terminology Shared Task corpus. We demonstrate that flexible, context-driven terminology handling by the LLM consistently yields higher-quality translations than strict constraint enforcement. Our results highlight a critical trade-off, revealing that LLMs work best for high-quality translation as context-driven mutators rather than generators.
PUBLICATION RECORD
- Publication year: 2025
- Venue: Proceedings of the Tenth Conference on Machine Translation
- Publication date: 2025-11-07
- Fields of study: Linguistics, Computer Science
- Source metadata: Semantic Scholar