Multi-Domain Neural Machine Translation

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/76088
Item information
Title: Multi-Domain Neural Machine Translation
Author(s): Tars, Sander | Fishel, Mark
Keywords: Machine Translation
Knowledge area(s): Lenguajes y Sistemas Informáticos
Publication date: 2018
Publisher: European Association for Machine Translation
Bibliographic citation: Tars, Sander; Fishel, Mark. "Multi-Domain Neural Machine Translation". In: Pérez-Ortiz, Juan Antonio, et al. (Eds.). Proceedings of the 21st Annual Conference of the European Association for Machine Translation: 28-30 May 2018, Universitat d'Alacant, Alacant, Spain, pp. 259-268.
Abstract: We present an approach to neural machine translation (NMT) that supports multiple domains in a single model and allows switching between domains at translation time. The core idea is to treat text domains as distinct languages and use multilingual NMT methods to create multi-domain translation systems; we show that this approach yields significant translation quality gains over fine-tuning. We also explore whether knowledge of pre-specified text domains is necessary; it turns out that it is, but also that when the domain is not known, quite high translation quality can still be reached, in some cases even higher than with known domains.
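The abstract's core idea, treating text domains as if they were languages in a multilingual NMT setup, is commonly realised by prepending a domain token to each source sentence, in the style of multilingual NMT target-language tokens. The sketch below is illustrative only; the token format, function name, and example sentences are assumptions, not taken from the paper.

```python
def tag_source(sentence: str, domain: str) -> str:
    """Prepend a domain token to a source sentence, analogous to the
    target-language tokens used in multilingual NMT. The <2domain>
    token format is illustrative, not the paper's exact convention."""
    return f"<2{domain}> {sentence}"

# Training data from several domains is mixed into a single corpus,
# with each sentence carrying its domain token:
corpus = [
    tag_source("The patient was administered 5 mg.", "medical"),
    tag_source("The parties agree to the following terms.", "legal"),
]

# At translation time, switching domains is just switching the token:
print(tag_source("This works well.", "medical"))
# prints "<2medical> This works well."
```

Because the domain token is an ordinary vocabulary item, a single trained model can be steered between domains without retraining, which is what distinguishes this setup from per-domain fine-tuning.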
Sponsor(s): This work was supported by the Estonian Research Council grant no. 1226.
URI: http://hdl.handle.net/10045/76088
ISBN: 978-84-09-01901-4
Language: eng
Type: info:eu-repo/semantics/conferenceObject
Rights: © 2018 The authors. This article is licensed under a Creative Commons Attribution-NoDerivatives 3.0 licence (CC BY-ND).
Peer reviewed: yes
Publisher's version: http://eamt2018.dlsi.ua.es/proceedings-eamt2018.pdf
Appears in collections: Congresos - EAMT2018 - Proceedings

Files in this item:
File: EAMT2018-Proceedings_28.pdf (1.54 MB, Adobe PDF)


This item is licensed under a Creative Commons licence.