Multi-Source Domain Adaptation with Mixture of Experts

Bibliographic Details
Main Authors: Guo, Jiang (Author), Shah, Darsh (Author), Barzilay, Regina (Author)
Format: Article
Language: English
Published: Association for Computational Linguistics (ACL), 2018.
Online Access: Get fulltext
Description
Summary: We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
© 2018 Association for Computational Linguistics.
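The abstract describes the mechanism only at a high level. Below is a minimal PyTorch-style sketch of how such a mixture could be wired, assuming a bilinear point-to-set distance from a target example to each source domain's mean encoding, with mixing weights given by a softmax over negated distances. All class names, tensor shapes, and the specific metric form here are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEDomainAdapter(nn.Module):
    """Sketch: combine per-source-domain experts with weights derived
    from a learned point-to-set metric (names/shapes are assumptions)."""

    def __init__(self, input_dim, hidden_dim, num_classes, num_domains):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        # One expert classifier per source domain.
        self.experts = nn.ModuleList(
            [nn.Linear(hidden_dim, num_classes) for _ in range(num_domains)]
        )
        # Learned metric parameter (assumption: bilinear form, initialized
        # to the identity, i.e. squared Euclidean distance at the start).
        self.metric = nn.Parameter(torch.eye(hidden_dim))

    def point_to_set_distance(self, z, domain_set):
        # z: (hidden_dim,) encoded target example.
        # domain_set: (n_i, hidden_dim) encoded examples from one source
        # domain; here the "set" is summarized by its mean encoding.
        diff = z - domain_set.mean(dim=0)
        return diff @ self.metric @ diff  # scalar distance

    def forward(self, x, domain_sets):
        z = self.encoder(x)  # (batch, hidden_dim)
        outputs = []
        for zi in z:
            # Smaller distance to a domain => larger mixing weight.
            dists = torch.stack(
                [self.point_to_set_distance(zi, s) for s in domain_sets]
            )
            alphas = F.softmax(-dists, dim=0)            # (num_domains,)
            logits = torch.stack([e(zi) for e in self.experts])  # (D, C)
            outputs.append((alphas.unsqueeze(1) * logits).sum(dim=0))
        return torch.stack(outputs)  # (batch, num_classes)

# Example usage with random data (all dimensions arbitrary):
model = MoEDomainAdapter(input_dim=50, hidden_dim=32, num_classes=2, num_domains=3)
domain_sets = [model.encoder(torch.randn(20, 50)) for _ in range(3)]
logits = model(torch.randn(4, 50), domain_sets)  # shape (4, 2)

The sketch covers only the forward combination step. Per the abstract, the metric itself is learned without target labels via meta-training; one common realization of that idea is leave-one-source-out training, where a held-out source domain plays the role of a pseudo-target, though that detail is not spelled out in this record.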