Meta-LMTC: Meta-Learning for Large-Scale Multi-Label Text Classification

Published in EMNLP, 2021

Ran Wang, Xi’ao Su, Siyu Long, Xinyu Dai, Shujian Huang, Jiajun Chen

Large-scale multi-label text classification (LMTC) tasks often face long-tailed label distributions, where many labels have few or even no training instances. Although current methods can exploit prior knowledge to handle these few/zero-shot labels, they neglect the meta-knowledge contained in the dataset that can guide models to learn with few samples. In this paper, for the first time, this problem is addressed from a meta-learning perspective. However, a simple extension of meta-learning approaches to multi-label classification is suboptimal for LMTC tasks due to the long-tailed label distribution and the coexistence of few- and zero-shot scenarios. We propose a meta-learning approach named META-LMTC. Specifically, it constructs more faithful and more diverse tasks according to well-designed sampling strategies and directly incorporates the objective of adapting to new low-resource tasks into the meta-learning phase. Extensive experiments show that META-LMTC achieves state-of-the-art performance against strong baselines and can still enhance powerful BERT-like models.
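The abstract does not spell out the algorithm, but the two ingredients it names, frequency-aware task construction and a meta-objective that optimizes for fast adaptation, can be illustrated with a minimal sketch. The code below assumes a MAML-style inner/outer loop over a toy multi-label classifier; all names (`MetaClassifier`, `sample_task`, `meta_train_step`) and the specific sampling weights are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: MAML-style meta-learning with frequency-aware
# task sampling for multi-label classification. Not the paper's code.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaClassifier(nn.Module):
    """Toy multi-label classifier: one linear layer over fixed features."""
    def __init__(self, feat_dim: int, num_labels: int):
        super().__init__()
        self.linear = nn.Linear(feat_dim, num_labels)

    def forward(self, x, params=None):
        # Allow forwarding with externally adapted parameters (inner loop).
        w, b = (params if params is not None
                else (self.linear.weight, self.linear.bias))
        return F.linear(x, w, b)

def sample_task(features, labels, label_freq, k_support=4, k_query=4):
    """Bias task construction toward tail labels so meta-training
    resembles the few-shot conditions seen at test time (assumption)."""
    weights = 1.0 / (label_freq + 1.0)   # rarer label -> higher weight
    probs = weights / weights.sum()
    tail_label = int(torch.multinomial(probs, 1))
    pos = (labels[:, tail_label] > 0).nonzero(as_tuple=True)[0].tolist()
    idx = random.choices(pos if pos else range(len(features)),
                         k=k_support + k_query)
    s, q = idx[:k_support], idx[k_support:]
    return (features[s], labels[s]), (features[q], labels[q])

def meta_train_step(model, tasks, inner_lr=0.1, meta_opt=None):
    """One MAML-style outer step: adapt on each task's support set,
    then backpropagate the query loss of the adapted parameters."""
    meta_opt.zero_grad()
    for (xs, ys), (xq, yq) in tasks:
        params = (model.linear.weight, model.linear.bias)
        # Inner loop: one gradient step on the support set.
        loss_s = F.binary_cross_entropy_with_logits(model(xs, params), ys)
        grads = torch.autograd.grad(loss_s, params, create_graph=True)
        adapted = tuple(p - inner_lr * g for p, g in zip(params, grads))
        # Outer objective: query loss after adaptation.
        loss_q = F.binary_cross_entropy_with_logits(model(xq, adapted), yq)
        loss_q.backward()
    meta_opt.step()

# Minimal usage on random data (100 docs, 32-dim features, 50 labels).
torch.manual_seed(0)
feats = torch.randn(100, 32)
labs = (torch.rand(100, 50) > 0.9).float()   # sparse, long-tailed labels
freq = labs.sum(dim=0)
model = MetaClassifier(32, 50)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
tasks = [sample_task(feats, labs, freq) for _ in range(4)]
meta_train_step(model, tasks, meta_opt=opt)
```

The sketch makes the outer update depend on performance *after* adaptation, which is one way to "directly incorporate the objective of adapting to new low-resource tasks" into meta-training; the inverse-frequency sampling stands in for the paper's well-designed task-sampling strategies.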

Download here