Downloads
Meta Self-training for Few-shot Neural Sequence Labeling [Code]
October 2021
This is the implementation of the paper Meta Self-training for Few-shot Neural Sequence Labeling. MetaST is short for meta-learning for self-training.
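To make the idea concrete, here is a minimal, generic sketch of the self-training loop the title describes: a teacher model pseudo-labels unlabeled sentences, and only high-confidence token predictions are kept to train a student. The model, threshold, and data below are illustrative assumptions, not the MetaST repo's API (the paper's meta-learned re-weighting of pseudo-labels is omitted here).

```python
# Sketch of self-training for sequence labeling (hypothetical model/data;
# not the MetaST repo's API).
import torch
import torch.nn as nn

NUM_TAGS, VOCAB, EMB, CONF_THRESHOLD = 5, 1000, 32, 0.9

class Tagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.out = nn.Linear(EMB, NUM_TAGS)

    def forward(self, tokens):              # tokens: (batch, seq)
        return self.out(self.emb(tokens))   # logits: (batch, seq, tags)

teacher, student = Tagger(), Tagger()
unlabeled = torch.randint(0, VOCAB, (8, 12))    # stand-in unlabeled batch

# Teacher pseudo-labels the unlabeled batch; keep confident tokens only.
with torch.no_grad():
    probs = teacher(unlabeled).softmax(-1)
    conf, pseudo = probs.max(-1)                # per-token confidence/label
mask = conf > CONF_THRESHOLD

# Train the student on the masked pseudo-labels.
loss_fn = nn.CrossEntropyLoss(reduction="none")
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
logits = student(unlabeled)
token_loss = loss_fn(logits.view(-1, NUM_TAGS), pseudo.view(-1))
loss = (token_loss * mask.view(-1).float()).sum() / mask.sum().clamp(min=1)
loss.backward()
opt.step()
```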
Meta Representation Transformation for Low-resource Cross-Lingual Learning [Code]
May 2021
This is the source code release for a paper published at NAACL 2021. Paper title: MetaXL: Meta Representation Transformation for Low-resource Cross-Lingual Learning. Paper abstract: The combination of multilingual pre-trained representations and cross-lingual transfer learning is one of the most…
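As a rough illustration of the "representation transformation" in the title, the sketch below inserts a small residual bottleneck network between encoder hidden states and the task classifier. All names and shapes here are assumptions for illustration, not the released code; in MetaXL this transformation is meta-learned so that transfer-language representations better serve the low-resource target.

```python
# Generic representation-transformation layer (illustrative; not the
# MetaXL repo's API).
import torch
import torch.nn as nn

HIDDEN, BOTTLENECK, NUM_CLASSES = 768, 128, 2

class RepresentationTransform(nn.Module):
    """Small residual bottleneck applied to encoder hidden states."""
    def __init__(self):
        super().__init__()
        self.down = nn.Linear(HIDDEN, BOTTLENECK)
        self.up = nn.Linear(BOTTLENECK, HIDDEN)

    def forward(self, h):
        return h + self.up(torch.tanh(self.down(h)))  # residual transform

transform = RepresentationTransform()
classifier = nn.Linear(HIDDEN, NUM_CLASSES)

h_source = torch.randn(4, HIDDEN)       # stand-in encoder outputs
logits = classifier(transform(h_source))
print(logits.shape)                     # torch.Size([4, 2])
```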
Self-training with Weak Supervision [Code]
April 2021
State-of-the-art deep neural networks require large-scale labeled training data that is often either expensive to obtain or not available for many tasks. Weak supervision in the form of domain-specific rules has been shown to be useful in such settings to…
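The following toy sketch shows what "weak supervision in the form of domain-specific rules" can look like in practice: each rule votes a label or abstains, and votes are aggregated by majority before a model is trained on the result. The rules and data are hypothetical examples, not the released code's API.

```python
# Toy weak supervision via labeling rules (hypothetical rules/data;
# not the released code's API).
from collections import Counter

def rule_positive(text):
    return 1 if "great" in text else None      # abstains when no match

def rule_negative(text):
    return 0 if "terrible" in text else None

RULES = [rule_positive, rule_negative]

def weak_label(text):
    """Aggregate non-abstaining rule votes by majority; None if no rule fires."""
    votes = [v for v in (r(text) for r in RULES) if v is not None]
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

corpus = ["a great movie", "terrible acting", "just okay"]
labeled = [(t, weak_label(t)) for t in corpus]
print(labeled)  # [('a great movie', 1), ('terrible acting', 0), ('just okay', None)]
```

Examples where every rule abstains (the `None` cases) are exactly where self-training over unlabeled data is meant to help.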
Microsoft Icecaps: An Open-Source Toolkit for Conversation Modeling
August 2019
Microsoft Icecaps is a new open-source NLP toolkit featuring pre-trained models and an emphasis on conversational scenarios.