TY - UNPB
T1 - Recent advances in natural language processing via large pre-trained language models: A survey
AU - Min, Bonan
AU - Ross, Hayley
AU - Sulem, Elior
AU - Veyseh, Amir Pouran Ben
AU - Nguyen, Thien Huu
AU - Sainz, Oscar
AU - Agirre, Eneko
AU - Heintz, Ilana
AU - Roth, Dan
PY - 2021
AB - Large, pre-trained transformer-based language models such as BERT have drastically changed the Natural Language Processing (NLP) field. We present a survey of recent work that uses these large language models to solve NLP tasks via pre-training then fine-tuning, prompting, or text generation approaches. We also present approaches that use pre-trained language models to generate data for training augmentation or other purposes. We conclude with discussions on limitations and suggested directions for future research.
DO - 10.48550/arXiv.2111.01243
M3 - Preprint
ER -