TY - JOUR
T1 - Bidirectional Encoder Representations from Transformers in Radiology
T2 - A Systematic Review of Natural Language Processing Applications
AU - Gorenstein, Larisa
AU - Konen, Eli
AU - Green, Michael
AU - Klang, Eyal
N1 - Publisher Copyright:
© 2024 American College of Radiology
PY - 2024/6/1
Y1 - 2024/6/1
N2 - Introduction: Bidirectional Encoder Representations from Transformers (BERT), introduced in 2018, has revolutionized natural language processing. Its bidirectional understanding of word context has enabled innovative applications, notably in radiology. This study aimed to assess BERT's influence and applications within the radiologic domain. Methods: Adhering to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a systematic review, searching PubMed for literature on BERT-based models and natural language processing in radiology from January 1, 2018, to February 12, 2023. The search encompassed keywords related to generative models, transformer architecture, and various imaging techniques. Results: Of 597 results, 30 met our inclusion criteria. The remaining studies were unrelated to radiology or did not use BERT-based models. The included studies were retrospective, with 14 published in 2022. The primary focus was on classification and information extraction from radiology reports, with x-rays as the most prevalent imaging modality. Specific investigations included automatic CT protocol assignment and deep learning applications in chest x-ray interpretation. Conclusion: This review underscores the primary application of BERT in radiology for report classification. It also reveals emerging BERT applications for protocol assignment and report generation. As BERT technology advances, we foresee further innovative applications. Its implementation in radiology holds potential for enhancing diagnostic precision, expediting report generation, and optimizing patient care.
AB - Introduction: Bidirectional Encoder Representations from Transformers (BERT), introduced in 2018, has revolutionized natural language processing. Its bidirectional understanding of word context has enabled innovative applications, notably in radiology. This study aimed to assess BERT's influence and applications within the radiologic domain. Methods: Adhering to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a systematic review, searching PubMed for literature on BERT-based models and natural language processing in radiology from January 1, 2018, to February 12, 2023. The search encompassed keywords related to generative models, transformer architecture, and various imaging techniques. Results: Of 597 results, 30 met our inclusion criteria. The remaining studies were unrelated to radiology or did not use BERT-based models. The included studies were retrospective, with 14 published in 2022. The primary focus was on classification and information extraction from radiology reports, with x-rays as the most prevalent imaging modality. Specific investigations included automatic CT protocol assignment and deep learning applications in chest x-ray interpretation. Conclusion: This review underscores the primary application of BERT in radiology for report classification. It also reveals emerging BERT applications for protocol assignment and report generation. As BERT technology advances, we foresee further innovative applications. Its implementation in radiology holds potential for enhancing diagnostic precision, expediting report generation, and optimizing patient care.
KW - Language models
KW - natural language processing
KW - radiology
UR - http://www.scopus.com/inward/record.url?scp=85188219846&partnerID=8YFLogxK
U2 - 10.1016/j.jacr.2024.01.012
DO - 10.1016/j.jacr.2024.01.012
M3 - Review article
C2 - 38302036
AN - SCOPUS:85188219846
SN - 1546-1440
VL - 21
SP - 914
EP - 941
JO - Journal of the American College of Radiology
JF - Journal of the American College of Radiology
IS - 6
ER -