Few-shot tabular data enrichment using fine-tuned transformer architectures

Gilad Katz, Asaf Harari

Research output: Contribution to conference › Paper › peer-review

Abstract

The enrichment of tabular datasets using external sources has gained significant attention in recent years. Existing solutions, however, either ignore external unstructured data completely or devise dataset-specific solutions. In this study, we propose Few-Shot Transformer-based Enrichment (FeSTE), a generic and robust framework for the enrichment of tabular datasets using unstructured data. By training over multiple datasets, our approach is able to develop generic models that can be applied to additional datasets with minimal training (i.e., few-shot). Our approach is based on an adaptation of BERT, for which we present a novel fine-tuning approach that reformulates the tuples of the datasets as sentences. Our evaluation, conducted on 17 datasets, shows that FeSTE is able to generate high-quality features and significantly outperform existing fine-tuning solutions.
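The abstract's core idea, reformulating dataset tuples as sentences before fine-tuning BERT, can be illustrated with a short sketch. This is not the authors' exact FeSTE implementation: the serialization template, column names, labels, and single-step training loop below are illustrative assumptions; a real few-shot setup would first fine-tune on many source datasets and then adapt briefly to the target dataset.

```python
# Minimal sketch: serialize tabular tuples as sentences and fine-tune BERT.
# All data, templates, and hyperparameters here are hypothetical examples.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

def tuple_to_sentence(row: dict) -> str:
    # Turn one tabular tuple into natural language so BERT can consume it.
    return " ".join(f"The {col} is {val}." for col, val in row.items())

rows = [
    {"city": "Paris", "country": "France", "population": "2.1 million"},
    {"city": "Lyon", "country": "France", "population": "0.5 million"},
]
labels = torch.tensor([1, 0])  # hypothetical binary target column

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

batch = tokenizer(
    [tuple_to_sentence(r) for r in rows],
    padding=True, truncation=True, return_tensors="pt",
)

# One gradient step of fine-tuning on the reformulated tuples.
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
```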
Original language: English
Pages: 1577-1591
DOIs
State: Published - 2022
Event: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics
Duration: 22 May 2022 – 27 May 2022

Conference

Conference: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics
Period: 22/05/22 – 27/05/22
