Time masking for temporal language models

Guy D. Rosin, Ido Guy, Kira Radinsky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

Our world is constantly evolving, and so is the content on the web. Consequently, our languages, often said to mirror the world, are dynamic in nature. However, most current contextual language models are static and cannot adapt to changes over time. In this work, we propose a temporal contextual language model called TempoBERT, which uses time as an additional context of texts. Our technique is based on modifying texts with temporal information and performing time masking - specific masking for the supplementary time information. We leverage our approach for the tasks of semantic change detection and sentence time prediction, experimenting on diverse datasets in terms of time, size, genre, and language. Our extensive evaluation shows that both tasks benefit from exploiting time masking.
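The abstract describes the core idea: augment each training text with a token encoding its writing time, then mask that token (in addition to ordinary random masking) so the model learns to associate content with time. The sketch below is a hypothetical illustration of that preprocessing step, not the paper's actual implementation; the token format `<year>`, the `mask_prob` value, and the function names are all assumptions.

```python
import random

MASK = "[MASK]"

def add_time_token(sentence: str, year: int) -> list[str]:
    # Assumed format: prepend a special token encoding the text's time period.
    return [f"<{year}>"] + sentence.split()

def time_mask(tokens: list[str], mask_prob: float = 0.15, rng=None):
    """Always mask the time token (position 0); mask other tokens with
    probability mask_prob, as in standard masked language modeling.
    Returns the masked sequence and a map from position to original token."""
    rng = rng or random.Random(0)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if i == 0 or rng.random() < mask_prob:
            targets[i] = tok       # record the prediction target
            masked.append(MASK)
        else:
            masked.append(tok)
    return masked, targets

tokens = add_time_token("the king spoke to parliament", 1850)
masked, targets = time_mask(tokens)
# targets[0] == "<1850>": the time token is always a prediction target,
# which is what supports sentence time prediction at inference.
```

Because the time token is always a target, predicting it at inference amounts to the sentence time prediction task, while comparing a word's masked predictions across periods supports semantic change detection.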

Original language: English
Title of host publication: WSDM 2022 - Proceedings of the 15th ACM International Conference on Web Search and Data Mining
Publisher: Association for Computing Machinery, Inc
Pages: 833-841
Number of pages: 9
ISBN (Electronic): 9781450391320
DOIs
State: Published - 11 Feb 2022
Event: 15th ACM International Conference on Web Search and Data Mining, WSDM 2022 - Virtual, Online, United States
Duration: 21 Feb 2022 – 25 Feb 2022

Publication series

Name: WSDM 2022 - Proceedings of the 15th ACM International Conference on Web Search and Data Mining

Conference

Conference: 15th ACM International Conference on Web Search and Data Mining, WSDM 2022
Country/Territory: United States
City: Virtual, Online
Period: 21/02/22 – 25/02/22

Keywords

  • Language models
  • Semantic change detection
  • Temporal semantics

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Software
