Practical Fact Checking System for LLMs

Gilad Fuchs, Oded Zinman, Ido Ben-Shaul

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The use of Large Language Models (LLMs) such as ChatGPT in real-world product solutions is significantly limited by the well-known issue of hallucinations. Various methods exist to mitigate this issue automatically, such as using a second LLM to provide feedback on the accuracy of the generated text or examining the consistency of multiple sampled responses. However, these approaches do not guarantee factual accuracy, which is crucial in many specialized domains. Enhancing the factual correctness of LLM-generated text therefore requires a combination of manual annotation and supportive tooling. We propose a practical fact-checking system tailored specifically to LLMs, which takes a hybrid approach combining human and machine evaluation of the correctness of the generated text. This is particularly vital in fields where hallucinations present significant challenges. We use proprietary LLMs, both directly and through Retrieval-Augmented Generation (RAG), to offer users informed feedback on potential hallucinations via a user-friendly interface. We apply our methodology to the task of generating aspect values for video-game listings on an e-commerce marketplace, demonstrating the utility of our approach.
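One of the automatic signals mentioned above is output consistency across multiple sampled responses. A minimal sketch of that idea follows; the `sample_fn` callable and the stub LLM are hypothetical stand-ins, not part of the paper's system, which a real deployment would replace with an actual LLM API call at a non-zero temperature.

```python
from collections import Counter


def consistency_score(sample_fn, prompt, n=5):
    """Sample n responses and return the most common answer together
    with its agreement rate -- a rough hallucination signal, since
    low agreement suggests the model is guessing."""
    answers = [sample_fn(prompt) for _ in range(n)]
    top, count = Counter(answers).most_common(1)[0]
    return top, count / n


# Hypothetical stub standing in for a real (sampled) LLM call.
def stub_llm(prompt):
    return "PlayStation 5"


answer, score = consistency_score(stub_llm, "Platform aspect for this game listing?")
# With this deterministic stub, all samples agree, so score is 1.0;
# a real sampled model would typically yield a score below 1.0
# on prompts where it is prone to hallucinate.
```

In a system like the one described, such a score would only triage candidates: low-agreement outputs would be routed to the human annotators rather than accepted automatically.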

Original language: English
Title of host publication: WWW Companion 2025 - Companion Proceedings of the ACM Web Conference 2025
Publisher: Association for Computing Machinery, Inc
Pages: 2713-2716
Number of pages: 4
ISBN (Electronic): 9798400713316
DOIs
State: Published - 23 May 2025
Externally published: Yes
Event: 34th ACM Web Conference, WWW Companion 2025 - Sydney, Australia
Duration: 28 Apr 2025 - 2 May 2025

Publication series

Name: WWW Companion 2025 - Companion Proceedings of the ACM Web Conference 2025

Conference

Conference: 34th ACM Web Conference, WWW Companion 2025
Country/Territory: Australia
City: Sydney
Period: 28/04/25 - 2/05/25

Keywords

  • Fact-checking
  • Hallucinations
  • Large Language Models

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Software
