Dexer: Detecting and Explaining Biased Representation in Ranking

Yuval Moskovitch, Jinyang Li, H. V. Jagadish

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

With the growing use of ranking algorithms for real-life decision making, fairness in ranking has been recognized as an important issue. Recent works have studied different fairness measures in ranking, many of which consider the representation of different "protected groups" among the top-k ranked items, for any reasonable k. Given the protected groups, verifying algorithmic fairness is a simple task. However, the group definitions may not be known in advance. To this end, we present Dexer, a system for detecting groups with biased representation in the top-k. Dexer utilizes the notion of Shapley values to provide users with visual explanations of the cause of bias. We demonstrate the usefulness of Dexer using real-life data.
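The representation criterion described above can be illustrated with a minimal sketch. The code below is not Dexer's actual algorithm; it simply flags groups (given as hypothetical name-to-predicate mappings) whose share of the top-k falls below a chosen fraction `tau` of their share in the full ranking, an assumed notion of underrepresentation.

```python
def group_share(items, predicate):
    """Fraction of items in the list that satisfy the group predicate."""
    if not items:
        return 0.0
    return sum(predicate(item) for item in items) / len(items)

def biased_groups(ranked, predicates, k, tau=0.5):
    """Return names of groups whose top-k share is below tau times
    their share in the full ranking (i.e., underrepresented groups)."""
    topk = ranked[:k]
    flagged = []
    for name, pred in predicates.items():
        overall = group_share(ranked, pred)
        top = group_share(topk, pred)
        if overall > 0 and top < tau * overall:
            flagged.append(name)
    return flagged

# Toy example: a ranking of 10 candidates where the top 5 are all male.
ranked = [{"gender": "M"}] * 5 + [{"gender": "F"}] * 5
predicates = {"female": lambda r: r["gender"] == "F"}
print(biased_groups(ranked, predicates, k=5))  # ['female']
```

In practice the set of candidate groups is unknown in advance, which is exactly the gap Dexer addresses: it searches over group definitions rather than assuming a fixed list of predicates as this sketch does.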

Original language: English
Title of host publication: SIGMOD 2023 - Companion of the 2023 ACM/SIGMOD International Conference on Management of Data
Publisher: Association for Computing Machinery
Pages: 159-162
Number of pages: 4
ISBN (Electronic): 9781450395076
State: Published - 4 Jun 2023
Event: 2023 ACM/SIGMOD International Conference on Management of Data, SIGMOD 2023 - Seattle, United States
Duration: 18 Jun 2023 - 23 Jun 2023

Publication series

Name: Proceedings of the ACM SIGMOD International Conference on Management of Data
ISSN (Print): 0730-8078

Conference

Conference: 2023 ACM/SIGMOD International Conference on Management of Data, SIGMOD 2023
Country/Territory: United States
City: Seattle
Period: 18/06/23 - 23/06/23

Keywords

  • explanations
  • ranking fairness
  • representation bias

ASJC Scopus subject areas

  • Software
  • Information Systems
