RANKING PROBLEMS IN THE PRESENCE OF IMPLICIT BIAS

Abstract

Implicit bias is the unconscious attribution of particular qualities (or the lack thereof) to members of a particular social group (e.g., one defined by race or gender). Studies of implicit bias have shown that these unconscious stereotypes can lead to adverse outcomes in various social contexts, such as job screening, teaching, or policing. This dissertation advocates applying fairness-based re-ranking methods to improve fairness for all items, which, perhaps surprisingly, comes at little cost to utility and can even improve it. We present our key contributions to ranking in the presence of implicit bias, including a theorem proving that, under simplifying assumptions on item utilities, simple, well-studied constraints ensure that utility does not decrease with respect to a naive ranking. Finally, we augment our theoretical results with empirical findings on real-world distributions from the IIT-JEE (2009) dataset.
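
As a rough, illustrative sketch of the setting described in the abstract (not the thesis's actual algorithm, constraints, or experiments), the short Python simulation below assumes the common multiplicative implicit-bias model: items from one group have their observed utility scaled down by a hypothetical bias factor beta < 1. It compares the true utility collected in the top-k positions by a naive ranking (sorting on the biased observations) against a simple greedy re-ranking that enforces an assumed minimum group share in every prefix; the group names, beta, min_share, and the greedy rule are all illustrative choices.

import random


def simulate(n_per_group=50, k=20, beta=0.7, min_share=0.4, seed=0):
    """Toy simulation of ranking under a multiplicative implicit-bias model.

    Items from two groups, A and B, have i.i.d. true utilities; the observed
    utility of group-B items is scaled by a bias factor beta < 1. We compare
    the total true utility of the top-k positions under a naive ranking
    (sort by observed utility) and under a greedy ranking that keeps at
    least a min_share fraction of every prefix filled with group-B items.
    """
    rng = random.Random(seed)
    items = []
    for group in ("A", "B"):
        for _ in range(n_per_group):
            u = rng.random()                       # true (latent) utility
            obs = u if group == "A" else beta * u  # biased observed utility
            items.append({"group": group, "true": u, "obs": obs})

    # Naive ranking: sort everything by the biased observed utility.
    naive = sorted(items, key=lambda x: x["obs"], reverse=True)

    # Constrained ranking: fill positions greedily by observed utility, but
    # take a group-B item whenever the prefix would otherwise fall below
    # the required share of group-B items.
    pool = {
        g: sorted((x for x in items if x["group"] == g),
                  key=lambda x: x["obs"], reverse=True)
        for g in ("A", "B")
    }
    ranked, b_count = [], 0
    for pos in range(1, len(items) + 1):
        must_take_b = pool["B"] and b_count < min_share * pos
        if must_take_b or not pool["A"]:
            choice = pool["B"].pop(0)
            b_count += 1
        elif not pool["B"] or pool["A"][0]["obs"] >= pool["B"][0]["obs"]:
            choice = pool["A"].pop(0)
        else:
            choice = pool["B"].pop(0)
            b_count += 1
        ranked.append(choice)

    def top_true_utility(ranking):
        return sum(x["true"] for x in ranking[:k])

    return top_true_utility(naive), top_true_utility(ranked)


if __name__ == "__main__":
    naive_u, constrained_u = simulate()
    print(f"total true utility of top 20, naive ranking:       {naive_u:.2f}")
    print(f"total true utility of top 20, constrained ranking: {constrained_u:.2f}")

Under parameters like these, the constrained ranking typically places more of the truly high-utility but down-weighted items near the top, which is the qualitative effect the abstract's theorem formalizes under its simplifying assumptions.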

Description

58 pages

Date Issued

2022-12

Keywords

Artificial Intelligence; Bias and Fairness; Machine Learning

Committee Chair

Joachims, Thorsten

Committee Member

Cardie, Claire

Degree Discipline

Computer Science

Degree Name

M.S., Computer Science

Degree Level

Master of Science

Rights

Attribution-NoDerivatives 4.0 International

Types

dissertation or thesis
