Fairness Aware Counterfactuals for Subgroups

Publication
NeurIPS 2023 - 37th Conference on Neural Information Processing Systems

We propose novel fairness definitions concerning algorithmic recourse. For an individual who receives an undesirable outcome (e.g., a rejected loan application), recourse is a way to reverse that outcome (e.g., by increasing the down payment). Recourse incurs a cost for the individual that can be measured quantitatively. Our definitions ask whether subpopulations face comparable recourse costs, i.e., bear the same burden to reverse an undesirable outcome. We have developed a method, termed FACTS, to audit a model for this notion of fairness, i.e., to find subgroups where unfairness of recourse exists.
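
To make the idea concrete, below is a minimal, purely illustrative sketch of the comparison at the heart of these definitions: within a subgroup, apply the same candidate recourse action (same cost) to the rejected individuals of each protected subpopulation and compare how often the outcome flips. The toy data, the classifier, the subgroup predicate, the action, and the `effectiveness` helper are all assumptions made up for illustration; this is not the FACTS algorithm itself, which searches over subgroups and actions systematically.

```python
# Illustrative sketch only (not the authors' FACTS implementation):
# compare the effectiveness of one recourse action across two protected
# subpopulations inside a subgroup under audit.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical loan data: one protected attribute, two actionable features.
X = pd.DataFrame({
    "sex": rng.integers(0, 2, 500),              # protected attribute
    "income": rng.normal(50, 15, 500),
    "down_payment": rng.normal(20, 5, 500),
})
y = (0.04 * X["income"] + 0.1 * X["down_payment"]
     + 0.3 * rng.normal(size=500) > 4.2).astype(int)

clf = LogisticRegression().fit(X, y)

# Subgroup under audit and a candidate recourse action (both hypothetical).
subgroup = X["income"] < 45                      # "low income" subgroup
action = {"down_payment": +5}                    # e.g., increase down payment

def effectiveness(X, group_mask, action):
    """Fraction of rejected individuals in the group whose prediction flips
    to positive after applying the action (everyone pays the same cost)."""
    rejected = group_mask & (clf.predict(X) == 0)
    if not rejected.any():
        return 0.0
    X_cf = X.loc[rejected].copy()
    for feat, delta in action.items():
        X_cf[feat] += delta
    return float(np.mean(clf.predict(X_cf) == 1))

# If the same action helps one protected group much more than the other,
# the subgroup is a candidate for unfair recourse.
eff_by_sex = {s: effectiveness(X, subgroup & (X["sex"] == s), action)
              for s in (0, 1)}
print(eff_by_sex)
```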

Figure: A discovered subgroup where unfairness of recourse exists.

Our approach is integrated into the IBM AI Fairness 360 (AIF360) toolkit. See a demo notebook here.