<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Conference Paper</dc:product_type><dc:title>Measuring Group Advantage: A Comparative Study of Fair Ranking Metrics</dc:title><dc:creator>Kuhlman, Caitlin; Gerych, Walter; Rundensteiner, Elke</dc:creator><dc:corporate_author/><dc:editor/><dc:description>Ranking evaluation metrics play an important role in information
retrieval, providing optimization objectives during development
and a means of assessing deployed performance. Recently, fairness
of rankings has been recognized as crucial, especially as automated
systems are increasingly used for high impact decisions.
While numerous fairness metrics have been proposed, a comparative
analysis to understand their interrelationships is lacking. Even
for fundamental statistical parity metrics which measure group
advantage, it remains unclear whether metrics measure the same
phenomena, or when one metric may produce different results than
another. To address these open questions, we formulate a conceptual
framework for analytical comparison of metrics. We prove that
under reasonable assumptions, popular metrics in the literature
exhibit the same behavior and that optimizing for one optimizes
for all. However, our analysis also shows that the metrics vary in
the degree of unfairness measured, in particular when one group
has a strong majority. Based on this analysis, we design a practical
statistical test to identify whether observed data is likely to exhibit
predictable group bias. We provide a set of recommendations for
practitioners to guide the choice of an appropriate fairness metric.</dc:description><dc:publisher/><dc:date>2021-05-01</dc:date><dc:nsf_par_id>10251285</dc:nsf_par_id><dc:journal_name>Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (AIES ’21)</dc:journal_name><dc:journal_volume/><dc:journal_issue/><dc:page_range_or_elocation/><dc:issn/><dc:isbn/><dc:doi>https://doi.org/10.1145/3461702.3462588</dc:doi><dcq:identifierAwardId>2007932</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location/><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>