-
Crowdsourcing technologies rely on groups of people to input information that may be critical for decision-making. This work examines obfuscation in the context of reporting technologies. We show that widespread use of reporting platforms comes with unique security and privacy implications, and introduce a threat model and corresponding taxonomy to outline some of the many attack vectors in this space. We then perform an empirical analysis of a dataset of call logs from a controversial, real-world reporting hotline and identify coordinated obfuscation strategies that are intended to hinder the platform's legitimacy. We propose a variety of statistical measures to quantify the strength of this obfuscation strategy with respect to the structural and semantic characteristics of the reporting attacks in our dataset.
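The abstract does not specify which statistical measures were used; as a minimal illustrative sketch (not the paper's method), one simple structural measure is the average pairwise textual similarity among reports, since coordinated obfuscation campaigns often reuse templates or scripts. The report texts below are hypothetical stand-ins for hotline call logs.

```python
# Hedged sketch: quantify how "scripted" a batch of reports looks via
# mean pairwise Jaccard similarity of their word sets. High values suggest
# near-duplicate, template-driven submissions. All inputs are hypothetical.

from itertools import combinations


def token_jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two report texts."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)


def mean_pairwise_similarity(reports: list[str]) -> float:
    """Average similarity over all report pairs (0.0 if fewer than two reports)."""
    pairs = list(combinations(reports, 2))
    if not pairs:
        return 0.0
    return sum(token_jaccard(a, b) for a, b in pairs) / len(pairs)


if __name__ == "__main__":
    reports = [
        "caller reports a violation at 123 main st",
        "caller reports a violation at 123 main st",  # copy-pasted script
        "noise complaint from a neighbor on elm ave",
    ]
    print(f"mean pairwise similarity: {mean_pairwise_similarity(reports):.2f}")
```

A semantic analogue could replace the word-set overlap with embedding-based similarity; the structural version above is kept dependency-free for clarity.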
-
In 1996, philosopher Helen Nissenbaum issued a clarion call concerning the erosion of accountability in society due to the ubiquitous delegation of consequential functions to computerized systems. Using the conceptual framing of moral blame, Nissenbaum described four types of barriers to accountability that computerization presented: 1) "many hands," the problem of attributing moral responsibility for outcomes caused by many moral actors; 2) "bugs," a way software developers might shrug off responsibility by suggesting software errors are unavoidable; 3) "computer as scapegoat," shifting blame to computer systems as if they were moral actors; and 4) "ownership without liability," a free pass to the tech industry to deny responsibility for the software they produce. We revisit these four barriers in relation to the recent ascendance of data-driven algorithmic systems — technology often folded under the heading of machine learning (ML) or artificial intelligence (AI) — to uncover the new challenges for accountability that these systems present. We then look ahead to how one might construct and justify a moral, relational framework for holding responsible parties accountable, and argue that the FAccT community is uniquely well-positioned to develop such a framework to weaken the four barriers.
-
Many high-stakes policies can be modeled as a sequence of decisions along a pipeline. We are interested in auditing such pipelines for both efficiency and equity. Our empirical focus is on policy decisions made by the New York City government. Using a dataset of over 100,000 crowdsourced resident requests for potentially hazardous tree maintenance in New York City, we observe a sequence of city government decisions about whether to inspect and work on a reported incident. At each decision point in the pipeline, we define parity definitions and tests to identify inefficient, inequitable treatment. Disparities in resource allocation and scheduling across census tracts are reported as preliminary results.
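The abstract does not give the exact parity tests; as a minimal sketch under assumptions, one common way to check for disparate treatment at a single pipeline stage is a two-proportion z-test comparing the rate at which reported incidents are inspected in two census tracts. The counts below are hypothetical, not the paper's data.

```python
# Hedged sketch: two-proportion z-test for H0 "equal inspection rates"
# at one stage of a decision pipeline. Not the paper's exact methodology.

from math import sqrt, erf


def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) comparing two inspection rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value


if __name__ == "__main__":
    # Hypothetical counts: requests inspected vs. requests filed per tract.
    z, p = two_proportion_z(success_a=420, n_a=1000, success_b=360, n_b=1000)
    print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value flags a potential disparity
```

In practice, such a test would be repeated across all tracts and all pipeline stages (inspection, work order, completion), with appropriate multiple-comparison corrections.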