
Search for: All records

Award ID contains: 1751314

  1. The social impacts of computer technology are often glorified in public discourse, but there is growing concern about its actual effects on society. In this article, we ask: how does “consent” as an analytical framework make visible the social dynamics and power relations in the capture, extraction, and labor of data science knowledge production? We hypothesize that a form of boundary violation in data science workplaces—gender harassment—may correlate with the ways humans’ lived experiences are extracted to produce Big Data. The concept of consent offers a useful way to draw comparisons between gender relations in data science and the means by which machines are trained to learn and reason. Inspired by how Big Tech leaders describe unsupervised machine learning, and the co-optation of “revolutionary” rhetoric they use to do so, we introduce a concept we call “techniques of invisibility.” Techniques of invisibility are the ways in which an extreme imbalance between exposure and opacity, demarcated along fault lines of power, is fabricated and maintained, closing down the possibility for bidirectional transparency in the production and applications of algorithms. Further, techniques of invisibility, which we group into two categories—epistemic injustice and the Brotherhood—include acts of subjection by powerful actors in data science designed to quell resistance to exploitative relations. These techniques may be useful in making further connections between epistemic violence, sexism, and surveillance, sussing out persistent boundary violations in data science to render the social in data science visible and open to scrutiny and debate.
  2. Articulating a Succinct Description uses ethnographic data to create case study interventions facilitated with people who belong to the culture with which the ethnographer is engaged. We do so in order to disseminate research findings, address problems presented in the case, and collect additional data for further collective analysis. Further, Articulating a Succinct Description is designed as a means of intervention for underrepresented group members to be heard and gain support, and to promote equity engagement among majority members in efforts to create more inclusive cultures. In this paper, we validate this method using findings from its application with engineering students at a public university. This method allowed us to view engineering culture not as monolithic, but rather as one with multiple sets of cultural beliefs, values, and behaviors. In particular, we noted a behavior among students we’ve called Swing Staters: they expressed meritocratic beliefs yet, we argue, may be critical to reducing bias in engineering education. These findings, analyzed along interwoven threads of race and gender, demonstrate the efficacy of the Articulating a Succinct Description method and contribute to efforts in engineering education to advance pedagogical tools that reduce bias and exclusion in these fields.