Search for: All records

Creators/Authors contains: "Harper, F"


  1. Langran, E (Ed.)
    A growing movement towards expanding computer science education in K-12 has broadened gaps in computing opportunities along lines of race, ethnicity, class, and gender. Emergent theories and practices related to culturally responsive computing show promise in addressing this gap; however, little is known about engaging culturally and linguistically diverse preschoolers in computer science. The current study utilized qualitative content analysis to explore how an extant theory of Culturally Responsive Computing aligns with an early childhood culturally relevant robotics curriculum. Findings suggest that while the assumptions of culturally responsive computing were evident throughout the curriculum, there are several key considerations when extending the theory to early childhood contexts. Overarching themes included (1) emphasizing the value of non-digital tools and activities and (2) aligning the goals of culturally responsive computing with children’s current level of social development. 
  2. Algorithmic decision-making systems are increasingly used throughout the public and private sectors to make important decisions or assist humans in making these decisions with real social consequences. While there has been substantial research in recent years to build fair decision-making algorithms, there has been less research seeking to understand the factors that affect people's perceptions of fairness in these systems, which we argue is also important for their broader acceptance. In this research, we conduct an online experiment to better understand perceptions of fairness, focusing on three sets of factors: algorithm outcomes, algorithm development and deployment procedures, and individual differences. We find that people rate the algorithm as more fair when the algorithm predicts in their favor, even surpassing the negative effects of describing algorithms that are very biased against particular demographic groups. We find that this effect is moderated by several variables, including participants' education level, gender, and several aspects of the development procedure. Our findings suggest that systems that evaluate algorithmic fairness through users' feedback must consider the possibility of "outcome favorability" bias. 