Search for: All records

Creators/Authors contains: "Sanchez, Christopher"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Generative AI (genAI) tools, such as ChatGPT or Copilot, are advertised as improving developer productivity and are being integrated into software development. However, misaligned trust, skepticism, and usability concerns can impede the adoption of such tools. Research also indicates that AI can be exclusionary, failing to support diverse users adequately. One such aspect of diversity is cognitive diversity -- variations in users' cognitive styles that lead to divergent perspectives and interaction styles. When an individual's cognitive style is unsupported, it creates barriers to technology adoption. Therefore, to understand how to effectively integrate genAI tools into software development, it is first important to model the factors that affect developers' trust in, and intentions to adopt, genAI tools in practice. We developed a theoretically grounded statistical model to (1) identify factors that influence developers' trust in genAI tools and (2) examine the relationship between developers' trust, cognitive styles, and their intentions to use these tools in their work. We surveyed software developers (N=238) at two major global tech organizations, GitHub Inc. and Microsoft, and employed Partial Least Squares-Structural Equation Modeling (PLS-SEM) to evaluate our model. Our findings reveal that genAI's system/output quality, functional value, and goal maintenance significantly influence developers' trust in these tools. Furthermore, developers' trust and cognitive styles influence their intentions to use these tools in their work. We offer practical suggestions for designing genAI tools for effective use and an inclusive user experience. (An illustrative sketch of this style of structural model appears after this list.)
    Free, publicly-accessible full text available April 28, 2026
  2. Cognitive biases are hardwired behaviors that influence developer actions and can set them on an incorrect course of action, necessitating backtracking. Although researchers have found that cognitive biases occur in development tasks in controlled lab studies, we still do not know how these biases affect developers' everyday behavior. Without such an understanding, development tools and practices remain inadequate. To close this gap, we conducted a two-part field study to examine the extent to which cognitive biases occur, the consequences of these biases on developer behavior, and the practices and tools that developers use to deal with these biases. We found that about 70% of observed actions were associated with at least one cognitive bias. Even though developers recognized that biases frequently occur, they were forced to deal with such issues through ad hoc processes and suboptimal tool support. As one participant (IP12) lamented: There is no salvation!
  3. All robots create consequential sound—sound produced as a result of the robot's mechanisms—yet little work has explored how sound impacts human-robot interaction. Recent work shows that the sound of different robot mechanisms affects perceived competence, trust, human-likeness, and discomfort. However, the physical sound characteristics responsible for these perceptions have not been clearly identified. In this paper, we aim to explore key characteristics of robot sound that might influence perceptions. A pilot study from our past work showed that quieter and higher-pitched robots may be perceived as more competent and less discomforting. To better understand how variance in these attributes affects perception, we performed audio manipulations on two sets of industrial robot arm videos within a series of four new studies presented in this paper. Results confirmed that quieter robots were perceived as less discomforting. In addition, higher-pitched robots were perceived as more energetic, happy, warm, and competent. Despite the robot's industrial purpose and appearance, participants seemed to prefer more "cute" (or "kawaii") sound profiles, which could have implications for the design of more acceptable and fulfilling sound profiles for human-robot interactions with practical collaborative robots. (An illustrative sketch of such pitch and loudness manipulations appears after this list.)
  4.
    Cognitive biases are hard-wired behaviors that influence developer actions and can set them on an incorrect course of action, necessitating backtracking. While researchers have found that cognitive biases occur in development tasks in controlled lab studies, we still do not know how these biases affect developers' everyday behavior. Without such an understanding, development tools and practices remain inadequate. To close this gap, we conducted a two-part field study to examine the extent to which cognitive biases occur, the consequences of these biases on developer behavior, and the practices and tools that developers use to deal with these biases. About 70% of observed actions that were reversed were associated with at least one cognitive bias. Further, even though developers recognized that biases frequently occur, they are routinely forced to deal with such issues with ad hoc processes and suboptimal tool support. As one participant (IP12) lamented: There is no salvation!
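As referenced in entry 1, the sketch below gives a minimal, PLS-style illustration of how composite construct scores (quality, value, goal maintenance) can be related to trust and usage intention. It is a simplified stand-in only: the data are simulated, the indicator and construct names are assumptions, and it is not the authors' model, data, or results. A real analysis would use dedicated PLS-SEM software (e.g., SmartPLS or the R seminr package), which estimates outer weights iteratively rather than taking simple means.

```python
# Minimal PLS-style path sketch: composite construct scores + path regressions.
# Hypothetical data and construct names; NOT the study's actual model or results.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import scale

rng = np.random.default_rng(0)
n = 238  # same sample size as the survey; the data themselves are simulated

# Simulated Likert-style indicator blocks for each construct (assumption).
def block(name, k):
    return pd.DataFrame(rng.integers(1, 8, size=(n, k)),
                        columns=[f"{name}_{i}" for i in range(k)])

quality = block("quality", 3)   # system/output quality
value = block("value", 3)       # functional value
goal = block("goal", 3)         # goal maintenance
trust = block("trust", 3)       # trust in the genAI tool
intent = block("intent", 2)     # intention to use

# Composite scores: mean of standardized indicators (a crude stand-in for
# PLS-SEM's iteratively estimated outer weights).
def composite(df):
    return scale(df.values).mean(axis=1)

X_trust = np.column_stack([composite(quality), composite(value), composite(goal)])
y_trust = composite(trust)

# Structural paths: quality/value/goal -> trust, then trust -> intention.
trust_model = LinearRegression().fit(X_trust, y_trust)
intent_model = LinearRegression().fit(y_trust.reshape(-1, 1), composite(intent))

print("paths into trust:", dict(zip(["quality", "value", "goal"],
                                    trust_model.coef_.round(3))))
print("trust -> intention:", intent_model.coef_.round(3))
```

Because the indicators here are random, the estimated paths will be near zero; the point is only to show the shape of the model, not to reproduce any finding.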
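As referenced in entry 3, pitch and loudness manipulations of a recorded robot sound can be mimicked with standard audio tooling. The sketch below is a hedged illustration of the general idea, not the authors' stimulus pipeline: the file name is a placeholder, and the shift and attenuation values are arbitrary choices.

```python
# Illustrative pitch and loudness manipulation of a recorded robot sound.
# File name and parameter values are placeholders, not the study's stimuli.
import librosa
import soundfile as sf

y, sr = librosa.load("robot_arm.wav", sr=None)  # hypothetical recording

# Raise pitch by 4 semitones without changing duration.
y_higher = librosa.effects.pitch_shift(y, sr=sr, n_steps=4)

# Attenuate loudness by 10 dB via amplitude scaling.
gain = 10 ** (-10 / 20)
y_quieter = y * gain

sf.write("robot_arm_higher.wav", y_higher, sr)
sf.write("robot_arm_quieter.wav", y_quieter, sr)
```

In practice, the manipulated audio would be re-synchronized with the original robot video before being shown to participants; that step is omitted here.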