Search for: All records

Creators/Authors contains: "Zhu, Haiyi"

Note: Clicking a Digital Object Identifier (DOI) link will take you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

  1. Many AI system designers grapple with how best to collect human input for different types of training data. Online crowds provide a cheap, on-demand source of intelligence, but they often lack the expertise required in many domains. Experts offer tacit knowledge and more nuanced input, but they are harder to recruit. To explore this trade-off, we compared novices and experts in terms of performance and perceptions on human intelligence tasks in the context of designing a text-based conversational agent. We developed a preliminary chatbot that simulates conversations with someone seeking mental health advice, to help educate volunteer listeners at 7cups.com. We then recruited experienced listeners (domain experts) and novice MTurk workers (crowd workers) to conduct tasks of varying complexity aimed at improving the chatbot. Novice crowds performed comparably to experts on tasks that only require natural language understanding, such as correcting how the system classifies a user statement. For more generative tasks, like creating new lines of chatbot dialogue, the experts demonstrated higher quality, novelty, and emotion. We also uncovered a motivational gap: crowd workers enjoyed the interactive tasks, while experts found the work tedious and repetitive. We offer design considerations for allocating crowd workers and experts to input tasks for AI systems, and for better motivating experts to participate in low-level data work for AI.
    Free, publicly-accessible full text available October 14, 2023
  2. Free, publicly-accessible full text available June 20, 2023
  3. Free, publicly-accessible full text available June 20, 2023
  4. Free, publicly-accessible full text available April 29, 2023
  5. Free, publicly-accessible full text available June 20, 2023
  6. Free, publicly-accessible full text available June 20, 2023
  7. Free, publicly-accessible full text available June 13, 2023
  8. Free, publicly-accessible full text available April 27, 2023
  9. Free, publicly-accessible full text available April 27, 2023