Title: From HHI to HRI: Which Facets of Ethical Decision-Making Should Inform a Robot?
Robots, humanoid and otherwise, are often created with the motivation that they will either replace or complement activities performed by humans. Robots have long been designed to take over "dull, dirty, or dangerous" tasks (e.g., Singer 2009). Over time, roboticists and others in the computing community have extended their ambitions, creating technology that seeks to emulate more complex ranges of human-like behavior, potentially including the ability to participate in complicated conversations. Regardless of how sophisticated its functionality is, a robot should arguably be encoded with ethical decision-making parameters, especially if it will interact with, or could potentially endanger, a human being. Yet determining the nature and specification of such parameters raises many longstanding and difficult philosophical questions.
Award ID(s):
1848974
PAR ID:
10489991
Author(s) / Creator(s):
; ; ;
Publisher / Repository:
International Conference on Computer Ethics: Philosophical Enquiry
Date Published:
Journal Name:
International Conference on Computer Ethics: Philosophical Enquiry 2023 (CEPE 2023)
Subject(s) / Keyword(s):
Robot Ethics; Human-Robot Interaction; Folk Morality; Expert Morality
Format(s):
Medium: X
Location:
Chicago, IL
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Robots are entering various domains of human societies, opening up more opportunities for people to perceive robots as social agents. We expect that having robots in proximity will create unique social learning situations in which humans spontaneously observe and imitate robots' behaviors. At times, this imitation of robot behaviors may spread unsafe or unethical behaviors among humans. For responsible robot design, therefore, we argue that it is essential to understand the physical and psychological triggers of social learning. Grounded in the existing literature on social learning and the uncanny valley, we discuss the human-likeness of robot appearance, and the affective responses associated with it, as likely factors that either facilitate or deter social learning. We propose practical considerations for social learning and robot design.
  2. Commonsense knowledge and reasoning, vehicle automation with full as well as partial autonomy, and human-robot collaboration are growing areas of research, with much of the relevant data disseminated through the Web and IoT (Internet of Things) devices; the data itself is heterogeneous, including plain text, images, audiovisuals, hypertext, and hypermedia. With the advent of autonomous vehicles, there is a greater need to embody commonsense knowledge in their development in order to simulate subtle, intuitive aspects of human judgment. The field of robotics often involves collaborative tasks between humans and robots that enhance the respective activities and produce better results than humans or robots would achieve working by themselves. Accordingly, this article outlines and organizes some of the research in these areas, along with its Web perspectives and applications. Context for human-robot collaboration and commonsense knowledge is provided via a survey of the literature. Vehicle automation is significant among the relevant studies: its definition and methods of improvement are a focus of the article. Some work in this area has an impact on smart manufacturing. The article discusses how human-robot collaboration is beneficial, and how commonsense knowledge helps that collaboration occur in an enhanced manner. It should be of interest to various communities, e.g., AI professionals, Web developers, robotics engineers, and data scientists.
  3. Madden, John D.; Anderson, Iain A.; Shea, Herbert R. (Ed.)
    Ras Labs makes Synthetic Muscle™, a class of electroactive polymer (EAP) based materials and actuators that sense pressure (from gentle touch to high impact), controllably contract and expand at low voltage (1.5 V to 50 V, including battery operation), and attenuate force. We are in the robotics era, but robots have their challenges. Currently, robotic sensing is mainly visual, which is useful only up to the point of contact. To understand how an object is being gripped, tactile feedback is needed. For handling fragile objects, if the grip is too tight, breakage occurs; if the grip is too loose, the object slips out of the grasp, also leading to breakage. Rigid robotic grippers using a visual feedback loop can struggle to determine the exact point and quality of contact, and can also suffer a stuttering effect in the feedback loop. By using soft Synthetic Muscle™ based EAP pads as the sensors, immediate feedback was generated at the first point of contact. Because these pads provided a soft, compliant interface, the first point of contact did not apply excessive force, allowing the force applied to the object to be controlled. The EAP sensor could also detect a change in pressure location on its surface, making it possible to detect and prevent slippage by adjusting the grip strength. In other words, directional glide signaled possible slippage, so the grip could be tightened slightly, without stutter, thanks to both the feedback and the soft gentleness of the fingertip-like EAP pads themselves. The soft nature of the EAP fingertip pad also naturally held the gripped object, improving gripping quality over rigid grippers without an increase in applied force. Analogous to finger-like tactile touch, the EAPs, with appropriate coatings and electronics, were positioned as pressure sensors in the fingertip or end-effector regions of robotic grippers.
This development of Synthetic Muscle™ based EAPs as soft sensors yielded sensors that feel like the pads of human fingertips. Basic pressure position and magnitude tests have been successful, with pressure sensitivity down to 0.05 N. Most automation and robots are very strong, very fast, and usually need to be partitioned away from humans for safety reasons. For many repetitive tasks that humans perform with delicate or fragile objects, it would be beneficial to use robotics, whether for agriculture, medical surgery, therapeutic or personal care, or in extreme environments that humans cannot enter, including those with incurable contagions. Synthetic Muscle™ was also retrofitted as actuator systems into off-the-shelf robotic grippers and is being considered in novel biomimetic gripper designs operating at low voltages (less than 50 V). This offers biomimetic movement by contracting like human muscle, but also exceeds natural biological capabilities by expanding under reversed electric polarity. Human grasp is gentle yet firm, with tactile touch feedback. In conjunction with their shape-morphing abilities, these EAPs are also being explored to intrinsically sense pressure, owing to the correlation between the mechanical force applied to the EAP and its electronic signature. The robotic field is experiencing phenomenal growth in this fourth phase of the industrial revolution, the robotics era. The combination of Ras Labs' EAP shape-morphing and sensing features promises robotic grippers with human-hand-like control and tactile sensing. This work is expected to advance both robotics and prosthetics, particularly collaborative robotics, allowing humans and robots to work together intuitively, safely, and effectively.
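The slip-compensation loop that this abstract describes can be sketched in Python. This is a minimal illustration, not Ras Labs' implementation: the function names, thresholds, and the one-dimensional contact position are assumptions; only the 0.05 N sensitivity figure comes from the text.

```python
# Hypothetical sketch of a slip-compensating grip loop: the EAP pad reports a
# pressure magnitude and a contact position, a drift in position between
# readings ("directional glide") is treated as incipient slippage, and the
# grip force is nudged up gently rather than clamped hard. All thresholds
# and names are illustrative assumptions.

def adjust_grip(force, prev_pos, pos, pressure,
                contact_threshold=0.05,   # N, per the reported sensitivity
                glide_threshold=0.5,      # assumed mm of contact-point drift
                step=0.1, max_force=5.0):
    """Return an updated grip force from one EAP pad reading."""
    if pressure < contact_threshold:
        # No contact yet: keep closing gently toward the object.
        return min(force + step, max_force)
    if abs(pos - prev_pos) > glide_threshold:
        # Contact point is drifting: tighten slightly to stop the slip.
        return min(force + step, max_force)
    # Stable contact: hold the current force rather than over-squeezing.
    return force
```

The gentle fixed step, rather than a hard clamp, mirrors the abstract's point that the compliant pads let force be raised without the stutter seen in purely visual feedback loops.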
  4. Robots increasingly interact with humans through touch, where people are touching or being touched by robots. Yet, little is known about how such interactions shape a user’s experience. To inform future work in this area, we conduct a systematic review of 44 studies on physical human-robot interaction (pHRI). Our review examines the parameters of the touch (e.g., the role of touch, location), the experimental variations used by researchers, and the methods used to assess user experience. We identify five facets of user experience metrics from the questionnaire items and data recordings for pHRI studies. We highlight gaps and methodological issues in studying pHRI and compare user evaluation trends with the Human-Computer Interaction (HCI) literature. Based on the review, we propose a conceptual model of the pHRI experience. The model highlights the components of such touch experiences to guide the design and evaluation of physical interactions with robots and inform future user experience questionnaire development. 
  5. Human-Robot Collaboration (HRC) aims to create environments where robots can understand workspace dynamics and actively assist humans in operations, with human intention recognition being fundamental to efficient and safe task fulfillment. Language-based control and communication is a natural and convenient way to convey human intentions. However, traditional language models require instructions to be articulated in a rigid, predefined syntax, which can be unnatural, inefficient, and prone to errors. This paper investigates the reasoning abilities that have emerged from recent advances in Large Language Models (LLMs) to overcome these limitations, allowing natural human instructions to enhance human-robot communication. For this purpose, a generic GPT-3.5 model has been fine-tuned to interpret and translate varied human instructions into essential attributes, such as task relevancy and the tools and/or parts required for the task. These attributes are then fused with the perceived ongoing robot action to generate a sequence of relevant actions. The developed technique is evaluated in a case study where robots initially misinterpreted human actions and picked up the wrong tools and parts for assembly. It is shown that the fine-tuned LLM can effectively identify corrective actions across a diverse range of instructional human inputs, thereby enhancing the robustness of human-robot collaborative assembly for smart manufacturing.
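The interpret-then-fuse pipeline that this abstract describes can be sketched as follows. This is a rough illustration under stated assumptions: the attribute schema (task relevancy, tool, part), the rule-based fusion, and all names are invented for the sketch, and a keyword-rule stub stands in for the paper's fine-tuned GPT-3.5 model.

```python
# Illustrative sketch: a free-form human instruction is mapped to structured
# attributes, which are then fused with the robot's perceived ongoing action
# to produce a corrective action sequence. The LLM call is replaced by a
# keyword-rule stub; schema and fusion rules are assumptions.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InstructionAttributes:
    task_relevant: bool
    tool: Optional[str]
    part: Optional[str]

def interpret(instruction: str) -> InstructionAttributes:
    """Stand-in for the fine-tuned LLM (hypothetical keyword rules)."""
    text = instruction.lower()
    tool = "screwdriver" if "screwdriver" in text else None
    part = "bracket" if "bracket" in text else None
    return InstructionAttributes(bool(tool or part), tool, part)

def plan_correction(attrs: InstructionAttributes,
                    current_action: str) -> List[str]:
    """Fuse attributes with the perceived robot action into a sequence."""
    if not attrs.task_relevant:
        return [current_action]          # nothing to correct
    actions = []
    if current_action.startswith("pick"):
        actions.append("return_item")    # robot picked the wrong tool/part
    if attrs.tool:
        actions.append(f"pick_tool:{attrs.tool}")
    if attrs.part:
        actions.append(f"pick_part:{attrs.part}")
    actions.append("hand_over")
    return actions
```

For example, fusing the instruction "No, I need the screwdriver and the bracket" with a perceived "pick wrench" action would yield a return-then-repick sequence, mirroring the case study where the robot initially picks the wrong tool.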