

Title: CANopen Robot Controller (CORC): An Open Software Stack for Human Robot Interaction Development
Interest in the investigation of novel software and control algorithms for wearable robotics is growing. However, entry into this field requires a significant investment in a testing platform. This work introduces CANopen Robot Controller (CORC)—an open source software stack designed to accelerate the development of robot software and control algorithms—justifying its choice of platform, describing its overall structure, and demonstrating its viability on two distinct platforms.
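
To give a concrete feel for what such a stack provides, the sketch below shows a fixed-rate, state-machine-driven control loop of the kind typically used to drive a rehabilitation or wearable robot over a fieldbus such as CANopen. It is a minimal illustration only; the class and method names (State, ControlLoop, read_sensors, write_commands) are hypothetical and are not CORC's actual API.

```python
# Hypothetical sketch of a state-machine-driven robot control loop.
# Class and method names are illustrative only and are not CORC's actual API.
import time


class State:
    """One controller mode, e.g. calibration, standing, or assistance."""

    def enter(self, robot):
        pass

    def during(self, robot, dt):
        pass

    def exit(self, robot):
        pass

    def next_state(self, robot):
        """Return a new State to switch to, or None to stay in this one."""
        return None


class ControlLoop:
    def __init__(self, robot, initial_state, rate_hz=500.0):
        self.robot = robot
        self.state = initial_state
        self.dt = 1.0 / rate_hz

    def run(self, duration_s):
        self.state.enter(self.robot)
        for _ in range(int(duration_s / self.dt)):
            self.robot.read_sensors()       # e.g. joint encoders read over the CAN bus
            self.state.during(self.robot, self.dt)
            self.robot.write_commands()     # e.g. torque setpoints written over the CAN bus
            nxt = self.state.next_state(self.robot)
            if nxt is not None:
                self.state.exit(self.robot)
                self.state = nxt
                self.state.enter(self.robot)
            time.sleep(self.dt)             # a real stack would use a real-time scheduler
```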
Award ID(s):
2024488
NSF-PAR ID:
10284906
Author(s) / Creator(s):
Date Published:
Journal Name:
WeRob 2020: Wearable Robotics: Challenges and Trends
Volume:
27
Issue:
2022
Page Range / eLocation ID:
287-292
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract As artificial intelligence and industrial automation are developing, human–robot collaboration (HRC) with advanced interaction capabilities has become an increasingly significant area of research. In this paper, we design and develop a real-time, multi-modal HRC system using speech and gestures. A set of 16 dynamic gestures is designed for communication from a human to an industrial robot. A data set of dynamic gestures is designed and constructed, and it will be shared with the community. A convolutional neural network is developed to recognize the dynamic gestures in real time using the motion history image and deep learning methods. An improved open-source speech recognizer is used for real-time speech recognition of the human worker. An integration strategy is proposed to integrate the gesture and speech recognition results, and a software interface is designed for system visualization. A multi-threading architecture is constructed for simultaneously operating multiple tasks, including gesture and speech data collection and recognition, data integration, robot control, and software interface operation. The various methods and algorithms are integrated to develop the HRC system, with a platform constructed to demonstrate the system performance. The experimental results validate the feasibility and effectiveness of the proposed algorithms and the HRC system.
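
To illustrate the motion history image (MHI) representation mentioned in this abstract, the sketch below accumulates recent frame-to-frame motion into a single grayscale image that a CNN could then classify. This is a generic illustration, not the authors' code; the threshold and time-window values are assumptions.

```python
# Generic illustration (not the authors' code) of a motion history image (MHI):
# each pixel stores how recently motion occurred there, so a whole dynamic
# gesture is summarized in one grayscale image that a CNN can classify.
import numpy as np

MHI_DURATION = 1.0    # seconds of motion history to keep (assumed value)
DIFF_THRESHOLD = 32   # per-pixel intensity change counted as motion (assumed value)


def update_mhi(mhi, prev_gray, curr_gray, timestamp):
    """Update the MHI in place from two consecutive grayscale frames."""
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    mhi[diff > DIFF_THRESHOLD] = timestamp          # stamp pixels that just moved
    mhi[(timestamp - mhi) > MHI_DURATION] = 0.0     # forget motion older than the window
    return mhi


def mhi_to_cnn_input(mhi, timestamp):
    """Normalize the MHI to [0, 1] so it can be fed to the gesture classifier."""
    scaled = (mhi - (timestamp - MHI_DURATION)) / MHI_DURATION
    return np.clip(scaled, 0.0, 1.0)


# Usage sketch: mhi = np.zeros(frame.shape, dtype=np.float32), then call
# update_mhi(...) once per video frame and classify mhi_to_cnn_input(...) at the end.
```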
  2. The Cassie bipedal robot designed by Agility Robotics is providing academics with a common platform for sharing and comparing algorithms for locomotion, perception, and navigation. This paper focuses on feedback control for standing and walking using the methods of virtual constraints and gait libraries. The designed controller was implemented six weeks after the robot arrived at the University of Michigan and allowed it to stand in place as well as walk over sidewalks, grass, snow, sand, and burning brush. The controller for standing also enables the robot to ride a Segway. Software supporting the work in this paper is available on GitHub. 
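
As a rough sketch of the gait-library idea (my own simplification, not the authors' implementation), the snippet below stores Bezier coefficients for the controlled outputs at a grid of walking speeds and, online, interpolates them at the measured speed before evaluating the desired outputs at the current gait phase.

```python
# Rough sketch of a gait library (my own simplification, not the authors' code):
# Bezier coefficients of the controlled outputs are stored at a few walking
# speeds and linearly interpolated online at the robot's measured speed.
import numpy as np
from math import comb


def bezier(alpha, s):
    """Evaluate Bezier curves with coefficients alpha (outputs x order+1) at phase s in [0, 1]."""
    m = alpha.shape[1] - 1
    basis = np.array([comb(m, k) * s**k * (1.0 - s)**(m - k) for k in range(m + 1)])
    return alpha @ basis


def desired_outputs(library_speeds, library_alphas, speed, phase):
    """Interpolate the library at the measured speed, then evaluate at the gait phase."""
    speeds = np.asarray(library_speeds, dtype=float)
    v = np.clip(speed, speeds[0], speeds[-1])
    i = int(np.clip(np.searchsorted(speeds, v) - 1, 0, len(speeds) - 2))
    w = (v - speeds[i]) / (speeds[i + 1] - speeds[i])
    alpha = (1.0 - w) * library_alphas[i] + w * library_alphas[i + 1]
    return bezier(alpha, phase)   # desired outputs enforced by the virtual constraints
```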
  3. Abstract

    Robot-assisted healthcare could help alleviate the shortage of nursing staff in hospitals and is a potential solution for safe patient handling and mobility. In an attempt to off-load some of the physically demanding tasks and automate mundane duties of overburdened nurses, we have developed the Adaptive Robotic Nursing Assistant (ARNA), a custom-built omnidirectional mobile platform with a 6-DoF robotic manipulator and a force-sensitive walking handlebar. In this paper, we present a robot-specific neuroadaptive controller (NAC) for ARNA’s mobile base that employs online learning to estimate the robot’s unknown dynamic model and nonlinearities. This control scheme relies on an inner-loop torque controller and features convergence with Lyapunov stability guarantees. The NAC forces the robot to emulate a mechanical system with prescribed admittance characteristics during patient walking exercises and bed-moving tasks. The proposed admittance controller is implemented on a model of the robot in a Gazebo-ROS simulation environment, and its effectiveness is investigated in terms of online learning of the robot dynamics as well as sensitivity to payload variations.
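
As a simplified illustration of the prescribed-admittance behaviour described above (a sketch of the general technique, not ARNA's controller), the outer loop below maps the measured handlebar force to a base velocity command by simulating a virtual mass-damper model M_d dv/dt + B_d v = F; the neuroadaptive inner loop that makes the physical robot track this command is omitted.

```python
# Simplified sketch of a prescribed-admittance outer loop (a generic admittance
# controller, not ARNA's neuroadaptive controller): the handlebar force drives a
# virtual mass-damper, M_d * dv/dt + B_d * v = F, and the resulting velocity is
# the command handed to the inner-loop torque controller.
import numpy as np


class AdmittanceController:
    def __init__(self, virtual_mass, virtual_damping, dt):
        self.M = np.asarray(virtual_mass, dtype=float)     # prescribed inertia per axis (x, y, yaw)
        self.B = np.asarray(virtual_damping, dtype=float)  # prescribed damping per axis
        self.dt = dt
        self.v = np.zeros_like(self.M)                     # commanded base velocity

    def step(self, handle_force):
        """One control tick: integrate the virtual dynamics and return the velocity command."""
        f = np.asarray(handle_force, dtype=float)
        accel = (f - self.B * self.v) / self.M
        self.v = self.v + accel * self.dt
        return self.v


# Example (values are assumptions): a heavily damped feel for walking exercises.
ctrl = AdmittanceController(virtual_mass=[60.0, 60.0, 20.0],
                            virtual_damping=[90.0, 90.0, 30.0],
                            dt=0.005)
v_cmd = ctrl.step(handle_force=[25.0, 0.0, 0.0])   # gentle forward push on the handlebar
```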

     
  4. Abstract

    Traditional hard robots often require complex motion-control systems to accomplish various tasks, while applications of soft-bodied robots are limited by their low load-carrying capability. Herein, a hybrid tensegrity robot composed of both hard and soft materials is constructed, mimicking the musculoskeletal system of animals. With liquid crystal elastomer–carbon nanotube composites employed as artificial muscles, the tensegrity robot is shown to be extremely deformable, and its multidirectional locomotion can be powered entirely by light. The tensegrity robot is ultralight and highly scalable, has a high load capacity, and can be precisely controlled to move along different paths over a variety of terrains. In addition, the robot shows excellent resilience, deployability, and impact-mitigation capability, making it an ideal robotic platform for a wide range of applications.

     
  5. The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis, by immersing the user in the 3D morphology, is further enhanced by the advent of wireless head-mounted displays (HMDs). Such information-immersive capabilities may also enhance the planning and visualization of interventional procedures. To this end, we introduce a computational platform that generates an augmented reality holographic scene fusing pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot for performing MRI-guided, robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures using voice and gestures, as well as to control the robot. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested in silico with a HoloLens HMD. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operation studies demonstrated the functionality of the platform and underscored the importance of customizing the interface to a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm. Index Terms: augmented reality, robot assistance, image-guided interventions.
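
As a hypothetical example of a forbidden-region virtual fixture check (not the platform's actual implementation), the sketch below approximates a segmented vital structure by a set of surface points and flags any tool-tip position that comes within a safety margin of it.

```python
# Hypothetical sketch of a forbidden-region virtual fixture (not the platform's
# code): the segmented vital structure is approximated by surface points, and the
# operator is alerted whenever the tool tip comes within a safety margin of it.
import numpy as np


def check_forbidden_region(tip_position, structure_points, margin_mm=3.0):
    """Return (violated, min_distance_mm) for a tool-tip position.

    tip_position:     (3,) tool-tip coordinates in the image frame, in mm
    structure_points: (N, 3) surface points of the segmented structure, in mm
    margin_mm:        safety margin around the structure (assumed value)
    """
    dists = np.linalg.norm(structure_points - np.asarray(tip_position, dtype=float), axis=1)
    d_min = float(dists.min())
    return d_min < margin_mm, d_min


# Usage sketch: call once per tracking update and trigger a visual/audio alert
# in the holographic scene when the first return value is True.
violated, d_min = check_forbidden_region([12.0, -4.5, 30.0],
                                          np.random.rand(500, 3) * 100.0)  # dummy point cloud
```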