Title: Human-in-the-loop: Role in Cyber Physical Agricultural Systems
With increasing automation, the ‘human’ element in industrial systems is gradually being reduced, often for the sake of standardization. Complete automation, however, may not be optimal in complex, uncertain environments because of the dynamic and unstructured nature of interactions. Leveraging human perception and cognition can make automated systems more robust and sustainable. “Human-in-the-loop” (HITL) systems are systems that incorporate meaningful human interactions into the workflow. Agricultural Robotic Systems (ARS), developed for the timely detection and prevention of diseases in agricultural crops, are an example of cyber-physical systems where HITL augmentation can improve detection capabilities and system performance. Humans can apply their domain knowledge and diagnostic skills to fill the knowledge gaps present in agricultural robotics and make it more resilient to variability. Owing to the multi-agent nature of ARS, HUB-CI, a collaborative platform for optimizing interactions between agents, is emulated to direct workflow logic. The challenge remains in designing and integrating human roles and tasks into the automated loop. This article explains the development of a HITL simulation for ARS by first realistically modeling human agents and then exploring two modes by which they can be integrated into the loop: Sequential and Shared Integration. System performance metrics such as costs, number of tasks, and classification accuracy are measured and compared across collaboration protocols. The results show statistically significant advantages of HUB-CI protocols over traditional protocols for each integration mode, and the competitive factors of both modes are discussed. Strengthening the human models and expanding the range of human activities within the loop can further improve the practicality and accuracy of the simulation in replicating a HITL-ARS.
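The two integration modes can be illustrated with a minimal Monte-Carlo sketch. This is only a toy model under stated assumptions: the accuracy rates, the confidence-based escalation rule for Sequential Integration, and the fixed workload split for Shared Integration are all illustrative guesses, not the HUB-CI protocols evaluated in the article.

```python
import random

def simulate(mode, n_samples=1000, robot_acc=0.80, human_acc=0.95,
             escalate_below=0.6, human_share=0.3, seed=0):
    """Toy HITL-ARS detection loop: each sample is diseased (1) or healthy (0).

    mode="sequential": the robot inspects first; low-confidence samples are
    escalated to the human.  mode="shared": a fixed fraction of samples is
    routed directly to the human while the robot handles the rest.
    """
    rng = random.Random(seed)
    correct = human_tasks = 0
    for _ in range(n_samples):
        truth = rng.randint(0, 1)
        if mode == "sequential":
            if rng.random() < escalate_below:   # robot unsure -> escalate
                human_tasks += 1
                acc = human_acc
            else:
                acc = robot_acc
        else:  # "shared": human and robot work the sample stream in parallel
            if rng.random() < human_share:
                human_tasks += 1
                acc = human_acc
            else:
                acc = robot_acc
        label = truth if rng.random() < acc else 1 - truth
        correct += (label == truth)
    return correct / n_samples, human_tasks
```

Comparing the two modes on the same sample stream shows the basic trade-off measured in the article: Sequential Integration buys accuracy at the cost of more human tasks, while Shared Integration caps the human workload.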
Award ID(s):
1839971
NSF-PAR ID:
10297614
Author(s) / Creator(s):
Date Published:
Journal Name:
INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL
Volume:
16
Issue:
2
ISSN:
1841-9836
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Acoustofluidics, by combining acoustics and microfluidics, provides a unique means to manipulate cells and liquids for broad applications in biomedical sciences and translational medicine. However, it is challenging to standardize and maintain excellent performance of current acoustofluidic devices and systems due to a multiplicity of factors, including device-to-device variation, manual operation, environmental factors, and sample variability. Herein, to address these challenges, we propose “intelligent acoustofluidics” – an automated system that involves acoustofluidic device design, sensor fusion, and intelligent controller integration. As a proof of concept, we developed intelligent-acoustofluidics-based mini-bioreactors for human brain organoid culture. Our mini-bioreactors consist of three components: (1) rotors for contact-free rotation via an acoustic spiral phase vortex approach, (2) a camera for real-time tracking of rotational actions, and (3) a reinforcement-learning-based controller for closed-loop regulation of rotational manipulation. After training the reinforcement-learning-based controller in simulation and experimental environments, our mini-bioreactors can achieve automated rotation of rotors in well-plates. Importantly, they enable excellent control over the rotational mode, direction, and speed of the rotors, regardless of fluctuations in rotor weight, liquid volume, and operating temperature. Moreover, we demonstrated that our mini-bioreactors can stably maintain the rotational speed of organoids during long-term culture and enhance the neural differentiation and uniformity of organoids. Compared with current acoustofluidics, our intelligent system has superior performance in terms of automation, robustness, and accuracy, highlighting the potential of intelligent systems in microfluidic experimentation.
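The closed-loop, learning-based regulation idea can be sketched with a tiny tabular Q-learning controller. This is an illustrative toy, not the authors' implementation: the linear plant model, the error discretization, the reward, and every hyperparameter are assumptions chosen only to show how a learned policy can hold a rotor speed despite an unknown weight.

```python
import random

def train_speed_controller(target=60.0, episodes=300, steps=50, seed=0):
    """Tabular Q-learning sketch of closed-loop rotor-speed regulation.

    State: discretized speed error.  Actions: decrease / hold / increase
    the drive power.  A random per-episode 'rotor weight' scales how drive
    power translates into speed, mimicking sample-to-sample variability.
    """
    rng = random.Random(seed)
    actions = (-1.0, 0.0, 1.0)          # change applied to drive power
    q = {}                              # (error_bin, action_index) -> value
    alpha, gamma, eps = 0.2, 0.9, 0.1

    def bin_error(err):                 # clamp error into 11 coarse bins
        return max(-5, min(5, int(err // 5)))

    for _ in range(episodes):
        weight = rng.uniform(0.8, 1.2)  # unknown rotor weight this episode
        speed = power = 0.0
        for _ in range(steps):
            s = bin_error(target - speed)
            if rng.random() < eps:      # epsilon-greedy exploration
                a = rng.randrange(3)
            else:
                a = max(range(3), key=lambda i: q.get((s, i), 0.0))
            power = max(0.0, power + actions[a])
            speed = power * 10.0 / weight   # toy plant model
            s2 = bin_error(target - speed)
            reward = -abs(target - speed)
            best_next = max(q.get((s2, i), 0.0) for i in range(3))
            q[(s, a)] = q.get((s, a), 0.0) + alpha * (
                reward + gamma * best_next - q.get((s, a), 0.0))
    return q
```

Running the greedy policy on a rotor weight never seen in training drives the speed toward the setpoint, which is the essence of the closed-loop robustness claim above.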
  2. Rapid advances in production systems’ models and technology continually challenge manufacturers preparing for the factories of the future. To address the complexity issues that typically accompany these improvements, we have developed a brain-inspired model for production systems, HUB-CI. It is a virtual Hub for Collaborative Intelligence that receives human instructions from a human-computer interface and, in turn, commands robots via ROS. The purpose of HUB-CI is to manage diverse local information and real-time signals obtained from system agents (robots, humans, and warehouse components, e.g., carts, shelves, racks) and to globally update real-time assignments and schedules for those agents. Using Collaborative Control Theory (CCT), we first develop the protocol for collaborative requirement planning for a HUB-CI (CRP-H), through which we can synchronize the agents to work smoothly and execute rapidly changing tasks. This protocol is designed to answer: which robot(s) should perform each human-assigned task, and when should this task be performed? The two primary phases of CRP-H, CRP-I (task assignment optimization) and CRP-II (agent schedule harmonization), are developed and validated for two test scenarios: a two-robot collaboration system with five tasks, and a two-robot-and-helper-robot collaboration system with 25 tasks. Simulation results indicate that under CRP-H, both the operational cost and the makespan of the production work are significantly reduced in both scenarios.
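The two-phase structure of CRP-H can be sketched as a toy analogue. This is an assumed simplification for illustration only: Phase I below stands in for CRP-I with a naive lowest-cost assignment, and Phase II stands in for CRP-II with back-to-back sequencing; the actual CRP-H optimization and harmonization logic is not reproduced here.

```python
def crp_sketch(tasks, robots):
    """Toy two-phase analogue of CRP-H (illustrative simplification).

    tasks:  {task_name: {robot_name: processing_time}}
    robots: iterable of robot names

    Phase I (cf. CRP-I): assign each task to the robot with the lowest
    processing cost for it.  Phase II (cf. CRP-II): sequence each robot's
    tasks back-to-back and report the resulting makespan.
    """
    # Phase I: per-task lowest-cost robot.
    assignment = {t: min(costs, key=costs.get) for t, costs in tasks.items()}
    # Phase II: schedule each robot's tasks consecutively.
    finish = {r: 0.0 for r in robots}
    schedule = []
    for t, r in assignment.items():
        start = finish[r]
        finish[r] = start + tasks[t][r]
        schedule.append((t, r, start, finish[r]))
    makespan = max(finish.values())
    return assignment, schedule, makespan
```

Even this naive version exposes the quantities the abstract compares (operational cost via the chosen processing times, and makespan); the real protocol optimizes them jointly rather than greedily.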
  3. Current scientific experiments frequently involve control of specialized instruments (e.g., scanning electron microscopes), image data collection from those instruments, and transfer of the data for processing at simulation centers. This process requires a “human-in-the-loop” to perform those tasks manually, which, besides requiring substantial effort and time, can lead to inconsistencies or errors. Thus, it is essential to have an automated system capable of performing remote instrumentation to intelligently control and collect data from scientific instruments. In this paper, we propose a Remote Instrumentation Science Environment (RISE) for intelligent image analytics that provides the infrastructure to securely capture images, determine process parameters via machine learning, and provide experimental control actions via automation, under the premise of “human-on-the-loop”. The machine learning in RISE aids an iterative discovery process that helps researchers tune instrument settings to improve the outcomes of experiments. Driven by two scientific use cases of image analytics pipelines, one in material science and another in biomedical science, we show how RISE automation leverages a cutting-edge integration of cloud computing, an on-premise HPC cluster, and a Python programming interface available on a microscope. Using web services, we implement RISE to perform automated image data collection and analysis, guided by an intelligent agent that provides real-time feedback control of the microscope using the image analytics outputs. Our evaluation results show that RISE enables researchers to obtain higher image-analytics accuracy and to save precious time otherwise spent manually controlling the microscopes, while reducing errors in operating the instruments.
  4.
    An important problem in designing human-robot systems is the integration of human intent and performance into the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a c-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method for bimanual coordination modes (i.e., the directions and symmetries of right- and left-hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path-following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System’s surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, parallel, or away) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, making a c-loop, and suture pulling gestures on the da Vinci system, with results matching the expected modes. Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery.
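The idea of classifying movement direction from geometric descriptors can be sketched as follows. This is a hypothetical simplification, not the paper's classifier: the cosine-similarity threshold and the baseline-projection rule are assumptions chosen only to show how "together / parallel / away" labels can fall out of simple vector geometry.

```python
import math

def classify_bimanual_direction(left_disp, right_disp, left_to_right):
    """Toy geometric direction classifier (illustrative assumptions).

    left_disp / right_disp: 2D displacement vectors of each hand over a
    short time window.  left_to_right: vector from left hand to right hand.
    Returns 'parallel', 'together', or 'away'.
    """
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]

    def norm(a):
        return math.hypot(a[0], a[1])

    # Hands moving in nearly the same direction -> parallel motion.
    cos = dot(left_disp, right_disp) / (norm(left_disp) * norm(right_disp))
    if cos > 0.9:
        return "parallel"
    # Otherwise, project the relative motion of the right hand (w.r.t. the
    # left) onto the inter-hand baseline: positive means the gap is growing.
    rel = (right_disp[0] - left_disp[0], right_disp[1] - left_disp[1])
    return "away" if dot(rel, left_to_right) > 0 else "together"
```

An online recognizer would apply a descriptor like this over a sliding window of tracked manipulator poses, with symmetry handled by additional descriptors not sketched here.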
  5. Abstract

    There is a growing understanding that cross-sector risks faced by critical infrastructure assets in natural disasters require collaborative foresight from multiple disciplines. However, current contributions to infrastructure interdependency analysis remain centered on discipline-specific methodologies, often constrained by underlying theories and assumptions. This perspective article contributes to ongoing discussions about the uses, challenges, and opportunities provided by interdisciplinary research in critical infrastructure interdependency analysis. In doing so, several modes of integrating computational modeling with contributions from the social sciences and other disciplines are explored to advance knowledge that can improve infrastructure system resilience under extreme events. Three basic modes of method integration are identified and discussed: (a) integrating engineering models and social science research, (b) engaging communities in participative and collaborative forms of social learning and problem solving, using simulation models to facilitate synthesis, exploration, and evaluation of scenarios, and (c) developing interactive simulations where IT systems and humans act as “peers”, leveraging the capacity of distributed networked platforms and human-in-the-loop architectures to improve situational awareness, real-time decision making, and response capabilities in natural disasters. Depending on the conceptualization of the issues under investigation, these broadly defined modes of integration can coalesce to address key issues in promoting interdisciplinary research by outlining potential areas of future inquiry that would be most beneficial to the critical infrastructure protection communities.

     