Title: Design and Evaluation of Human-Machine Interface for NEXUS: A Custom Microassembly System
Microassembly systems utilizing precision robotics have long been used for realizing 3-dimensional microstructures such as microrobots. Prior to assembly, such components are fabricated using Micro-Electro-Mechanical-System (MEMS) technology. The microassembly system then directs a microgripper through automated or human-controlled pick-and-place operations. In this paper, we describe a novel custom microassembly system, named NEXUS. The NEXUS integrates multi-degree-of-freedom (DOF) precision positioners, microscope computer vision, and micro-scale process tools such as a microgripper and vacuum tip. A semi-autonomous human-machine interface (HMI) programmed in NI LabVIEW® allows the operator to interact with the microassembly system. The NEXUS human-machine interface includes multiple functions, such as positioning, target detection, visual servoing, and inspection. The microassembly system's HMI was used by operators to assemble various 3-dimensional microrobots such as the Solarpede, a novel light-powered stick-and-slip mobile microcrawler. Experimental results are reported in this paper that evaluate the system's semi-autonomous capabilities in terms of assembly rate and yield and compare them to purely teleoperated assembly performance. Results show that the semi-automated capabilities of the microassembly system's HMI offer a more consistent assembly rate of microrobot components.
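The visual-servoing function described above can be illustrated with a minimal image-based servo step: an image-space error between the detected target and the gripper tip is converted into a stage correction. This is only a sketch in Python for illustration — the NEXUS HMI itself is implemented in NI LabVIEW®, and the function name, gain, and calibration values below are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of one image-based visual-servoing step for a
# pick-and-place microassembly loop. All names and numbers (servo_step,
# gain, pixels_per_um) are illustrative assumptions, not the NEXUS API.
import numpy as np

def servo_step(target_px, gripper_px, pixels_per_um, gain=0.5):
    """Convert an image-space error (pixels) into a stage correction (um)."""
    error_px = np.asarray(target_px, float) - np.asarray(gripper_px, float)
    # Proportional control: scale the pixel error into micrometers.
    return gain * error_px / pixels_per_um

# Example: target detected at (320, 240), gripper tip at (300, 250),
# with a camera calibration of 2 px per micrometer.
move = servo_step((320, 240), (300, 250), pixels_per_um=2.0)
```

In a real loop this correction would be sent to the precision positioners and the step repeated until the error falls below a tolerance.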
Award ID(s):
1828355 1734383
NSF-PAR ID:
10310565
Journal Name:
14th International Conference on Micro- and Nanosystems (MNS)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Abstract Microassembly systems utilizing precision robotics have long been used for realizing three-dimensional microstructures such as microsystems and microrobots. Prior to assembly, microscale components are fabricated using micro-electromechanical-system (MEMS) technology. The microassembly system then directs a microgripper through a series of automated or human-controlled pick-and-place operations. In this paper, we describe a novel custom microassembly system, named NEXUS, that can be used to prototype MEMS microrobots. The NEXUS integrates multi-degrees-of-freedom (DOF) precision positioners, microscope computer vision, and microscale process tools such as a microgripper and vacuum tip. A semi-autonomous human–machine interface (HMI) was programmed to allow the operator to interact with the microassembly system. The NEXUS human–machine interface includes multiple functions, such as positioning, target detection, visual servoing, and inspection. The microassembly system's HMI was used by operators to assemble various three-dimensional microrobots such as the Solarpede, a novel light-powered stick-and-slip mobile microcrawler. Experimental results are reported in this paper to evaluate the system's semi-autonomous capabilities in terms of assembly rate and yield and compare them to purely teleoperated assembly performance. Results show that the semi-automated capabilities of the microassembly system's HMI offer a more consistent assembly rate of microrobot components and are less reliant on the operator's experience and skill. 
  2. In this paper, we propose a method for tracking a microrobot’s three-dimensional position using microscope machine vision. The microrobot, the Solid Articulated Four Axis Microrobot (sAFAM), is being developed to enable the assembly and manipulation of micro- and nanoscale objects. In the future, arrays of sAFAMs working together can be integrated into a wafer-scale nanofactory. Prior to use, microrobots in this microfactory need calibration, which can be achieved using the proposed measurement technique. Our approach enables faster and more accurate mapping of microrobot translations and rotations, and orders-of-magnitude larger datasets can be created by automation. Camera feeds from a custom microscopy system are fed into a data processing pipeline that enables tracking of the microrobot in real time. This particular machine vision method was implemented with the help of OpenCV and Python and can be used to track the movement of other micrometer-sized features. Additionally, a script was created to enable automated repeatability tests for each of the six trajectories traversable by the robot. A more precise microrobot workable area was also determined thanks to the significantly larger datasets enabled by the combined automation and machine vision approaches. Keywords: microrobotics, machine vision, micro/nanoscale manufacturing.
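The frame-by-frame tracking described in this record can be sketched with a minimal vision primitive: threshold a grayscale frame and take the centroid of the bright feature. This is a NumPy-only illustration in the spirit of the OpenCV/Python pipeline the abstract mentions; the function name and threshold are assumptions, not the authors' code.

```python
# Minimal sketch of feature tracking by intensity thresholding and
# centroid extraction. Names (track_centroid) and the threshold value
# are illustrative only; the actual pipeline uses OpenCV.
import numpy as np

def track_centroid(frame, threshold=128):
    """Return the (row, col) centroid of bright pixels, or None if absent."""
    mask = frame > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Example: a synthetic 8-bit frame with one bright 2x2 feature.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:102, 200:202] = 255
pos = track_centroid(frame)  # centroid of the bright blob
```

Running the same extraction on every frame of a camera feed yields the position-versus-time data needed for the repeatability tests the abstract describes.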
  3. Abstract

    The fabrication of three-dimensional (3D) microscale structures is critical for many applications, including strong and lightweight material development, medical device fabrication, microrobotics, and photonic applications. While 3D microfabrication has seen progress over the past decades, complex multicomponent integration with small or hierarchical feature sizes is still a challenge. In this study, an optical positioning and linking (OPAL) platform based on optical tweezers is used to precisely fabricate 3D microstructures from two types of micron-scale building blocks linked by biochemical interactions. A computer-controlled interface with rapid on-the-fly automated recalibration routines maintains accuracy even after placing many building blocks. OPAL achieves a 60-nm positional accuracy by optimizing the molecular functionalization and laser power. A two-component structure consisting of 448 1-µm building blocks is assembled, representing the largest number of building blocks used to date in 3D optical tweezer microassembly. Although optical tweezers have previously been used for microfabrication, those results were generally restricted to single-material structures composed of a relatively small number of larger-sized building blocks, with little discussion of critical process parameters. It is anticipated that OPAL will enable the assembly, augmentation, and repair of microstructures composed of specialty micro/nanomaterial building blocks to be used in new photonic, microfluidic, and biomedical devices.

  4. This paper presents our work over the last decade in developing functional microrobotic systems, which include wireless actuation of microrobots to traverse complex surfaces, addition of sensing capabilities, and independent actuation of swarms of microrobots. We will discuss our work on the design, fabrication, and testing of a number of different mobile microrobots that are able to achieve these goals. These microrobots include the microscale magnetostrictive asymmetric bimorph microrobot (μMAB), our first attempt at magnetic actuation at the microscale; the microscale tumbling microrobot (μTUM), our microrobot capable of traversing complex surfaces in both wet and dry conditions; and the micro-force-sensing magnetic microrobot (μFSMM), which is capable of real-time micro-force sensing feedback to the user as well as intuitive wireless actuation. Additionally, we will present our latest results on using local magnetic field actuation for independent control of multiple microrobots in the same workspace for microassembly tasks.
  5. Self-driving vehicles are the latest innovation in improving personal mobility and road safety by removing arguably error-prone humans from driving-related tasks. Such advances can prove especially beneficial for people who are blind or have low vision, who cannot legally operate conventional motor vehicles. Missing from the related literature, we argue, are studies that describe strategies for vehicle design for these persons. We present a case study of the participatory design of a prototype for a self-driving vehicle human-machine interface (HMI) for a graduate-level course on inclusive design and accessible technology. We reflect on the process of working alongside a co-designer, a person with a visual disability, to identify user needs, define design ideas, and produce a low-fidelity prototype for the HMI. This paper may benefit researchers interested in using a similar approach for designing accessible autonomous vehicle technology.
    INTRODUCTION The rise of autonomous vehicles (AVs) may prove to be one of the most significant innovations in personal mobility of the past century. Advances in automated vehicle technology and advanced driver assistance systems (ADAS) specifically, may have a significant impact on road safety and a reduction in vehicle accidents (Brinkley et al., 2017; Dearen, 2018). According to the Department of Transportation (DoT), automated vehicles could help reduce road accidents caused by human error by as much as 94% (SAE International, n.d.). In addition to reducing traffic accidents and saving lives and property, autonomous vehicles may also prove to be of significant value to persons who cannot otherwise operate conventional motor vehicles. AVs may provide the necessary mobility, for instance, to help create new employment opportunities for nearly 40 million Americans with disabilities (Claypool et al., 2017; Guiding Eyes for the Blind, 2019).
Advocates for the visually impaired specifically have expressed how “transformative” this technology can be for those who are blind or have significant low vision (Winter, 2015); persons who cannot otherwise legally operate a motor vehicle. While autonomous vehicles have the potential to break down transportation 