
Title: Teleoperation Interface for sAFAM, a Solid Articulated Four Axes Microrobot
The sAFAM is a novel mm-scale microrobot built using MicroElectroMechanical Systems (MEMS) technology. It consists of a monolithically fabricated microrobotic arm assembled onto four in-plane actuators, providing four degrees of freedom: translation along the X and Y axes as well as pitch and yaw. In this paper, several design modifications are proposed to increase the movement precision, stability, and controllability of the sAFAM tip. An interface is developed to assist a human operator in accurately positioning the microrobot tip during nano-object handling. A Python-based graphical user interface (GUI) was programmed to make operation intuitive and to achieve the required tip precision under a microscope. Experimental results demonstrate the functionality of the proposed control solution and the tip motion resolution, using microscope images of the microrobot tip under 20x magnification during operation. The hardware and software requirements for the proposed experimental setup and control platform are discussed in detail.
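As an illustration of the kind of logic such a teleoperation GUI might wrap, here is a minimal, hypothetical sketch that clamps an operator's jog command to per-axis step limits before it is dispatched to the four in-plane actuators. The axis names, limits, and function names are assumptions for illustration, not details taken from the paper:

```python
# Hypothetical sketch: map an operator's jog request to a clamped,
# per-axis step count for a 4-DOF stage (X, Y, pitch, yaw).
# AXIS_LIMITS values are illustrative, not from the sAFAM design.
from dataclasses import dataclass

@dataclass
class JogCommand:
    axis: str      # "x", "y", "pitch", or "yaw"
    steps: int     # signed number of actuator steps requested

AXIS_LIMITS = {"x": 100, "y": 100, "pitch": 50, "yaw": 50}

def clamp_jog(cmd: JogCommand) -> int:
    """Clamp a requested jog to the per-axis limit so the GUI never
    commands the actuators past their assumed safe range."""
    limit = AXIS_LIMITS[cmd.axis]
    return max(-limit, min(limit, cmd.steps))
```

A GUI button or keypress handler would build a `JogCommand` and send `clamp_jog(cmd)` to the actuator driver, keeping safety limits in one place rather than scattered across event callbacks.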
Authors:
Award ID(s):
1828355 1849213
Publication Date:
NSF-PAR ID:
10310574
Journal Name:
15th International Conference on Micro- and Nanosystems (MNS)
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract Microassembly systems utilizing precision robotics have long been used for realizing three-dimensional microstructures such as microsystems and microrobots. Prior to assembly, microscale components are fabricated using micro-electromechanical-system (MEMS) technology. The microassembly system then directs a microgripper through a series of automated or human-controlled pick-and-place operations. In this paper, we describe a novel custom microassembly system, named NEXUS, that can be used to prototype MEMS microrobots. The NEXUS integrates multi-degrees-of-freedom (DOF) precision positioners, microscope computer vision, and microscale process tools such as a microgripper and vacuum tip. A semi-autonomous human–machine interface (HMI) was programmed to allow the operator to interact with the microassembly system. The NEXUS human–machine interface includes multiple functions, such as positioning, target detection, visual servoing, and inspection. The microassembly system's HMI was used by operators to assemble various three-dimensional microrobots such as the Solarpede, a novel light-powered stick-and-slip mobile microcrawler. Experimental results are reported in this paper to evaluate the system's semi-autonomous capabilities in terms of assembly rate and yield and compare them to purely teleoperated assembly performance. Results show that the semi-automated capabilities of the microassembly system's HMI offer a more consistent assembly rate of microrobot components and are less reliant on the operator's experience and skill.
  2. Microassembly systems utilizing precision robotics have long been used for realizing 3-dimensional microstructures such as microrobots. Prior to assembly, such components are fabricated using Micro-Electro-Mechanical-System (MEMS) technology. The microassembly system then directs a microgripper through automated or human-controlled pick-and-place operations. In this paper, we describe a novel custom microassembly system, named NEXUS. The NEXUS integrates multi-degree-of-freedom (DOF) precision positioners, microscope computer vision, and micro-scale process tools such as a microgripper and vacuum tip. A semi-autonomous human-machine interface (HMI) was programmed in NI LabVIEW® to allow the operator to interact with the microassembly system. The NEXUS human-machine interface includes multiple functions, such as positioning, target detection, visual servoing, and inspection. The microassembly system’s HMI was used by operators to assemble various 3-dimensional microrobots such as the Solarpede, a novel light-powered stick-and-slip mobile microcrawler. Experimental results are reported in this paper that evaluate the system’s semi-autonomous capabilities in terms of assembly rate and yield and compare them to purely teleoperated assembly performance. Results show that the semi-automated capabilities of the microassembly system’s HMI offer a more consistent assembly rate of microrobot components.
  3. In this paper, we propose a method for tracking a microrobot’s three-dimensional position using microscope machine vision. The microrobot, the Solid Articulated Four Axis Microrobot (sAFAM), is being developed to enable the assembly and manipulation of micro- and nanoscale objects. In the future, arrays of sAFAMs working together can be integrated into a wafer-scale nanofactory. Prior to use, microrobots in this microfactory need calibration, which can be achieved using the proposed measurement technique. Our approach enables faster and more accurate mapping of microrobot translations and rotations, and orders-of-magnitude larger datasets can be created through automation. Camera feeds from a custom microscopy system are fed into a data processing pipeline that enables tracking of the microrobot in real time. This machine vision method was implemented with the help of OpenCV and Python and can be used to track the movement of other micrometer-sized features. Additionally, a script was created to enable automated repeatability tests for each of the six trajectories traversable by the robot. A more precise microrobot workable area was also determined thanks to the significantly larger datasets enabled by the combined automation and machine vision approaches. Keywords: microrobotics, machine vision, nano/microscale manufacturing.
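The per-frame tracking step in such a pipeline can be illustrated with a small sketch. The abstract names OpenCV; the NumPy-only version below shows the same idea (threshold a grayscale microscope frame, then take the centroid of the bright feature) with assumed threshold values and function names, not the paper's actual pipeline:

```python
# Illustrative per-frame tracking step: threshold a grayscale frame and
# return the centroid of pixels brighter than `thresh`. Threshold and
# interface are assumptions, not the paper's implementation.
import numpy as np

def track_feature(frame: np.ndarray, thresh: int = 200):
    """Return the (row, col) centroid of pixels above `thresh`, or None
    if no pixel exceeds the threshold in this frame."""
    ys, xs = np.nonzero(frame > thresh)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```

Running this on every frame of the camera feed yields a time series of tip positions, which is what the automated repeatability script would log for each of the six trajectories.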
  4. Recent advances in precision manufacturing technology and a thorough understanding of the properties of piezoelectric materials have made it possible for researchers to develop innovative microrobotic systems, which draw more attention to the challenges of utilizing microrobots in areas that are inaccessible to ordinary robots. This review paper provides an overview of the recent advances in the application of piezoelectric materials in microrobots. The challenges of microrobots in the direction of autonomy are categorized into four sections: mechanisms, power, sensing, and control. In each section, innovative research ideas are presented to inspire researchers in their prospective microrobot designs according to specific applications. Novel mechanisms for the mobility of piezoelectric microrobots are reviewed and described. Additionally, as the piezoelectric micro-actuators require high-voltage electronics and onboard power supplies, we review ways of energy harvesting technology and lightweight micro-sensing mechanisms that contain piezoelectric devices to provide feedback, facilitating the use of control strategies to achieve the autonomous untethered movement of microrobots.
  5. In modern industrial manufacturing processes, robotic manipulators are routinely used in assembly, packaging, and material handling operations. During production, changing end-of-arm tooling is frequently necessary for process flexibility and reuse of robotic resources. In conventional operation, a tool changer is sometimes employed to load and unload end-effectors; however, the robot must be manually taught to locate the tool changers by operators via a teach pendant. During tool change teaching, the operator takes considerable effort and time to align the master and tool sides of the coupler by adjusting the motion speed of the robotic arm and observing the alignment from different viewpoints. In this paper, a custom robotic system, the NeXus, was programmed to locate and change tools automatically via an RGB-D camera. The NeXus was configured as a multi-robot system for multiple tasks including assembly, bonding, and 3D printing of sensor arrays, solar cells, and microrobot prototypes. Thus, different tools are employed by an industrial robotic arm to position grippers, printers, and other types of end-effectors in the workspace. To improve the precision and cycle time of the robotic tool change, we mounted an eye-in-hand RGB-D camera and employed visual servoing to automate the tool change process. We then compared the tool-location teaching time and cycle time of this system with those of six human operators in manual mode. We concluded that the tool location time in automated mode was, on average, more than two times lower than that of the expert human operators.