Title: Set-Based State Estimation of Mobile Robots from Coarse Range Measurements
This paper proposes a localization algorithm for an autonomous mobile robot equipped with binary proximity sensors that indicate only whether the robot is within a fixed distance of beacons installed at known positions. Our algorithm leverages an ellipsoidal Set Membership State Estimator (SMSE) that maintains an ellipsoidal bound on the robot's position and velocity states. The estimate incorporates knowledge of the robot's dynamics, bounds on environmental disturbances, and the binary sensor readings. The localization algorithm is motivated by underwater scenarios in which accurate range or bearing measurements are often unavailable. We demonstrate our approach on an experimental platform using an autonomous blimp. An illustrative sketch of this style of ellipsoidal update is given after the record details below.
Award ID(s):
1849228 1828678 1934836
PAR ID:
10212084
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of 4th IEEE Conference on Control Technology and Applications
Page Range / eLocation ID:
404 to 409
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
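The abstract above states that the estimator maintains an ellipsoidal bound on the robot's state using its dynamics, bounded disturbances, and binary sensor readings, but gives no equations. The Python sketch below is a rough, generic illustration of that style of update, not the paper's algorithm: it propagates a position-only ellipsoid through linear dynamics using a standard Minkowski-sum outer bound and, when a proximity sensor fires, tightens it by intersecting with the beacon-centered ball via a convex-combination bound. The position-only simplification, the function names, and the parameters `p` and `lam` are assumptions made for illustration.

```python
import numpy as np

def predict(c, P, A, Q, p=1.0):
    """Propagate the ellipsoid E(c, P) through x+ = A x + w, with w in E(0, Q).
    Uses a standard outer bound on the Minkowski sum of two ellipsoids; p > 0 is free."""
    c_next = A @ c
    P_next = (1 + 1 / p) * A @ P @ A.T + (1 + p) * Q
    return c_next, P_next

def intersect(c1, P1, c2, P2, lam=0.5):
    """Outer ellipsoid for the intersection E(c1, P1) ∩ E(c2, P2), convex-combination bound."""
    X = lam * np.linalg.inv(P1) + (1 - lam) * np.linalg.inv(P2)
    m = lam * np.linalg.solve(P1, c1) + (1 - lam) * np.linalg.solve(P2, c2)
    c = np.linalg.solve(X, m)
    k = (1.0
         - lam * c1 @ np.linalg.solve(P1, c1)
         - (1 - lam) * c2 @ np.linalg.solve(P2, c2)
         + c @ (X @ c))
    if k <= 0:                      # bound is empty/degenerate; caller keeps the prior set
        return None
    return c, k * np.linalg.inv(X)

def binary_update(c, P, beacon, radius, lam=0.5):
    """Binary proximity reading: robot is within `radius` of `beacon`,
    i.e. the position lies in a ball, modeled here as an ellipsoid."""
    ball_shape = radius ** 2 * np.eye(len(c))
    out = intersect(c, P, beacon, ball_shape, lam)
    return (c, P) if out is None else out

# Example: 2-D position-only estimate with simple integrator dynamics.
A = np.eye(2)
Q = 0.05 * np.eye(2)                 # bounded process disturbance
c, P = np.zeros(2), 4.0 * np.eye(2)  # initial position set: 2 m radius ball
c, P = predict(c, P, A, Q)
c, P = binary_update(c, P, beacon=np.array([1.0, 0.0]), radius=1.5)
```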
More Like this
  1. This paper proposes a nudged particle filter for estimating the pose of a camera mounted on a flying robot collecting a video sequence. The nudged particle filter leverages two neural networks, image-to-pose and pose-to-image, trained in an auto-encoder fashion on a dataset of pose-labeled images. Given an image, the camera pose retrieved by the image-to-pose network serves as a special particle that nudges the set of particles generated by the particle filter, while the pose-to-image network is used to compute the likelihood of each particle. We demonstrate that this nudging scheme effectively mitigates low-likelihood samples during the particle propagation step. Ellipsoidal confidence tubes are constructed from the set of particles to provide a computationally efficient bound on localization error. When an ellipsoidal tube self-intersects, the probability volume of the intersection can be significantly shrunk using a novel Dempster–Shafer probability mass assignment algorithm. Starting from the intersection, a loop closure procedure is developed to move backward in time and shrink the volumes of the entire ellipsoidal tube. Experimental results using the Georgia Tech Miniature Autonomous Blimp platform demonstrate the feasibility and effectiveness of the proposed algorithms in providing localization and pose estimation based on monocular vision.
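The abstract above describes the nudging idea: one pose hypothesis comes from the image-to-pose network, and particle weights come from comparing the observed image with images rendered by the pose-to-image network. The Python sketch below illustrates a single filter step in that spirit; the network interfaces, nudge fraction, jitter scale, and multinomial resampling are assumptions, and the paper's ellipsoidal confidence tubes and Dempster–Shafer steps are not shown.

```python
import numpy as np

def nudged_pf_step(particles, image, image_to_pose, pose_to_image,
                   motion_model, likelihood, nudge_frac=0.1, rng=None):
    """One step of a nudged particle filter (illustrative sketch).

    particles      : (N, d) array of pose hypotheses
    image_to_pose  : image -> pose estimate (the "nudge")
    pose_to_image  : pose  -> predicted image, used to score particles
    motion_model   : (particles, rng) -> propagated particles
    likelihood     : (observed image, predicted image) -> positive scalar
    """
    rng = rng or np.random.default_rng()
    N = len(particles)

    # 1. Propagate all particles through the motion model.
    particles = motion_model(particles, rng)

    # 2. Nudge: replace a fraction of particles with the network's pose estimate
    #    (plus small jitter) so the set is not starved of high-likelihood samples.
    nudge_pose = image_to_pose(image)
    k = max(1, int(nudge_frac * N))
    idx = rng.choice(N, size=k, replace=False)
    particles[idx] = nudge_pose + 0.01 * rng.standard_normal((k, particles.shape[1]))

    # 3. Weight every particle by comparing the observed image with the image
    #    rendered from that particle's pose.
    weights = np.array([likelihood(image, pose_to_image(p)) for p in particles])
    weights = weights / weights.sum()

    # 4. Resample (systematic resampling would be typical; multinomial for brevity).
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

# Example wiring with stand-in models (for illustration only):
rng = np.random.default_rng(0)
particles = rng.standard_normal((200, 3))            # (x, y, yaw) hypotheses
fake_img = np.zeros((8, 8))
particles, weights = nudged_pf_step(
    particles, fake_img,
    image_to_pose=lambda img: np.zeros(3),
    pose_to_image=lambda p: np.zeros((8, 8)),
    motion_model=lambda ps, r: ps + 0.02 * r.standard_normal(ps.shape),
    likelihood=lambda obs, pred: np.exp(-np.mean((obs - pred) ** 2)))
```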
  2. The Georgia Tech Miniature Autonomous Blimp (GT-MAB) needs localization algorithms to navigate to waypoints in an indoor environment without relying on an external motion capture system. Indoor aerial robots often require a motion capture system for localization or employ simultaneous localization and mapping (SLAM) algorithms for navigation. The proposed localization strategy for the GT-MAB uses only lightweight sensors suitable for a weight-constrained platform. We train an end-to-end convolutional neural network (CNN) that predicts the horizontal position and heading of the GT-MAB from video collected by an onboard monocular RGB camera, while the height of the GT-MAB is estimated from measurements of a time-of-flight (ToF) single-beam laser sensor. The monocular camera and the single-beam laser sensor are sufficient for the algorithm to localize the GT-MAB in real time, achieving average 3D positioning errors of less than 20 cm and average heading errors of less than 3 degrees. With this accuracy, we are able to use simple proportional-integral-derivative controllers to control the GT-MAB for waypoint navigation. Experimental results on waypoint following are provided, demonstrating the use of a CNN as the primary localization method for estimating the pose of an indoor robot and successfully enabling navigation to specified waypoints.
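The abstract above outlines the pipeline: a CNN provides horizontal position and heading from camera frames, a single-beam ToF sensor provides height, and simple PID controllers close the loop for waypoint navigation. The sketch below only wires these pieces together in a generic way; the `cnn` callable, the gains, and the time step are placeholders, and the GT-MAB's actual actuation and control allocation are not modeled.

```python
import numpy as np

class PID:
    """Minimal per-axis PID controller; gains and time step are placeholders."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def estimate_pose(frame, tof_height_m, cnn):
    """Fuse the CNN's (x, y, heading) prediction for a camera frame with the
    time-of-flight height measurement into a full pose estimate."""
    x, y, heading = cnn(frame)      # placeholder callable standing in for the network
    return np.array([x, y, tof_height_m]), heading

def waypoint_command(position, waypoint, pids):
    """One control step: per-axis PID on the position error toward the waypoint."""
    return np.array([pid.step(e) for pid, e in zip(pids, waypoint - position)])

# Example wiring (gains are arbitrary):
pids = [PID(kp=0.8, ki=0.0, kd=0.2, dt=0.05) for _ in range(3)]
```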
  3. Simultaneous Localization and Mapping (SLAM) is an autonomous localization technique used by mobile robots operating without GPS. Since autonomous localization relies on pre-existing maps, using SLAM with the Robot Operating System (ROS) requires that a map of the surroundings first be created, after which a controller can use that initial map. This first mapping procedure is usually carried out manually: the person operating the robot is responsible for avoiding obstacles and moving the robot through different sections of the space to build a full map of the environment. Done manually, the mapping process is time-consuming and often not feasible. To remove this constraint, that is, to construct a map of the environment autonomously and without human involvement while avoiding obstacles, this study implements the Vector Field Histogram (VFH) technique and integrates it with SLAM. VFH is a real-time motion planning approach in robotics that uses a statistical representation of the robot's surroundings, known as the histogram grid, and places strong emphasis on handling modeling errors and sensor uncertainty. Using range sensor values, the VFH algorithm determines obstacle-free driving directions for the robot. Beyond its real-time obstacle avoidance function, the VFH method is enhanced in this study to work with SLAM to create maps and reduce localization complexity. While generating maps, the VFH approach uses a two-step data-reduction procedure to compute the appropriate vehicle control directives, as sketched in the code below. In the first stage, the robot's current location is used to generate a one-dimensional polar histogram; each sector of the histogram holds a value representing the polar obstacle density in that direction. In the second stage, the robot's steering is oriented toward the most suitable sector, which the algorithm selects from among all sectors with low polar obstacle density. The map produced by VFH can then be used by further algorithms, such as Rapidly-exploring Random Tree (RRT) and A*, to plan autonomous paths. To put the concept into practice, MATLAB and ROS are used together to autonomously and simultaneously map the environment and localize the robot; their combination is advantageous because of their extensive feature sets and their ability to integrate with each other. Finally, a simulation and a real robot are used to analyze and validate the study's findings.
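The paragraph above describes VFH's two-stage data reduction: build a one-dimensional polar histogram of obstacle density around the robot's current location, then steer toward the low-density sector closest to the desired direction. The Python sketch below illustrates that reduction in its simplest form; the sector count, density measure, and threshold are assumptions, and the study's SLAM integration, map building, and downstream RRT/A* planning are not shown.

```python
import numpy as np

def vfh_steering(scan_angles, scan_ranges, goal_bearing,
                 n_sectors=72, max_range=4.0, threshold=0.5):
    """Choose a steering direction from a range scan, VFH-style.

    scan_angles  : bearings of the range readings (rad, robot frame)
    scan_ranges  : measured distances (same length as scan_angles)
    goal_bearing : desired direction toward the goal (rad)
    Returns the chosen bearing, or None if every sector is blocked.
    """
    sector_width = 2 * np.pi / n_sectors

    # Stage 1: reduce the scan to a 1-D polar histogram; each sector accumulates
    # an obstacle density that grows as detected obstacles get closer.
    hist = np.zeros(n_sectors)
    for ang, rng in zip(scan_angles, scan_ranges):
        if rng >= max_range:
            continue                                    # no obstacle within range
        sector = int((ang % (2 * np.pi)) / sector_width) % n_sectors
        hist[sector] += (max_range - rng) / max_range   # closer obstacle -> higher density

    # Stage 2: among low-density sectors, steer toward the one whose center
    # bearing is closest to the goal direction.
    free = np.flatnonzero(hist < threshold)
    if free.size == 0:
        return None
    centers = (free + 0.5) * sector_width
    angular_err = np.angle(np.exp(1j * (centers - goal_bearing)))  # wrapped to (-pi, pi]
    return centers[np.argmin(np.abs(angular_err))]
```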
  4. Monitoring localization safety will be necessary to certify the performance of robots that operate in life-critical applications, such as autonomous passenger vehicles or delivery drones, because many current localization safety methods do not account for the risk of undetected sensor faults. One type of fault, misassociation, occurs when a feature extracted from a mapped landmark is associated with a non-corresponding landmark; it is a common source of error in feature-based navigation applications. This paper accounts for the probability of misassociation when quantifying landmark-based mobile robot localization safety for fixed-lag smoothing estimators. We derive a mobile robot localization safety bound and evaluate it using simulations and experimental data in an urban environment. Results show that localization safety suffers both when landmark density is relatively low, so that there are not enough landmarks to localize adequately, and when landmark density is relatively high, because of the increased risk of feature misassociation.
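The abstract above says the safety bound accounts for the probability of misassociation, but the fixed-lag smoothing derivation itself is not reproduced here. As a generic illustration only, the sketch below shows the common hypothesis-weighted structure in which a fault prior, such as a misassociation probability, enters a bound on the probability that the position error exceeds an alert limit. The Gaussian error model, function names, and example numbers are assumptions and are not the paper's method.

```python
from math import erfc, sqrt

def hmi_risk_bound(alert_limit, sigma_fault_free, fault_probs, fault_sigmas):
    """Hypothesis-weighted bound on the probability of hazardously misleading
    information (position error exceeding the alert limit), assuming zero-mean
    Gaussian errors under each hypothesis. Illustrative sketch only.

    fault_probs  : prior probability of each fault hypothesis
                   (e.g., a feature-misassociation event)
    fault_sigmas : error standard deviation conditioned on each hypothesis
    """
    def tail(sigma):                      # P(|e| > alert_limit) for e ~ N(0, sigma^2)
        return erfc(alert_limit / (sigma * sqrt(2.0)))

    p_fault_free = 1.0 - sum(fault_probs)
    risk = p_fault_free * tail(sigma_fault_free)
    for p_f, sigma_f in zip(fault_probs, fault_sigmas):
        risk += p_f * tail(sigma_f)       # conservative per-hypothesis contribution
    return risk

# Example: one misassociation hypothesis with prior 1e-3 that inflates the error.
risk = hmi_risk_bound(alert_limit=1.0, sigma_fault_free=0.15,
                      fault_probs=[1e-3], fault_sigmas=[0.6])
```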
  5. This paper presents two methods, jtop (a GUI version of tegrastats) and Nsight Systems, for profiling NVIDIA Jetson embedded GPU devices on a model race car, a convenient platform for prototyping and field-testing autonomous driving algorithms. The two profilers analyze the power consumption, CPU/GPU utilization, and run time of CUDA C threads on a Jetson TX2 in five different working modes. The performance differences among the five modes are demonstrated using three example programs: vector add in C and CUDA C, a simple ROS (Robot Operating System) package implementing a wall-follow algorithm in Python, and a more complex ROS package implementing a particle filter algorithm for SLAM (Simultaneous Localization and Mapping). The results show that these tools are effective means for selecting the operating mode of embedded GPU devices.
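The paragraph above compares power modes using jtop and Nsight Systems, both of which sit on top of the data exposed by the stock tegrastats utility. As a minimal sketch of how one might script the same kind of utilization logging for repeated experiments, the Python snippet below records raw tegrastats output and extracts the GPU load field. The `--interval` flag and the GR3D_FREQ field are present on recent JetPack releases, but the exact output format varies, so this is an assumption-laden sketch rather than a drop-in tool.

```python
import re
import subprocess

def log_tegrastats(duration_s=30, interval_ms=1000, logfile="tegrastats.log"):
    """Record raw tegrastats output for offline comparison of power modes.

    Stores every raw line in `logfile` and returns a crude list of GPU-load
    percentages parsed from the GR3D_FREQ field (where present)."""
    proc = subprocess.Popen(
        ["tegrastats", "--interval", str(interval_ms)],
        stdout=subprocess.PIPE, text=True)
    gpu_load = []
    try:
        with open(logfile, "w") as f:
            for _ in range(int(duration_s * 1000 / interval_ms)):
                line = proc.stdout.readline()
                f.write(line)
                m = re.search(r"GR3D_FREQ (\d+)%", line)
                if m:
                    gpu_load.append(int(m.group(1)))
    finally:
        proc.terminate()                  # stop tegrastats when logging is done
    return gpu_load
```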