
High-Speed Stereo Visual SLAM for Low-Powered Computing Devices

Cited 1 time in Web of Science; cited 1 time in Scopus

Kumar, Ashish; Park, Jaesik; Behera, Laxmidhar

Issue Date
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: IEEE Robotics and Automation Letters, Vol. 9, No. 1, pp. 499-506
We present Jetson-SLAM, an accurate, GPU-accelerated stereo visual SLAM design. It exhibits frame-processing rates above 60 FPS on NVIDIA's low-powered 10 W Jetson-NX embedded computer and above 200 FPS on desktop-grade 200 W GPUs, even in the stereo configuration and in the multi-scale setting. Our contributions are threefold: (i) a Bounded Rectification technique that prevents many non-corner points from being tagged as corners in FAST detection, improving SLAM accuracy; (ii) a novel Pyramidal Culling and Aggregation (PyCA) technique that yields robust features while suppressing redundant ones at high speed by harnessing the GPU. PyCA uses our new Multi-Location-Per-Thread (MLPT) culling strategy and Thread-Efficient Warp-Allocation (TEWA) scheme for the GPU, enabling Jetson-SLAM to achieve high accuracy and speed on embedded devices; (iii) the Jetson-SLAM library achieves resource efficiency through a data-sharing mechanism. Our experiments on three challenging datasets (KITTI, EuRoC, and KAIST-VIO) and two highly accurate SLAM backends (Full-BA and ICE-BA) show that Jetson-SLAM is the fastest available accurate, GPU-accelerated SLAM system (Fig. 1).

Fig. 1. (a) Output of Jetson-SLAM's GPU-accelerated and resource-efficient frontend-middle-end design, (b) the output trajectory, (c) frames-per-second benchmarking on the Jetson-NX embedded computer, and (d) SLAM performance on a KITTI sequence.
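The Bounded Rectification contribution modifies FAST corner detection, whose standard segment test is sketched below. Note this is a minimal illustration of plain FAST-n only: the abstract does not specify the rectification rule itself, and the threshold `t`, arc length `n`, and the helper name `is_fast_corner` are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the standard FAST-n segment test that Bounded
# Rectification builds on (the rectification rule itself is not given
# in the abstract). A pixel p is a corner if at least n contiguous
# pixels on a radius-3 Bresenham circle are all brighter than p + t
# or all darker than p - t.

# The 16 pixel offsets of the radius-3 Bresenham circle around p.
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast_corner(img, x, y, t=20, n=9):
    """Return True if pixel (x, y) passes the FAST-n segment test.

    `img` is a row-major 2D list of intensities; (x, y) must be at
    least 3 pixels from every border so the circle stays in bounds.
    """
    p = img[y][x]
    ring = [img[y + dy][x + dx] for dx, dy in CIRCLE]
    for sign in (+1, -1):                   # +1: brighter arc, -1: darker arc
        flags = [sign * (v - p) > t for v in ring]
        run = 0
        for f in flags + flags[:n - 1]:     # extend list to handle wrap-around arcs
            run = run + 1 if f else 0
            if run >= n:
                return True
    return False
```

For example, an isolated bright pixel on a dark background passes the test (all 16 circle pixels are darker than p - t), while a pixel inside a uniform region does not; the paper's point is that the plain test also fires on many non-corner structures, which Bounded Rectification suppresses.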
Related Researcher

  • College of Engineering
  • Dept. of Computer Science and Engineering
Research Area: Computer Graphics, Computer Vision, Machine Learning


Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.