
BEV-LIO(LC)

BEV Image Assisted LiDAR-Inertial Odometry with Loop Closure

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2025).


Code arXiv YouTube Bilibili

Pipeline

1. Prerequisites

1.1 Ubuntu and ROS

Ubuntu == 20.04

ROS == Noetic. ROS Installation

(Other versions haven't been tested)

1.2 PCL, Eigen && GTSAM

PCL >= 1.8. Follow PCL Installation.

Eigen >= 3.3.4. Follow Eigen Installation.

GTSAM >= 4.0.0 (tested on 4.0.0-alpha2).
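If your system does not provide a suitable GTSAM package, it can be built from source. A minimal sketch; the tag matches the version tested above, adjust it if you use a different GTSAM release:

    # build and install GTSAM from source
    git clone https://github.com/borglab/gtsam.git
    cd gtsam
    git checkout 4.0.0-alpha2
    mkdir build && cd build
    cmake ..
    make -j$(nproc)
    sudo make install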

1.3 livox_ros_driver

Follow livox_ros_driver Installation.

Remarks:

  • Because FAST-LIO primarily supports Livox-series LiDARs, the livox_ros_driver must be installed and sourced before running any launch file.
  • How to source? The easiest way is to add the line source $Livox_ros_driver_dir$/devel/setup.bash to the end of ~/.bashrc, where $Livox_ros_driver_dir$ is the directory of the livox_ros_driver workspace (the ws_livox directory if you followed the Livox official documentation); see the example below.
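A minimal example, assuming the livox_ros_driver workspace is located at ~/ws_livox:

    # add the driver's setup script to ~/.bashrc so every new shell sources it
    echo "source ~/ws_livox/devel/setup.bash" >> ~/.bashrc
    # apply it to the current shell as well
    source ~/.bashrc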

1.4 CUDA && LibTorch

We assume you have already installed CUDA; check this link for the CUDA Toolkit version and install it from here.
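If you are unsure which CUDA version is installed, you can check it before choosing the matching LibTorch build:

    # print the installed CUDA Toolkit (nvcc) version
    nvcc --version
    # print the driver version and the highest CUDA runtime it supports
    nvidia-smi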

For LibTorch, follow the libtorch installation guide; here we use Stable-Linux-LibTorch-C++/Java-CUDA 11.8.

If you encounter the following error:

    BEV-LIO-LC/include/REM.hpp:122:107: error: no matching function for call to ‘torch::nn::functional::GridSampleFuncOptions::mode(const torch::enumtype::kBicubic&)’
      122 | auto options = torch::nn::functional::GridSampleFuncOptions().align_corners(true).mode(torch::kBicubic);

change the mode_t typedef at line 19 of libtorch/include/torch/csrc/api/include/torch/nn/options/vision.h so that it includes kBicubic:

    typedef std::variant<enumtype::kBilinear, enumtype::kNearest, enumtype::kBicubic> mode_t;

Remember to change the LibTorch path in CMakeLists.txt to your own.
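If the CMakeLists.txt locates LibTorch through find_package(Torch), you can alternatively point CMake at your LibTorch directory at build time. This is only a sketch and has not been verified against this repository's CMakeLists.txt; the path is a placeholder:

    # pass the extracted LibTorch directory to CMake when building
    catkin_make -DCMAKE_PREFIX_PATH=/path/to/libtorch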

1.5 Model

We directly use the model from BEVPlace++; check /src/models/tool. You can use turn.py to convert your trained model into the .pt format used by our code at src/BEV_LIO/models/tool.

Then change the path of resnet_weights.pth at line 86 in REM.hpp.

2. Build

2.1 Build from source

Clone the repository and catkin_make:

    cd ~/$A_ROS_DIR$/src
    git clone https://github.com/HxCa1/BEV-LIO-LC.git
    cd ..
    catkin_make
    source devel/setup.bash
  • Remember to source the livox_ros_driver before building (see 1.3 livox_ros_driver).
  • If you want to use a custom build of PCL, add the following line to ~/.bashrc: export PCL_ROOT={CUSTOM_PCL_PATH}

3. Running

3.1 For MCD

The MCD dataset can be downloaded here.

To run an ntu sequence:

    roslaunch bev_lio_lc mcd_ntu.launch

To run a kth or tuhh sequence:

    roslaunch bev_lio_lc mcd_kth_tuhh.launch

3.2 For NCD

The NCD dataset can be downloaded here.

To run a sequence:

    roslaunch bev_lio_lc NCD.launch

3.3 For M2DGR

The M2DGR dataset can be downloaded here.

To run a sequence:

    roslaunch bev_lio_lc m2dgr.launch
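The launch files above start the odometry; if your launch file does not also play the data, the downloaded sequence can be played back from a rosbag in a second terminal. A minimal sketch, with the bag path as a placeholder and assuming its topics match what the launch file expects:

    # in a second terminal, after the launch file is running
    rosbag play /path/to/sequence.bag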

4. Citation

Please consider citing our work if you find our code or paper useful:

@inproceedings{cai2025bev,
title={BEV-LIO(LC): BEV Image Assisted LiDAR-Inertial Odometry with Loop Closure},
author={Haoxin Cai and Shenghai Yuan and Xinyi Li and Junfeng Guo and Jianqi Liu},
booktitle={Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
year={2025},
address={Hangzhou, China}
}

5. Acknowledgments

Thanks to the authors of BEVPlace++, FAST-LIO2, COIN-LIO, MapClosures and FAST-LIO-SAM for open-sourcing their outstanding work.

For the issue of FAST-LIO-SAM crashing during long-term runs, we refer to FAST-LIO-SAM-LOOP.

  • L. Luo, S.-Y. Cao, X. Li, J. Xu, R. Ai, Z. Yu, and X. Chen, “BEVPlace++: Fast, robust, and lightweight LiDAR global localization for unmanned ground vehicles,” IEEE Transactions on Robotics (T-RO), 2025.
  • W. Xu, Y. Cai, D. He, J. Lin, and F. Zhang, “FAST-LIO2: Fast direct LiDAR-inertial odometry,” IEEE Transactions on Robotics, vol. 38, no. 4, pp. 2053–2073, 2022.
  • P. Pfreundschuh, H. Oleynikova, C. Cadena, R. Siegwart, and O. Andersson, “COIN-LIO: Complementary intensity-augmented LiDAR inertial odometry,” in 2024 IEEE International Conference on Robotics and Automation (ICRA), 2024, pp. 1730–1737.
  • S. Gupta, T. Guadagnino, B. Mersch, I. Vizzo, and C. Stachniss, “Effectively detecting loop closures using point cloud density maps,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2024.
  • J. Wang, “FAST-LIO-SAM: FAST-LIO with smoothing and mapping,” https://github.com/kahowang/FAST_LIO_SAM, 2022.
