Repository Summary
Checkout URI | https://github.com/beltransen/velo2cam_calibration.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2022-03-16 |
Dev Status | MAINTAINED |
CI Status | No Continuous Integration |
Released | UNRELEASED |
Tags | No category tags. |
Contributing | Help Wanted (0), Good First Issues (0), Pull Requests to Review (0) |
Packages
Name | Version |
---|---|
velo2cam_calibration | 1.0.1 |
README
velo2cam_calibration
The velo2cam_calibration software implements a state-of-the-art automatic calibration algorithm for pairs of sensors composed of LiDAR and camera devices, in any possible combination, as described in this paper:
Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups
Jorge Beltrán, Carlos Guindel, Arturo de la Escalera, Fernando García
IEEE Transactions on Intelligent Transportation Systems, 2022
[Paper] [Preprint]
Setup
This software is provided as a ROS package. To install it (a complete command sketch follows this list):
- Clone the repository into your catkin_ws/src/ folder.
- Install the runtime dependencies:
sudo apt-get install ros-<distro>-opencv-apps
- Build your workspace as usual.
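Putting the steps together, a minimal sketch assuming ROS Melodic and a catkin workspace at ~/catkin_ws (substitute your own distribution and paths):
# Sketch only: assumes ROS Melodic and a workspace at ~/catkin_ws
cd ~/catkin_ws/src
git clone https://github.com/beltransen/velo2cam_calibration.git
# Runtime dependency from the step above
sudo apt-get install ros-melodic-opencv-apps
# Build and source the workspace
cd ~/catkin_ws && catkin_make
source devel/setup.bash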
Usage
See HOWTO.md for detailed instructions on how to use this software.
To test the algorithm in a virtual environment, you can launch any of the calibration scenarios included in our Gazebo validation suite.
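For example, a scenario would be started with roslaunch in the usual way; the launch-file name below is a placeholder, as the actual scenario files are listed in HOWTO.md:
# Placeholder launch-file name; HOWTO.md lists the actual Gazebo scenarios
roslaunch velo2cam_calibration <scenario>.launch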
Calibration target
The following picture shows a possible embodiment of the proposed calibration target used by this algorithm and its corresponding dimensional drawing.
Note: Other sizes may be used for convenience. If so, please configure the node parameters accordingly (see the sketch below).
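As an illustration only, such an override might be set through the ROS parameter server before launching the detector nodes; the node and parameter names here are assumptions, so check the package documentation for the real ones:
# Hypothetical node/parameter names, shown only to illustrate overriding a target dimension
rosparam set /lidar_pattern/circle_radius 0.12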
Citation
If you use this work in your research, please consider citing the following paper:
@article{beltran2022,
  author={Beltrán, Jorge and Guindel, Carlos and de la Escalera, Arturo and García, Fernando},
  journal={IEEE Transactions on Intelligent Transportation Systems},
  title={Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups},
  year={2022},
  doi={10.1109/TITS.2022.3155228}
}
A previous version of this tool is available here and was described in this paper.