Validation of an IMU-camera fusion algorithm using an industrial robot
DOI:
https://doi.org/10.33414/rtyc.37.101-111.2020

Keywords:
sensor fusion, inertial measurement unit, monocular camera, industrial robot, error-state Kalman filter

Abstract
The integration of a down-looking camera with an inertial measurement unit (IMU) makes it possible to build a lightweight, low-cost pose estimation system for unmanned aerial vehicles (UAVs) and micro-UAVs (MAVs). Recently, the authors developed a fusion filter that combines IMU and exteroceptive sensor measurements to estimate position and orientation. The estimate is intended for use in the outer control loop of a UAV for position control. This work presents an experimental setup that tests the algorithm with an industrial robot, which produces accurate planar trajectories and serves as a safe alternative to testing on real UAVs. The IMU-camera fusion estimates of linear position and linear velocity show errors small enough for the algorithm to be integrated on real UAVs.
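The abstract describes fusing high-rate IMU data with lower-rate camera position fixes. As a minimal sketch of that idea (not the authors' error-state filter, and with illustrative noise values), the following linear Kalman filter propagates a one-axis position/velocity state using measured acceleration and corrects it whenever a camera position measurement arrives:

```python
import numpy as np

# Hypothetical one-axis IMU-camera fusion sketch.
# State x = [position, velocity]; IMU acceleration drives the prediction,
# camera position fixes drive the update. All tuning values are assumptions.
dt = 0.01                      # IMU sample period [s]
F = np.array([[1.0, dt],       # constant-velocity state transition
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # acceleration input model
              [dt]])
H = np.array([[1.0, 0.0]])     # camera measures position only
Q = 1e-4 * np.eye(2)           # process noise (IMU uncertainty)
R = np.array([[1e-2]])         # measurement noise (camera uncertainty)

x = np.zeros((2, 1))           # initial state estimate
P = np.eye(2)                  # initial covariance

def predict(x, P, a):
    """Propagate the state with a measured acceleration a."""
    x = F @ x + B * a
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the state with a camera position measurement z."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated run: constant 1 m/s^2 acceleration, camera updates at 10 Hz
true_pos = lambda t: 0.5 * 1.0 * t**2
for k in range(1, 101):
    x, P = predict(x, P, 1.0)
    if k % 10 == 0:                      # one camera fix per 10 IMU steps
        x, P = update(x, P, true_pos(k * dt))

print(abs(x[0, 0] - true_pos(1.0)))     # position error after 1 s
```

An error-state formulation, as named in the keywords, instead filters the small deviation between the true state and an IMU-integrated nominal state, which keeps the filter linearisation valid for orientation; the predict/correct cycle above is structurally the same.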