The Visual-Inertial-Dynamical UAV Dataset

03/20/2021 ∙ by Kunyi Zhang, et al. ∙ Zhejiang University

Recently, the community has witnessed numerous datasets built for developing and testing state estimators. However, for some applications such as aerial transportation or search-and-rescue, the contact force or other disturbances must be perceived for robust planning and robust control, which is beyond the capacity of these datasets. This paper introduces a Visual-Inertial-Dynamical (VID) dataset that not only targets traditional six-degrees-of-freedom (6DOF) pose estimation but also provides the dynamical characteristics of the flight platform for external force perception or dynamics-aided estimation. The VID dataset contains hardware-synchronized imagery and inertial measurements, with accurate ground truth trajectories for evaluating common visual-inertial estimators. Moreover, the proposed dataset highlights rotor speed and motor current measurements, dynamical inputs, and ground truth 6-axis force data for evaluating external force estimation. To the best of our knowledge, the proposed VID dataset is the first public dataset containing visual-inertial and complete dynamical information for pose and external force evaluation. The dataset and related open-source files are available at <https://github.com/ZJU-FAST-Lab/VID-Dataset>.


I Introduction

As the demand for GPS-denied navigation increases, several datasets [1, 2, 3, 4, 5] for the state estimation of aerial robots have been proposed. These datasets typically provide exteroceptive sensing, including imagery and range measurements, and proprioceptive data such as IMU measurements, for developing onboard state estimation algorithms.

The EuRoC dataset [1] pioneered visual-inertial benchmarking for drones, presenting a collection of sequences with accurate ground truth. Although the EuRoC dataset significantly pushed the boundary of visual SLAM, it only contains motions at moderate speed and is thus not applicable to aggressive flights. The Zurich Urban MAV dataset [2] contains long-distance data recorded from a tethered UAV flying in an urban environment, but without highly accurate ground truth. The UPenn Fast Flight dataset [3] provides challenging sequences with fast motion and changing illumination but lacks ground truth as well. Recently, UZH proposed a first-person-view (FPV) dataset [4] to challenge the capability of existing state estimation algorithms under extreme conditions. This dataset consists of visual, event, and inertial streams collected during aggressive flights. The Blackbird UAV dataset [5] also targets high-speed applications. It provides a rich set of scenes with photorealistic image rendering and also includes rotor tachometer data.

Thanks to the above-mentioned public datasets, state estimation algorithms have improved significantly in the last decade. Early visual odometry systems, such as SVO [6], ORB-SLAM [7], and DSO [8], estimate 6-DoF poses from images alone. Modern IMU-aided visual odometry systems, including MSCKF [9], VINS-Mono [10], and OpenVINS [11], utilize accelerometer and gyroscope measurements as kinematic inputs to obtain more accurate state estimates.

Fig. 1: Quadrotor platform built for the dataset

                          EuRoC     Zurich Urban   UPenn Fast    UZH FPV    Blackbird   Ours
                          MAV [1]   MAV [2]        Flight [3]    [4]        [5]
Camera (Hz)               20        20             40            30/50      120/60      60/60
IMU (Hz)                  200       10             200           500/1000   100         400
Segmentation (Hz)         n/a       n/a            n/a           n/a        60          n/a
mm ground truth (Hz)      100       n/a            n/a           20         360         100
RTK (Hz)                  n/a       n/a            n/a           n/a        n/a         5
Control inputs (Hz)       n/a       n/a            n/a           n/a        190         100
Tachometer (Hz)           n/a       n/a            n/a           n/a        n/a         1000
Ammeter (Hz)              n/a       n/a            n/a           n/a        n/a         1000
6-axis force sensor (Hz)  n/a       n/a            n/a           n/a        n/a         100
TABLE I: UAV dataset comparison. Frequencies are given in Hz and are separated as (general image)/(range image). Control inputs denote the four motors' actual inputs, including target rotor speed and target current. The tachometer and ammeter rows are measurements of rotor speed and motor current. Force data come from a 6-axis force sensor. The Blackbird visual environments are synthesized in photorealistic simulation.

However, the above estimators cannot work for applications where unknown dynamics dominate. For instance, in aerial transportation and delivery, drones are required to operate under a heavy payload and thus form a multi-rigid body [12], whose mathematical model needs to be identified online. In aerial manipulation, a drone equipped with a manipulator [13, 14] also requires precise force feedback to stabilize itself. Moreover, in autonomous flight with severe winds [15, 16], the absence of an accurate external disturbance estimate significantly harms the effectiveness of planning and control. Model-based visual odometry [17] and visual-inertial-dynamics state estimation [18], which simultaneously estimate the pose and external forces, are emerging research topics in the SLAM community. Since a multirotor's dynamical characteristics can be determined entirely from the rotor model, dynamics is considered a new information source that can further help state estimation, especially for highly aggressive motions. As a byproduct, these estimators also obtain the external force, moment, or disturbance applied to the vehicle in real time.

Nevertheless, almost no dataset focuses on drone dynamical characteristics to support the research mentioned above. The Blackbird dataset [5] takes one step towards this by providing additional rotor tachometer data. However, the dynamical model between rotor tachometer readings and propeller thrust cannot wholly describe a multirotor's motion as a rigid body, nor can it accurately estimate the external force and moment in real situations. Moreover, the Blackbird dataset provides the desired motor speeds and synthetic visual measurements, both of which deviate from actual flight conditions. Besides, even in simulation environments, the Blackbird dataset [5] does not contain ground truth data for external forces.

To bridge this gap and lay the foundation for developing dynamics-aided state estimators for multirotors, we herein present a dataset with the complete dynamical characteristics of a quadrotor, i.e., the relations of propeller thrust and reaction moment to rotor speed and motor current. Several different indoor and outdoor sequences are recorded to broaden the application range of multirotors. The distinctive features of the VID dataset compared with others are summarized in TABLE I. Besides, this work makes several unique contributions:

  1. A flight platform with versatile sensors is given, along with fully identified dynamical and inertial parameters.

  2. A sophisticated hardware synchronization scheme for images, IMU, control inputs, rotor speed, and motor current.

  3. A complete public dataset with ground truth measurements of external force and poses.

  4. A set of verifications of the proposed dataset by mainstream state estimation algorithms.

This paper is structured as follows. In Section II, we give a brief introduction to the customized quadrotor platform, which is utilized to collect versatile sensor data. In Section III, we offer essential calibration and dynamics parameters of this flight platform. In Section IV, we evaluate our dataset by conducting several experiments with state-of-the-art algorithms. Finally, we draw a conclusion with known limitations and future extensions of the work.

II Datasets

II-A Flying platform

Fig. 2: Flight platform with coordinates.

The quadrotor platform (shown in Fig. 2) is built upon the DJI M100 (https://www.dji.com/cn/matrice100) with a 650 mm diagonal wheelbase. To obtain high-precision measurements of motor speed and current, we replace the propulsion system of the M100 with refitted DJI M3508 motors (https://www.robomaster.com/zh-CN/products/components/general/M3508) and DJI C620 ESCs. An Intel RealSense D435i camera (https://www.intelrealsense.com/depth-camera-d435i) providing depth and stereo images and a DJI N3 (https://www.dji.com/cn/n3) with an embedded IMU are mounted on the platform; the latter also serves as the flight controller.

Here, we list the common parameters of the quadrotor and onboard sensors in TABLE II and show the relevant coordinate frames in Fig. 2.

Fig. 3: Hardware composition.

The composition of the onboard hardware, including a customized board with two MCUs for motor control, data collection, and time synchronization, is shown in Fig. 3.

II-B Data sources

II-B1 Rotor tachometer and current ammeter

The M3508 motor is equipped with a pair of orthogonal Hall sensors, which feed back rotor speed measurements, while the C620 ESC samples the motor current. Both messages are sent to the CAN bus by the C620 ESC at 1000 Hz. The MCU2 monitors the CAN messages and sends them to the Manifold2 via UART.
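
As a concrete illustration of this data path, the sketch below decodes one such ESC feedback frame on the companion-computer side. The byte layout (rotor angle, rotor speed in rpm, torque-current counts, temperature) follows the publicly documented C620 feedback frame, but the exact layout and scaling used for this dataset should be taken from the released tools; treat this as an assumption-laden example rather than the dataset's own decoder.

```python
import struct

def decode_c620_feedback(can_id, data):
    """Decode one 8-byte C620 ESC feedback frame (assumed layout: big-endian
    rotor angle, rotor speed [rpm], torque current [raw counts], temperature)."""
    angle_raw, speed_rpm, current_raw, temp_c, _ = struct.unpack(">HhhBB", bytes(data))
    return {
        "channel": can_id - 0x200,      # feedback IDs assumed to start at 0x201
        "angle_raw": angle_raw,         # mechanical rotor angle, raw counts
        "speed_rpm": speed_rpm,         # signed rotor speed
        "current_raw": current_raw,     # signed current, raw ADC counts
        "temperature_c": temp_c,
    }
```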

II-B2 Control input

The DJI N3 flight controller continuously generates control commands and sends them out as PWM waves. The MCU1, which is also attached to the CAN bus, measures the PWM pulse width to obtain a target rotor speed, and then uses a feedforward PID controller to compute a target motor current, which is sent to the C620 ESCs through the CAN bus along with the target rotor speed. The MCU2, on the same CAN bus, converts the above messages into UART packets and sends them to the Manifold2. The target rotor speed is expressed in rpm (revolutions per minute).
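
The speed-to-current conversion in MCU1 can be pictured with the minimal feedforward PID sketch below. The gains, the quadratic feedforward term, and the 16384-count command limit are illustrative assumptions, not the values used in the actual firmware.

```python
class FeedforwardPID:
    """Rotor-speed loop: target rpm -> target current command (raw counts).
    All gains and the feedforward term are illustrative placeholders."""
    def __init__(self, kp=5.0, ki=0.5, kd=0.0, kff=1.5e-5, dt=0.001, limit=16384):
        self.kp, self.ki, self.kd, self.kff = kp, ki, kd, kff
        self.dt, self.limit = dt, limit
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, target_rpm, measured_rpm):
        err = target_rpm - measured_rpm
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        feedforward = self.kff * target_rpm * abs(target_rpm)  # rough steady-state current
        u = feedforward + self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(-self.limit, min(self.limit, u))             # saturate to command range
```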

II-B3 Camera and IMU

The IMU embedded in the DJI N3 runs at 400 Hz, while the imagery and range measurements come from the Intel RealSense D435i at up to 90 Hz theoretically.

II-B4 6-axis force sensor

As shown in Fig. 4, we attach a 6-axis force/torque sensor, the Robotiq FT 300 (https://robotiq.com/products/ft-300-force-torque-sensor), to the indoor ground. In this sequence, a rope with one end fixed to the force sensor is used to pull the drone, providing measurements of the external force acting on the drone.

Also, the force sensor can be utilized to identify the dynamical characteristics of a propulsion unit.

Fig. 4: Experiment place. (1) Outdoor scene in a parking lot. (2) Indoor scene when recording data.

II-C Hardware time synchronization

To guarantee the fidelity of the data, all onboard sensors are hardware-synchronized. The synchronized devices and the dataflow of the proposed scheme are shown in Fig. 3.

The N3 IMU generates a pulse-per-second (PPS) signal and marks the IMU sample corresponding to each pulse (one pulse every 400 IMU samples). The RealSense camera generates a pulse at the moment it starts to expose. The MCU2 monitors these pulses and stamps each detected pulse with its internal time. Before the MCU2 sends a motor information message, it also stamps the message with its internal time.

When the N3 generates a PPS pulse, the Manifold2 receives the pulse detection messages from both the IMU and the MCU2, and the offset between the IMU and MCU2 timelines is calculated. Since the images and motor information are stamped with MCU2 time beforehand, we can then re-stamp them in the IMU timeline.
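
A minimal sketch of this re-stamping step is given below, assuming a constant offset between the two clocks over the interval between PPS pulses (clock drift is absorbed by repeating the correction at every pulse).

```python
def mcu2_to_imu_time(pps_imu_time, pps_mcu2_time, mcu2_stamp):
    """Re-stamp an MCU2-time measurement into the IMU timeline using the most
    recent PPS pulse observed on both clocks (constant-offset model)."""
    offset = pps_imu_time - pps_mcu2_time
    return mcu2_stamp + offset

# Example: a camera exposure stamped by MCU2 is mapped into the IMU timeline.
imu_stamp = mcu2_to_imu_time(pps_imu_time=12.0025, pps_mcu2_time=8.5003,
                             mcu2_stamp=8.5170)
```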

Property                       Value                            Property                        Value
Diagonal wheelbase             650 mm                           Propeller model                 13×4.4
Tachometer noise               1                                Tachometer noise density        1.459
Ammeter noise                  20/16384                         Ammeter noise density           4.950e-2
Gyro noise density             0.001                            Gyro random-walk bias           3.500e-5
Acc. noise density             0.100                            Acc. random-walk bias           4.000e-3
Force sensor noise             [0.1, 0.1, 0.1]                  Moment sensor noise             [5e-3, 5e-3, 3e-3]
Translation IMU → camera       [1.8875e-1, 1.75e-3, 5.55e-2]    Translation IMU → geometric     [0.0, 0.0, 9.25e-3]
Image size                     640 px × 480 px                  Stereo baseline                 50 mm
TABLE II: Quadrotor characteristics. The characteristics of the rotor tachometer and motor ammeter are statistically calibrated, the IMU calibration results are obtained by Allan covariance analysis, and the dynamical parameters are measured in static experiments. "Translation A → B" denotes the translation from frame A to frame B expressed in frame A; the camera extrinsic translation is taken from the CAD model. The geometric center of the quadrotor is defined with the same orientation as the IMU frame and is obtained from the CAD model.

II-D Ground truth

There are various scenarios with ground truth provided by the motion capture system indoors and RTK GNSS outdoors.

As shown in Fig. 1, the quadrotor is equipped with an airborne GNSS receiver, the HITARGET-Sky2 (https://www.zhdgps.com), which subscribes to the Qianxun-FindCM service (https://mall.qxwz.com/market/services/FindCM), providing positioning precision up to 1 cm at a frequency of 5 Hz for outdoor pose ground truth. To obtain better localization data, both GPS and RTK GNSS receivers are mounted on the quadrotor.

In indoor environments, we use a VICON Vantage motion capture system with the Nexus software (https://www.vicon.com/hardware/cameras/vantage) to provide accurate ground truth of rotation and translation at 100 Hz.

II-E Data collection

All sensor data are collected on the onboard computer under the Robot Operating System (ROS). In addition to the timestamped raw data packaged in ROS, we provide tools to convert the data to a time-synchronized format.

Raw data streams are provided as rosbags, along with time-synchronization conversion tools and synchronized rosbags. Other detailed parameters, such as per-sequence inertial parameters and hardware configurations, are available at https://github.com/ZJU-FAST-Lab/VID-Dataset.
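
As a starting point for consuming the released rosbags, the snippet below iterates over a few topics with the standard rosbag Python API. The topic and file names are hypothetical placeholders; the actual names are documented in the dataset repository.

```python
import rosbag

# Hypothetical topic names -- check the dataset repository for the real ones.
TOPICS = ["/camera/infra1/image_rect_raw", "/djiros/imu",
          "/motor_speed", "/motor_current"]

with rosbag.Bag("outdoor_round_noyaw.bag") as bag:   # file name is illustrative
    for topic, msg, t in bag.read_messages(topics=TOPICS):
        # After conversion, msg.header.stamp carries the hardware-synchronized time.
        print(topic, t.to_sec())
```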

Environment   Feature           Trajectory    Weight (g)   Duration (s)
Outdoor       Fast, with yaw    Rectangle     3547.2       113.21
Outdoor       Slow, with yaw    Rectangle     3547.2       175.38
Outdoor       With yaw          Round         3541.7       147.34
Outdoor       Without yaw       Round         3541.7       105.85
Outdoor       Without yaw       8-character   3541.7       184.29
Outdoor       Without yaw       8-character   3541.4       243.73
Night         Fast, with yaw    Rectangle     3460.0       133.58
Night         Slow, with yaw    Rectangle     3541.6       179.48
Night         Without yaw       Round         3541.6       82.07
Night         Without yaw       8-character   3547.7       121.87
Indoor        Loadless          Hover         3096.1       79.04
Indoor        Loadless          Round         3096.1       117.89
Indoor        Loadless          8-character   3096.1       109.17
Indoor        Loaded            Hover         3372.2       80.11
Indoor        Loaded            Round         3372.2       107.15
Indoor        Loaded            8-character   3372.2       138.41
Indoor        Rope pulled       Random        3101.5       126.53
Indoor        Rope pulled       Random        3101.5       155.38
Motor test    Channel 1         --            --           45.04
Motor test    Channel 2         --            --           45.08
Motor test    Channel 3         --            --           43.00
Motor test    Channel 4         --            --           43.84
TABLE III: Dataset information. Each row lists a sequence's environment, motion feature, trajectory shape, take-off weight, and duration. The per-sequence availability of IMU, imagery, dynamical data, RTK, ground truth, and force sensor streams is documented in the dataset repository.

III Data validation

III-A Calibration

For fusing heterogeneous sensor data, the temporal and spatial calibration of the various sensors, their individual characteristics, and the inertial parameters of the flight platform are vital.

We use Kalibr [19] to calibrate the extrinsic parameters between the N3 IMU and the D435i left grayscale camera, and compare the calibration results with the 3D CAD drawings. Then, Kalibr-allan (https://github.com/ethz-asl/kalibr/wiki/IMU-Noise-Model) is utilized to calibrate the IMU intrinsic noise parameters. Besides, we also characterize the noise of the rotor tachometers and current ammeters.
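
For reference, the overlapping Allan deviation underlying this IMU noise calibration (and applicable in the same way to the tachometer and ammeter signals) can be computed with the generic sketch below; this is not the dataset's calibration script.

```python
import numpy as np

def allan_deviation(samples, fs, taus):
    """Overlapping Allan deviation of a 1-D rate signal sampled at fs [Hz],
    evaluated at the averaging times taus [s]."""
    x = np.cumsum(samples) / fs                 # integrate rate -> angle/velocity
    adev = []
    for tau in taus:
        m = int(round(tau * fs))                # samples per averaging window
        if m < 1 or 2 * m >= len(x):
            adev.append(np.nan)
            continue
        d = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]
        adev.append(np.sqrt(0.5 * np.mean(d ** 2)) / tau)
    return np.array(adev)

# Usage: gyro_z sampled at 400 Hz, averaging times from 10 ms to 100 s.
# taus = np.logspace(-2, 2, 50); sigma = allan_deviation(gyro_z, 400.0, taus)
```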

Inertial parameters obtained by physical measurement are provided for each sequence of the dataset, in contrast with the 3D CAD drawings. More details can be found on the website.

III-B Rotor speed and motor current

As shown in Fig. 5(1)(2), the measured rotor speed and motor current closely track their targets, except that when the battery voltage is relatively low, neither can reach the maximum target.

Also, the measured motor current is slightly noisier than the target motor current.

Fig. 5: Propulsion unit test. (1) Comparison between target and measured rotor speed; (2) comparison between target and measured motor current; (3) comparison between the thrust measured by the force sensor and the thrust fitted from rotor speed; (4) comparison between the moment measured by the force sensor and the moments fitted from rotor speed and from motor current.
Fig. 6: Relationship between rotor speed and motor current, with the measured values and the fitted model shown, respectively.

III-C Dynamic system modeling

Multi-rotor systems usually use fixed-pitch propellers. To identify the propulsion unit, we mount the motor and propeller on the 6-axis force sensor. This paper models the dynamical system and expresses the thrust T (unit: N) and moment M (unit: N·m) of a single propeller using the following formulas [20]:

    T = k_f · ω²                          (1)
    M = k_m · ω² + J_r · ω̇                (2)

where T and M represent the thrust and moment provided by a single propeller, and k_f, k_m, J_r, and ω denote the thrust coefficient, moment coefficient, rotor moment of inertia, and rotor speed, respectively. It is worth mentioning that the propeller's thrust and moment are perpendicular to the propeller plane.
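
As a worked example of Eqs. (1)-(2), the sketch below converts a logged rotor-speed trace into per-propeller thrust and reaction moment. The coefficient values are placeholders of roughly the magnitude identified later in TABLE IV (plus an assumed rotor inertia), not the calibrated values of any specific channel.

```python
import numpy as np

def propeller_wrench(omega, k_f=2.7e-7, k_m=5.0e-9, J_r=1.0e-5, dt=1e-3):
    """Per-propeller thrust T = k_f*omega^2 and moment M = k_m*omega^2 + J_r*d(omega)/dt.
    omega is a time series of rotor speeds in the same units used when fitting
    k_f and k_m; all coefficient values here are illustrative placeholders."""
    omega = np.asarray(omega, dtype=float)
    omega_dot = np.gradient(omega, dt)          # numerical derivative of rotor speed
    thrust = k_f * omega ** 2
    moment = k_m * omega ** 2 + J_r * omega_dot
    return thrust, moment
```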

Note that, as shown in Fig. 5, it is difficult to guarantee the maximum rotor speed and motor current due to the accuracy of the hardware installation and the voltage drop.

Apart from this, the armature current of each motor is also provided as a redundant input source. According to electrical machine theory [21], we can build a moment model with the following equation:

    τ = k_c · (i − i_0) + τ_0              (3)

where k_c is the current-moment coefficient, and i_0 and τ_0 represent the actual no-load current and no-load moment, respectively. The motor output moment τ equals the propeller reaction moment M as long as the rotor speed is constant (ω̇ = 0). In this paper, all of the above coefficients are measured in static physical tests, as shown in TABLE IV.

Besides, Figs. 5 and 6 also reveal the relationship between rotor speed and motor current (obtained by low-pass filtering and identification), showing that either can be used to calculate thrust and moment.
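
The identification itself reduces to a small least-squares problem; a minimal sketch for the thrust coefficient of Eq. (1), assuming time-aligned and low-pass-filtered force-sensor and tachometer logs from the motor-test sequences, is given below.

```python
import numpy as np

def fit_thrust_coefficient(omega, thrust_measured):
    """Least-squares fit of k_f in T = k_f * omega^2 from a motor-test log.
    omega: rotor speed samples; thrust_measured: force-sensor thrust samples
    (already low-pass filtered and time-aligned)."""
    w2 = np.asarray(omega, dtype=float) ** 2
    T = np.asarray(thrust_measured, dtype=float)
    k_f = np.dot(w2, T) / np.dot(w2, w2)            # closed-form 1-D least squares
    residual_rms = np.sqrt(np.mean((k_f * w2 - T) ** 2))
    return k_f, residual_rms
```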

Motor channel   k_f           k_m            k_c            τ_0
1               2.6890e-7     -5.6343e-9     -9.8316e-6     -2.7506e-5
2               2.8190e-7      4.7180e-9      8.5648e-6      2.2204e-5
3               2.7263e-7     -5.7012e-9     -9.7166e-6     -2.7694e-5
4               2.7741e-7      4.8260e-9      9.8131e-6      2.1910e-5
TABLE IV: Propeller dynamic model coefficients. The channel number indexes the quadrotor's rotors/motors, counting counterclockwise from the upper-right corner. A negative sign means the opposite direction.

IV Experiments

We conduct two experiments to verify the effectiveness of the proposed dataset: a) a comparison of the state-of-the-art visual-inertial odometry algorithms VINS-Fusion [10], VIMO [17], and VID-Fusion [18], which are currently among the top open-source vision-based odometry estimators; b) an additional test of the latest external force estimation algorithms, VIMO [17] and our VID-Fusion [18].

Fig. 7: Evaluation on the outdoor-round-noyaw sequence.

IV-A Pose estimation

VINS-Fusion is currently a top open-source IMU-aided odometry state estimation algorithm. We compare it against the two model-based algorithms, with loop closure disabled, on a selected outdoor sequence from our dataset. It is worth mentioning that the measured rotor speeds and inertial parameters are fed into the model-based algorithms as dynamics inputs.

As an example, we show the trajectories estimated by the above algorithms against the ground truth in Fig. 7, and find no obvious difference in pose estimation accuracy.

IV-B Evaluation metric

We compute the root mean square error (RMSE) over all time indices of the disturbance components as:

    RMSE = √( (1/N) · Σ_{k=1}^{N} ‖x̂_k − x_k‖² )          (4)

where x can be the 3-D external force or moment, or the 6-D external generalized force, expressed in the world coordinate system, and x_k and x̂_k are the measured and estimated values of the external disturbance at time index k, respectively.
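
In code, Eq. (4) amounts to the following few lines; the estimated and ground-truth disturbance series are assumed to be already associated by timestamp.

```python
import numpy as np

def disturbance_rmse(est, gt):
    """RMSE of Eq. (4) over all time indices; est and gt are (N, d) arrays of the
    estimated and measured external disturbance (d = 3 for force, 6 for wrench)."""
    err = np.asarray(est, dtype=float) - np.asarray(gt, dtype=float)
    return np.sqrt(np.mean(np.sum(err ** 2, axis=1)))
```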

IV-C Disturbance estimation

Since there are few available external force estimation algorithms, we again select VIMO and VID-Fusion to evaluate disturbance estimation. Likewise, we show the external force estimated by these two algorithms against the ground truth on one sequence in Fig. 8.

VIMO results in an RMSE of for external force estimation, while VID-Fusion obtains a better result of .

Fig. 8: The external force in the rope experiment.

V Discussion and future work

V-A Conclusion

In this paper, we develop a quadrotor platform capable of carrying various time-synchronized sensors. Based on it, we present a novel dataset, the VID UAV dataset, for state and external force estimation, safe navigation, robust control, and other applications. Through several experiments, we validate the utility of the proposed dataset.

With this dataset, we sincerely intend to foster scientific research and industrial applications in relevant fields, and hope it will spur the enhancement of relevant algorithms.

V-B Limitations

Our dataset still has several known limitations, which need to be addressed in the near future.

  1. In order to acquire more data about the dynamics, we equip the drone with customized motors, which limits the flexibility and load capacity of the quadrotor due to insufficient motor power.

  2. In this dataset, the parameters provided are not complete. For instance, the thrust coefficients are only measured in static environments, and the rotor drag coefficients are difficult to calibrate in various scenes.

  3. Ground truth data of the external moment is absent, since there is currently no safe recording environment.

V-C Future work

With growing interest in UAVs, we wish to extend our dataset further to support more applications under different conditions and with versatile sensors. In the future, we plan to improve the power of the drone so it can carry more perception sensors (such as an event camera or lidar), offer more detailed parameters of the system (such as the centroid position and drag coefficients), record more aggressive sequences in different scenes, and include ground truth of the external moment, which is currently missing.

References

  • [1] M. Burri, J. Nikolic, P. Gohl, T. Schneider, J. Rehder, S. Omari, M. W. Achtelik, and R. Siegwart, “The EuRoC micro aerial vehicle datasets,” The International Journal of Robotics Research, vol. 35, no. 10, pp. 1157–1163, 2016.
  • [2] A. L. Majdik, C. Till, and D. Scaramuzza, “The Zurich urban micro aerial vehicle dataset,” The International Journal of Robotics Research, vol. 36, no. 3, pp. 269–273, 2017.
  • [3] K. Sun, K. Mohta, B. Pfrommer, M. Watterson, S. Liu, Y. Mulgaonkar, C. J. Taylor, and V. Kumar, “Robust stereo visual inertial odometry for fast autonomous flight,” IEEE Robotics and Automation Letters, vol. 3, no. 2, pp. 965–972, 2018.
  • [4] J. Delmerico, T. Cieslewski, H. Rebecq, M. Faessler, and D. Scaramuzza, “Are we ready for autonomous drone racing? The UZH-FPV drone racing dataset,” in 2019 International Conference on Robotics and Automation (ICRA). IEEE, 2019, pp. 6713–6719.
  • [5] A. Antonini, W. Guerra, V. Murali, T. Sayre-McCord, and S. Karaman, “The Blackbird UAV dataset,” The International Journal of Robotics Research, p. 0278364920908331, 2020.
  • [6] C. Forster, M. Pizzoli, and D. Scaramuzza, “SVO: Fast semi-direct monocular visual odometry,” in 2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2014, pp. 15–22.
  • [7] R. Mur-Artal, J. M. M. Montiel, and J. D. Tardos, “ORB-SLAM: A versatile and accurate monocular SLAM system,” IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147–1163, 2015.
  • [8] J. Engel, V. Koltun, and D. Cremers, “Direct sparse odometry,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 40, no. 3, pp. 611–625, 2017.
  • [9] M. Li and A. I. Mourikis, “High-precision, consistent EKF-based visual-inertial odometry,” The International Journal of Robotics Research, vol. 32, no. 6, pp. 690–711, 2013.
  • [10] T. Qin, P. Li, and S. Shen, “VINS-Mono: A robust and versatile monocular visual-inertial state estimator,” IEEE Transactions on Robotics, vol. 34, no. 4, pp. 1004–1020, 2018.
  • [11] P. Geneva, K. Eckenhoff, W. Lee, Y. Yang, and G. Huang, “OpenVINS: A research platform for visual-inertial estimation,” in 2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020, pp. 4666–4672.
  • [12] C. Y. Son, H. Seo, D. Jang, and H. J. Kim, “Real-time optimal trajectory generation and control of a multi-rotor with a suspended load for obstacle avoidance,” IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 1915–1922, 2020.
  • [13] S. Kim, S. Choi, H. Kim, J. Shin, H. Shim, and H. J. Kim, “Robust control of an equipment-added multirotor using disturbance observer,” IEEE Transactions on Control Systems Technology, vol. 26, no. 4, pp. 1524–1531, 2018.
  • [14] D. Lee, D. Jang, H. Seo, and H. Jin Kim, “Model predictive control for an aerial manipulator opening a hinged door,” in 2019 19th International Conference on Control, Automation and Systems (ICCAS), 2019, pp. 986–991.
  • [15] H. Seo, D. Lee, C. Y. Son, C. J. Tomlin, and H. J. Kim, “Robust trajectory planning for a multirotor against disturbance based on Hamilton-Jacobi reachability analysis,” in 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2019, pp. 3150–3157.
  • [16] J. Ji, X. Zhou, C. Xu, and F. Gao, “CMPCC: Corridor-based model predictive contouring control for aggressive drone flight,” arXiv preprint arXiv:2007.03271, 2020.
  • [17] B. Nisar, P. Foehn, D. Falanga, and D. Scaramuzza, “VIMO: Simultaneous visual inertial model-based odometry and force estimation,” IEEE Robotics and Automation Letters, vol. 4, no. 3, pp. 2785–2792, 2019.
  • [18] Z. Ding, T. Yang, K. Zhang, C. Xu, and F. Gao, “VID-Fusion: Robust visual-inertial-dynamics odometry for accurate external force estimation,” arXiv preprint arXiv:2011.03993, 2020.
  • [19] P. Furgale, J. Rehder, and R. Siegwart, “Unified temporal and spatial calibration for multi-sensor systems,” in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013, pp. 1280–1286.
  • [20] J. Svacha, J. Paulos, G. Loianno, and V. Kumar, “IMU-based inertia estimation for a quadrotor using Newton-Euler dynamics,” IEEE Robotics and Automation Letters, vol. 5, no. 3, pp. 3861–3867, 2020.
  • [21] S. Chapman, Electric Machinery Fundamentals. McGraw-Hill, 2012.