
Getting Started » Supported Datasets


Contents

The EuRoC MAV Dataset
TUM Visual-Inertial Dataset
RPNG AR Table Dataset
RPNG OpenVINS Dataset
UZH-FPV Drone Racing Dataset
KAIST Urban Dataset
KAIST VIO Dataset

The EuRoC MAV Dataset

The ETH ASL EuRoC MAV dataset [4] is one of the most used datasets in the visual-inertial / simultaneous localization and mapping (SLAM) research literature. The reason for this is its synchronised inertial+camera sensor data and high quality groundtruth. The dataset contains sequences of varying difficulty of a Micro Aerial Vehicle (MAV) flying in an indoor room. Monochrome stereo images are collected by two Aptina MT9V034 global shutter cameras at 20 frames per second, while an ADIS16448 MEMS inertial unit provides linear accelerations and angular velocities at a rate of 200 samples per second.

We recommend that most users start testing on this dataset before moving on to the other datasets that our system supports, or before trying your own collected data. The machine hall datasets have the MAV being picked up at the beginning and then set down; we normally skip this part, but it should be handled by the filter if SLAM features are enabled. Please take a look at the run_ros_eth.sh script for some reasonable default values (they might still need to be tuned).
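As a quick sketch (not the exact commands, which live in run_ros_eth.sh), running a EuRoC sequence with ROS1 might look like the following, assuming the standard ov_msckf subscribe.launch and its euroc_mav config (the launch argument names and the bag path are assumptions):

# terminal 1: start the estimator with the EuRoC configuration (assumed launch arguments)
roslaunch ov_msckf subscribe.launch config:=euroc_mav
# terminal 2: play back the downloaded sequence (path is a placeholder)
rosbag play ~/datasets/euroc_mav/V1_01_easy.bag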

Groundtruth on V1_01_easy

We have found that the groundtruth on the V1_01_easy dataset is not accurate in its orientation estimate. We have recomputed this by optimizing the inertial and vicon readings in a graph to get the trajectory of the IMU (refer to our vicon2gt [16] project). You can find the output at this link, and it is what we normally use to evaluate the error on this dataset.

Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Config
Vicon Room 1 01 | 58 | rosbag, rosbag2 | link | config
Vicon Room 1 02 | 76 | rosbag, rosbag2 | link | config
Vicon Room 1 03 | 79 | rosbag, rosbag2 | link | config
Vicon Room 2 01 | 37 | rosbag, rosbag2 | link | config
Vicon Room 2 02 | 83 | rosbag, rosbag2 | link | config
Vicon Room 2 03 | 86 | rosbag, rosbag2 | link | config
Machine Hall 01 | 80 | rosbag, rosbag2 | link | config
Machine Hall 02 | 73 | rosbag, rosbag2 | link | config
Machine Hall 03 | 131 | rosbag, rosbag2 | link | config
Machine Hall 04 | 92 | rosbag, rosbag2 | link | config
Machine Hall 05 | 98 | rosbag, rosbag2 | link | config

TUM Visual-Inertial Dataset

The TUM Visual-Inertial Dataset [39] is a more recent dataset that was presented to provide a way to evaluate state-of-the-art visual-inertial odometry approaches. As compared to the EuRoC MAV datasets, this dataset provides photometric calibration of the cameras, which had not been available in any other visual-inertial dataset for researchers. Monochrome stereo images are collected by two IDS uEye UI-3241LE-M-GL global shutter cameras at 20 frames per second, while a Bosch BMI160 inertial unit provides linear accelerations and angular velocities at a rate of 200 samples per second. Not all datasets have groundtruth available throughout the entire trajectory, as the motion capture system is limited to the starting and ending room. There are also quite a few very challenging outdoor handheld datasets, which remain an open direction for research. Note that we focus on the room datasets, since full 6 dof pose groundtruth is available over the entire trajectory.

Filter Initialization from Standstill

These datasets have very non-static starts, as they are handheld, and the standstill initialization has issues handling this. Thus careful tuning of the IMU initialization threshold is typically needed to ensure that the initialized orientation and the zero velocity assumption are valid. Please take a look at the run_ros_tumvi.sh script for some reasonable default values (they might still need to be tuned). One can instead enable dynamic initialization to avoid this problem via the init_dyn_use configuration value.
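As a sketch of how one might enable this, assuming the estimator reads its options from a YAML file containing an init_dyn_use flag (the config location below is an assumption):

# copy the TUM-VI estimator config (assumed path) and flip the dynamic-initialization flag
cp ~/workspace/src/open_vins/config/tum_vi/estimator_config.yaml /tmp/estimator_config.yaml
sed -i 's/init_dyn_use: false/init_dyn_use: true/' /tmp/estimator_config.yaml
# then point your launch or run script at /tmp/estimator_config.yaml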

Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Config
room1 | 147 | rosbag | link | config
room2 | 142 | rosbag | link | config
room3 | 136 | rosbag | link | config
room4 | 69 | rosbag | link | config
room5 | 132 | rosbag | link | config
room6 | 67 | rosbag | link | config

RPNG AR Table Dataset

The Indoor AR Table Visual-Inertial Datasets [7] were collected to demonstrate the impact of estimating long-term planar surfaces within a visual-inertial estimator. An Intel RealSense D455 with 30 Hz RGB-D (depth was not used) and a 400 Hz BMI055 IMU, along with 100 Hz OptiTrack poses, were recorded in 1-2 minute segments. The groundtruth was recovered using the vicon2gt utility [16].

Dataset Name | Length (m) | Dataset Link | Size (GB) | Groundtruth Traj. | Config
table1 | 56 | rosbag | 4.77 | link | config
table2 | 44 | rosbag | 5.54 | link | config
table3 | 88 | rosbag | 13.19 | link | config
table4 | 91 | rosbag | 11.49 | link | config
table5 | 75 | rosbag | 11.66 | link | config
table6 | 50 | rosbag | 5.26 | link | config
table7 | 63 | rosbag | 9.02 | link | config
table8 | 125 | rosbag | 16.01 | link | config

RPNG OpenVINS Dataset

In addition to the community maintained datasets, we have also released a few datasets of our own. Please cite the OpenVINS paper if you use any of these datasets in your work. Here are the specifics of the sensors that each dataset uses:

ArUco Datasets:
    Core visual-inertial sensor is the VI-Sensor
    Stereo global shutter images at 20 Hz
    ADIS16448 IMU at 200 Hz
    Kalibr calibration file can be found here
Ironsides Datasets:
    Core visual-inertial sensor is the ironsides
    Has two Reach RTK units, one subscribed to a base station for corrections
    Stereo global shutter fisheye images at 20 Hz
    InvenSense IMU at 200 Hz
    GPS fixes at 5 Hz (/reach01/tcpfix has corrections from NYSNet)
    Kalibr calibration file can be found here

Monocular Camera

Currently there are issues with running with a monocular camera on the ironsides Neighborhood car datasets. This is likely due to the near-constant velocity and "smoothness" of the trajectory. Please refer to [25] and [42] for details.

Most of these datasets do not have perfect calibration parameters, and some are not time synchronised. Thus, please ensure that you have enabled online calibration of these parameters. Additionally, there is no groundtruth for these datasets, but some do include GPS messages if you wish to compare against something.
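As a rough sketch, one can check that these flags are turned on in the estimator config before running; the parameter names and path below are assumptions based on the default configs:

# list the online calibration flags in the estimator config (path is a placeholder)
grep "calib_cam" /path/to/estimator_config.yaml
# expect entries such as calib_cam_extrinsics, calib_cam_intrinsics,
# and calib_cam_timeoffset to be set to true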

Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Config
ArUco Room 01 | 27 | rosbag | none | config aruco
ArUco Room 02 | 93 | rosbag | none | config aruco
ArUco Hallway 01 | 190 | rosbag | none | config aruco
ArUco Hallway 02 | 105 | rosbag | none | config aruco
Neighborhood 01 | 2300 | rosbag | none | config ironsides
Neighborhood 02 | 7400 | rosbag | none | config ironsides

UZH-FPV Drone Racing Dataset

The UZH-FPV Drone Racing Dataset [39] is a dataset focused on high-speed aggressive 6dof motion with very high levels of optical flow as compared to other datasets. An FPV drone racing quadrotor carries a Qualcomm Snapdragon Flight board, which provides inertial measurements and has two 640x480 grayscale global shutter fisheye cameras attached. The groundtruth is collected with a Leica Nova MS60 laser tracker. There are four sensor configurations and calibrations provided: indoor forward facing stereo, indoor 45 degree stereo, outdoor forward facing, and outdoor 45 degree. A top speed of 12.8 m/s (28 mph) is reached in the indoor scenarios, and 23.4 m/s (54 mph) in the outdoor datasets. Each of these datasets has the sensor rig picked up at the beginning and then set down; we normally skip this part, but it should be handled by the filter if SLAM features are enabled. Please take a look at the run_ros_uzhfpv.sh script for some reasonable default values (they might still need to be tuned).

Dataset Groundtruthing

Only the Absolute Trajectory Error (ATE) should be used as a metric for this dataset. This is due to inaccurate groundtruth orientation estimates, which are explained in their report on the issue. The basic summary is that it is hard to get accurate orientation information from the point-based Leica measurements used for groundtruthing.
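For example, the ATE can be computed with the ov_eval error utilities; a rough sketch, assuming the error_singlerun tool and an <align_mode> <gt_file> <est_file> argument order (check the ov_eval documentation for the actual usage):

# hypothetical invocation; alignment mode, argument order, and file names are assumptions
rosrun ov_eval error_singlerun posyaw indoor_forward_5_gt.txt indoor_forward_5_est.txt
# report only the absolute trajectory error it prints, not the orientation errors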

Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Config
Indoor 5 | 157 | rosbag | link | config
Indoor 6 | 204 | rosbag | link | config
Indoor 7 | 314 | rosbag | link | config
Indoor 9 | 136 | rosbag | link | config
Indoor 10 | 129 | rosbag | link | config
Indoor 45deg 2 | 207 | rosbag | link | config
Indoor 45deg 4 | 164 | rosbag | link | config
Indoor 45deg 12 | 112 | rosbag | link | config
Indoor 45deg 13 | 159 | rosbag | link | config
Indoor 45deg 14 | 211 | rosbag | link | config

KAIST Urban Dataset

The KAIST urban dataset [23] is a dataset focused on autonomous driving and localization in challenging complex urban environments. The dataset was collected in Korea with a vehicle equipped with a stereo camera pair, 2d SICK LiDARs, a 3d Velodyne LiDAR, an Xsens IMU, a fiber optic gyro (FoG), wheel encoders, and RTK GPS. The cameras run at 10 Hz, while the Xsens IMU has a 100 Hz sensing rate. A groundtruth "baseline" trajectory is also provided, which is the output of fusing the FoG, RTK GPS, and wheel encoders. We provide processing scripts to generate the calibration and groundtruth from the dataset's formats.

Dynamic Environments

A challenging open research question is being able to handle dynamic objects seen from the cameras. By default we rely on our 8-point RANSAC during tracking to handle these dynamic objects. In most of the KAIST datasets the majority of the scene can be taken up by other moving vehicles, thus the performance can suffer. Please be aware of this fact.

We recommend converting the KAIST file format into a ROS bag format. If you are using ROS2, then you should first convert into a ROS1 bag and then convert that following the ROS1 to ROS2 Bag Conversion Guide. Follow the instructions on the kaist2bag repository:

git clone https://github.com/irapkaist/irp_sen_msgs.git
git clone https://github.com/rpng/kaist2bag.git
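After cloning, the packages can be built in a ROS1 catkin workspace before running the conversion; a rough sketch assuming a standard workspace layout (the workspace path is an assumption, and the actual conversion invocation is documented in the kaist2bag README):

# assumed workspace layout; the two repositories should be cloned into ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
source devel/setup.bash
# then run the conversion as described in the kaist2bag README to produce a ROS1 bag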

Monocular Camera

Currently there are issues with running with a monocular camera on this dataset. This is likely due to the near-constant velocity and "smoothness" of the trajectory. Please refer to [25] and [42] for details.

You can also try to use the file_player to publish the data live. It is important to disable the "skip stop section" option to ensure that we have continuous sensor feeds. Typically we process the datasets at 1.5x rate so we get a ~20 Hz image feed and the datasets can be processed in a more efficient manner.
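If you instead play back a converted bag, a similar effect can be had with the standard rosbag playback rate option (the bag filename below is a placeholder):

# play a converted KAIST bag at 1.5x real-time while publishing a simulated clock
rosbag play --clock -r 1.5 urban39_converted.bag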

Dataset Name | Length (km) | Dataset Link | Groundtruth Traj. | Example Launch
Urban 28 | 11.47 | download | link | config
Urban 32 | 7.30 | download | link | config
Urban 38 | 11.42 | download | link | config
Urban 39 | 11.06 | download | link | config

KAIST VIO Dataset

The KAIST VIO dataset [22] is a dataset of a MAV in an indoor 3.15 x 3.60 x 2.50 meter environment which undergoes various trajectory motions. The camera is an Intel RealSense D435i at 25 Hz, while the IMU is a 100 Hz unit from the Pixhawk 4 flight controller. A groundtruth "baseline" trajectory is also provided from an OptiTrack mocap system at 50 Hz; the bag files already have the marker body frame to IMU frame transform applied. This groundtruth has been provided in ov_data for convenience.

Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Example Launch
circle | 29.99 | download | link | config
circle_fast | 64.15 | download | link | config
circle_head | 35.05 | download | link | config
infinite | 29.35 | download | link | config
infinite_fast | 54.24 | download | link | config
infinite_head | 37.45 | download | link | config
rotation | 7.82 | download | link | config
rotation_fast | 14.55 | download | link | config
square | 41.94 | download | link | config
square_fast | 44.07 | download | link | config
square_head | 50.00 | download | link | config

