Commit 7f424a2 · 1 Parent(s): 47812ee
mateosss committed

Write readme

Files changed (1): README.md +224 -4
README.md CHANGED
@@ -4,7 +4,227 @@ license: cc-by-4.0

# Monado SLAM Datasets

- As of now these are permissively licensed datasets for visual-inertial
- egocentric tracking with focus on the XR use-case. They were recorded on a Valve
- Index VR headset with ground truth provided from lighthouse base stations and
- SteamVR. More technical information about the dataset is coming soon.
The [Monado SLAM datasets
(MSD)](https://huggingface.co/datasets/collabora/monado-slam-datasets) are
egocentric visual-inertial SLAM datasets recorded for improving the
[Basalt](https://gitlab.com/VladyslavUsenko/basalt)-based inside-out tracking
component of the [Monado](https://monado.dev) project. They are released under
the permissive [CC-BY 4.0](http://creativecommons.org/licenses/by/4.0/)
license, meaning you can use them for any purpose you want, including
commercial use, and only attribution of the original project is required. The
creation of these datasets was supported by
[Collabora](https://collabora.com).

Monado is an open-source OpenXR runtime that you can use to make devices
OpenXR compatible. Thanks to community contributors it also provides drivers
for existing hardware, and it exposes various XR-related modules that these
drivers can use. Inside-out head tracking is one of those modules and, while
you can use different tracking systems, the main one is a [fork of
Basalt](https://gitlab.freedesktop.org/mateosss/basalt). Creating a good
open-source tracking solution requires a solid measurement pipeline to
understand how changes in the system affect tracking quality. For this reason,
the creation of these datasets was essential.

These datasets are very specific to the XR use case, as they contain VI-SLAM
footage recorded from devices such as VR headsets, though other devices like
phones or AR glasses might be added in the future. They were made because
existing SLAM datasets like EuRoC or TUM-VI were not specific enough for XR,
or their usage licenses were not permissive enough.

For questions or comments you can use the Hugging Face
[Community](https://huggingface.co/datasets/collabora/monado-slam-datasets/discussions),
join Monado's Discord [server](https://discord.gg/8RkJgRJ) and ask in the
`#slam` channel, or send an email to <[email protected]>.

# Valve Index datasets

These datasets were recorded using a Valve Index with the `vive` driver in
Monado, and they have groundtruth from three lighthouse base stations tracking
the headset through the proprietary OpenVR implementation provided by SteamVR.
The exact commit used in Monado at the time of recording is
[a4e7765d](https://gitlab.freedesktop.org/mateosss/monado/-/commit/a4e7765d7219b06a0c801c7bb33f56d3ea69229d).
The datasets are in the ASL dataset format, the same as the [EuRoC
datasets](https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets).
Besides the main EuRoC-format files, we provide some extra files with raw
timestamp data for exploring real-time timestamp alignment techniques.

The datasets are postprocessed to minimize the special treatment SLAM systems
need to give them: camera-IMU and groundtruth-IMU timestamps have been
aligned, IMU axis alignment and bias calibration have been applied, the
lighthouse-tracked pose has been converted to an IMU pose, and so on. Most of
the postprocessing was done with the Basalt
[calibration](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Calibration.md?ref_type=heads#camera-imu-mocap-calibration)
and
[alignment](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Realsense.md?ref_type=heads#generating-time-aligned-ground-truth)
tools, as well as the
[xrtslam-metrics](https://gitlab.freedesktop.org/mateosss/xrtslam-metrics)
scripts for Monado tracking. The postprocessing is documented in [this
video][postprocessing-video], which walks through making the [MIPB08] dataset
ready for use starting from its raw version.

## Data

### Camera samples

In the `vive` driver from Monado we don't have direct access to the camera
device timestamps, only to V4L2 timestamps. These are not exactly hardware
timestamps and have some offset with respect to the device clock in which the
IMU samples are timestamped.

The camera frames can be found in the `camX/data` directory as PNG files with
names corresponding to their V4L2 timestamps. The `camX/data.csv` file
contains the aligned timestamp of each frame. The `camX/data.extra.csv` file
also contains the original V4L2 timestamp and the "host timestamp", which is
the time at which the host computer had the frame ready to use after USB
transmission. By separating arrival time from exposure time, algorithms can be
made more robust for real-time operation.

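As a concrete example, here is a minimal sketch of pairing each frame's
aligned timestamp with its PNG file. It assumes the EuRoC/ASL-style `mav0/`
directory layout and a `#`-prefixed header row; the dataset path is a
hypothetical placeholder, and the exact header names in `data.extra.csv`
should be checked in the files themselves.

```python
import csv
from pathlib import Path

dataset = Path("MIPB08")          # hypothetical extracted dataset root
cam0 = dataset / "mav0" / "cam0"  # adjust if the layout differs

frames = []
with open(cam0 / "data.csv", newline="") as f:
    for row in csv.reader(f):
        if not row or row[0].startswith("#"):  # skip the header/comment line
            continue
        ts_ns, filename = int(row[0]), row[1].strip()
        frames.append((ts_ns, cam0 / "data" / filename))

print(f"{len(frames)} frames, first at {frames[0][0]} ns")
```
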
The cameras of the Valve Index are global shutter, with a resolution of
960x960, streaming at 54 fps with autoexposure enabled. While the cameras of
the Index are RGB, you will find only grayscale images in these datasets: the
original images come in YUYV422 format, but only the luma component is stored.

For each dataset, the camera timestamps are aligned with the IMU timestamps by
running visual-only odometry with Basalt on a 30-second subset of the dataset.
The resulting trajectory is then aligned with the
[`basalt_time_alignment`](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Realsense.md?ref_type=heads#generating-time-aligned-ground-truth)
tool, which aligns the rotational velocities of the trajectory with the
gyroscope samples and returns the resulting offset in nanoseconds. That
correction is then applied to the dataset. Refer to the postprocessing
walkthrough [video][postprocessing-video] for more details.

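Applying such a correction amounts to shifting every camera timestamp by the
reported offset. A minimal sketch, where `offset_ns` is a made-up placeholder
rather than a value from any real run of the tool:

```python
offset_ns = -1_234_567  # hypothetical cam-to-IMU offset in nanoseconds

# `frames` is the (timestamp, path) list from the earlier camera snippet.
aligned = [(ts_ns + offset_ns, path) for ts_ns, path in frames]
```
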
### IMU samples

The IMU timestamps are device timestamps, and samples arrive at about 1000 Hz.
We provide an `imu0/data.raw.csv` file that contains the raw measurements
without any axis scale-misalignment or bias correction. `imu0/data.csv` has
the scale-misalignment and bias corrections already applied, so that the SLAM
system can ignore those corrections. `imu0/data.extra.csv` contains the
arrival time of each IMU sample at the host computer, for algorithms that want
to adapt themselves to real-time operation.

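The corrected samples can be loaded directly. A minimal sketch, assuming the
standard EuRoC/ASL column order (timestamp [ns], angular velocity [rad/s],
linear acceleration [m/s²]) and the same hypothetical dataset path as above:

```python
import numpy as np

imu = np.loadtxt("MIPB08/mav0/imu0/data.csv", delimiter=",", comments="#")
t_ns, gyro, accel = imu[:, 0], imu[:, 1:4], imu[:, 4:7]

# The mean sample rate should land near the ~1000 Hz mentioned above.
rate_hz = 1e9 / np.mean(np.diff(t_ns))
print(f"{len(t_ns)} samples at ~{rate_hz:.0f} Hz")
```
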
107
+ ### Groundtruth information
108
+
109
+ The groundtruth setup consists of three lighthouses 2.0 base stations and a
110
+ SteamVR session providing tracking data through the OpenVR API to Monado. While
111
+ not as precise as a other MoCap tracking systems like OptiTrack or Vicon it
112
+ should still provide pretty good accuracy and precision close to the 1mm range.
113
+ There are different attempts at studying the accuracy of SteamVR tracking that
114
+ you can checkout like
115
+ [this](https://dl.acm.org/doi/pdf/10.1145/3463914.3463921),
116
+ [this](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7956487/pdf/sensors-21-01622.pdf),
117
+ or [this](http://doc-ok.org/?p=1478). When a tracking system gets closer to
118
+ milimiter accuracy these datasets will no longer be as useful for improving it.
119
+
The raw groundtruth data is stored in `gt/data.raw.csv`. OpenVR does not
provide timestamps, so the recorded timestamps are from when the host asks
OpenVR for the latest pose with a call to
[`GetDeviceToAbsoluteTrackingPose`](https://github.com/ValveSoftware/openvr/wiki/IVRSystem::GetDeviceToAbsoluteTrackingPose).
The poses contained in this file are not of the IMU but of the headset origin
as interpreted by SteamVR, which is usually between the eyes, facing towards
the displays. The file `gt/data.csv` corrects each entry of the previous file,
with timestamps aligned to the IMU clock and poses of the IMU instead of the
headset origin.

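A minimal sketch of loading the corrected groundtruth, assuming the usual
EuRoC/ASL groundtruth column order (timestamp [ns], position [m], orientation
quaternion) and the same hypothetical dataset path as the earlier snippets:

```python
import numpy as np

gt = np.loadtxt("MIPB08/mav0/gt/data.csv", delimiter=",", comments="#")
t_ns, positions, quats = gt[:, 0], gt[:, 1:4], gt[:, 4:8]

# Total path length of the IMU trajectory, as a quick sanity check.
path_m = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
print(f"{len(t_ns)} poses, {path_m:.1f} m travelled")
```
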
### Calibration

There are multiple calibration datasets in the
[`MIC_calibration`](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIC_calibration)
directory, some camera-focused and some IMU-focused. See the [README.md
there](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIC_calibration/README.md)
for more information on what each sequence is.

In the
[`MI_valve_index/extras`](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/extras)
directory you can find the following files:

- [`calibration.json`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/extras/calibration.json):
  Calibration file produced with the
  [`basalt_calibrate_imu`](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Calibration.md?ref_type=heads#camera-imu-mocap-calibration)
  tool from the
  [`MIC01_camcalib1`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIC_calibration/MIC01_camcalib1.zip)
  and
  [`MIC04_imucalib1`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIC_calibration/MIC04_imucalib1.zip)
  datasets, with the camera-IMU time offset and IMU bias/misalignment info
  removed so that it works by default with all the datasets, which are fully
  postprocessed and don't require those fields.
- [`calibration.extra.json`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/extras/calibration.extra.json):
  Same as `calibration.json` but with the cam-IMU time offset and IMU bias and
  misalignment information filled in.
- [`factory.json`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/extras/factory.json):
  JSON exposed by the headset used for recording, with factory information
  that includes calibration and other data. It's not used for anything but
  might be of interest.
- [`other_calibrations/`](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/extras/other_calibrations):
  Results from calibrating with the other calibration datasets, for comparison
  and for checking that most of them are similar. The `MICXX_camcalibY`
  results have camera-only calibrations produced with the
  [`basalt_calibrate`](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Calibration.md?ref_type=heads#camera-calibration)
  tool, while the corresponding `MICXX_imucalibY` results use those as a
  starting point and have the `basalt_calibrate_imu` calibration results.

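For a quick look at what the calibration file contains, here is a minimal
sketch of reading the per-camera intrinsics out of `calibration.json`. It
assumes Basalt's serialization layout with a top-level `value0` object and an
`intrinsics` array; those key names are assumptions, so inspect the actual
file and adjust if it differs.

```python
import json

with open("calibration.json") as f:
    calib = json.load(f)["value0"]  # assumed Basalt serialization wrapper

for i, cam in enumerate(calib["intrinsics"]):
    print(f"cam{i}: {cam['camera_type']}", cam["intrinsics"])
```
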
#### Camera model

By default, the `calibration.json` file provides parameters `k1`, `k2`, `k3`,
and `k4` for the [Kannala-Brandt camera
model](https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1KannalaBrandtCamera4.html#a423a4f1255e9971fe298dc6372345681)
with fish-eye distortion (also known as [OpenCV's
fish-eye](https://docs.opencv.org/3.4/db/d58/group__calib3d__fisheye.html#details)).

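As a reference, here is a minimal sketch of that projection: a 3D point in the
camera frame passes through the distortion polynomial
d(θ) = θ + k1·θ³ + k2·θ⁵ + k3·θ⁷ + k4·θ⁹ before landing on the image plane.
The intrinsic values in the usage line are placeholders, not the real
calibration; take the actual numbers from `calibration.json`.

```python
import numpy as np

def project_kb4(xyz, fx, fy, cx, cy, k1, k2, k3, k4):
    x, y, z = xyz
    r = np.hypot(x, y)
    theta = np.arctan2(r, z)
    # d(theta) = theta + k1*theta^3 + k2*theta^5 + k3*theta^7 + k4*theta^9
    d = theta * (1 + k1 * theta**2 + k2 * theta**4
                   + k3 * theta**6 + k4 * theta**8)
    if r == 0:  # a point on the optical axis projects to the principal point
        return cx, cy
    return fx * d * x / r + cx, fy * d * y / r + cy

u, v = project_kb4((0.1, -0.2, 1.0), fx=280.0, fy=280.0,
                   cx=480.0, cy=480.0, k1=0.0, k2=0.0, k3=0.0, k4=0.0)
```
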
Calibrations with other camera models might be added later on; otherwise, you
can use the calibration sequences to compute custom calibrations.

#### IMU model

For the default `calibration.json`, where all IMU parameters are zero, you can
ignore any model and just use the measurements present in `imu0/data.csv`
directly. If instead you want to use the raw measurements from
`imu0/data.raw.csv`, you will need to apply the Basalt
[accelerometer](https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1CalibAccelBias.html#details)
and
[gyroscope](https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1CalibGyroBias.html#details)
models, which use a misalignment-scale correction matrix together with a
constant initial bias. The random walk and white noise parameters were not
computed, and reasonable default values are used instead.

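In other words, going from raw to corrected samples is an affine correction
per measurement. A minimal sketch, assuming the common convention of a
misalignment-scale matrix `M` and bias `b` applied as `M @ raw - b` (check the
linked class docs for Basalt's exact parameterization; the values below are
placeholders, not taken from `calibration.extra.json`):

```python
import numpy as np

M_accel = np.eye(3)    # misalignment-scale correction matrix (placeholder)
b_accel = np.zeros(3)  # constant initial accelerometer bias (placeholder)

def correct_accel(a_raw):
    # Corrected measurement under the assumed model: a = M @ a_raw - b
    return M_accel @ np.asarray(a_raw) - b_accel
```
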
### Post-processing walkthrough

If you are interested in the step-by-step postprocessing procedure, below is a
video detailing it for the [MIPB08] dataset.

[![Post-processing walkthrough video](https://img.youtube.com/vi/0PX_6PNwrvQ/0.jpg)](https://www.youtube.com/watch?v=0PX_6PNwrvQ)

## Sequences

- [MIC_calibration](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIC_calibration):
  Calibration sequences recording
  [this](https://drive.google.com/file/d/1DqKWgePodCpAKJCd_Bz-hfiEQOSnn_k0)
  calibration target from Kalibr, with the squares of the target having sides
  of 3 cm. Some sequences focus on camera calibration, covering the image
  planes of both stereo cameras, while others focus on IMU calibration,
  properly exciting all six components of the IMU.
- [MIP_playing](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing):
  Datasets in which the user is playing a particular VR game on SteamVR while
  Monado records the dataset.
  - [MIPB_beat_saber](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber):
    Different songs played at different speeds. The fitbeat song requires a
    lot of head movement, while [MIPB08] is a long 40-minute dataset with many
    levels played.
  - [MIPP_pistol_whip](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPP_pistol_whip):
    A shooting and music game; each dataset is a different level/song.
  - [MIPT_thrill_of_the_fight](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPT_thrill_of_the_fight):
    A boxing game.
- [MIO_others](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIO_others):
  Other datasets that might be useful. They include play-pretend scenarios in
  which the user is supposed to be playing some particular game, some
  inspection and scanning/mapping of the room, some very short and lightweight
  datasets for quick testing, and some datasets with a lot of movement around
  the environment.

## License

This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a>

[postprocessing-video]: https://youtu.be/0PX_6PNwrvQ
[MIPB08]: https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber