Commit ce74dfc (parent 308f3d7) by mateosss: Fix README typos

Files changed (1): README.md (+66 −64)

README.md CHANGED

@@ -8,7 +8,7 @@ license: cc-by-4.0

The [Monado SLAM datasets
(MSD)](https://huggingface.co/datasets/collabora/monado-slam-datasets) are
egocentric visual-inertial SLAM datasets recorded to improve the
[Basalt](https://gitlab.com/VladyslavUsenko/basalt)-based inside-out tracking
component of the [Monado](https://monado.dev) project. These have a permissive
license, [CC-BY 4.0](http://creativecommons.org/licenses/by/4.0/), meaning you

@@ -27,13 +27,13 @@ open-source tracking solution requires a solid measurement pipeline to
understand how changes in the system affect tracking quality. For this reason,
the creation of these datasets was essential.

These datasets are very specific to the XR use case as they contain VI-SLAM
footage recorded from devices such as VR headsets, but other devices like
phones or AR glasses might be added in the future. These were made because
current SLAM datasets like EuRoC or TUM-VI were not specific enough for XR, or
they didn't have permissive enough usage licenses.

For questions or comments, you can use the Hugging Face
[Community](https://huggingface.co/datasets/collabora/monado-slam-datasets/discussions),
join Monado's Discord [server](https://discord.gg/8RkJgRJ) and ask in the
`#slam` channel, or send an email to <[email protected]>.

@@ -76,7 +76,7 @@ join Monado's discord [server](https://discord.gg/8RkJgRJ) and ask in the
  - [MIO15_moving_person_props](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIO_others/MIO15_moving_person_props.zip): <details style="display: inline;cursor: pointer;user-select: none"><summary>Preview 5x</summary><video width="320" height="320" controls preload="none"><source src="https://huggingface.co/datasets/collabora/monado-slam-datasets/resolve/main/M_monado_datasets/MI_valve_index/extras/previews/MIO15_moving_person_props.webm" type="video/webm"/>Video tag not supported.</video></details>
  - [MIO16_moving_screens_person_props](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIO_others/MIO16_moving_screens_person_props.zip): <details style="display: inline;cursor: pointer;user-select: none"><summary>Preview 5x</summary><video width="320" height="320" controls preload="none"><source src="https://huggingface.co/datasets/collabora/monado-slam-datasets/resolve/main/M_monado_datasets/MI_valve_index/extras/previews/MIO16_moving_screens_person_props.webm" type="video/webm"/>Video tag not supported.</video></details>
- [MIP_playing](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing)
  - [MIPB_beat_saber](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber)
    - [MIPB01_beatsaber_100bills_360_normal](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber/MIPB01_beatsaber_100bills_360_normal.zip): <details style="display: inline;cursor: pointer;user-select: none"><summary>Preview 5x</summary><video width="320" height="320" controls preload="none"><source src="https://huggingface.co/datasets/collabora/monado-slam-datasets/resolve/main/M_monado_datasets/MI_valve_index/extras/previews/MIPB01_beatsaber_100bills_360_normal.webm" type="video/webm"/>Video tag not supported.</video></details>
    - [MIPB02_beatsaber_crabrave_360_hard](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber/MIPB02_beatsaber_crabrave_360_hard.zip): <details style="display: inline;cursor: pointer;user-select: none"><summary>Preview 5x</summary><video width="320" height="320" controls preload="none"><source src="https://huggingface.co/datasets/collabora/monado-slam-datasets/resolve/main/M_monado_datasets/MI_valve_index/extras/previews/MIPB02_beatsaber_crabrave_360_hard.webm" type="video/webm"/>Video tag not supported.</video></details>
    - [MIPB03_beatsaber_countryrounds_360_expert](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber/MIPB03_beatsaber_countryrounds_360_expert.zip): <details style="display: inline;cursor: pointer;user-select: none"><summary>Preview 5x</summary><video width="320" height="320" controls preload="none"><source src="https://huggingface.co/datasets/collabora/monado-slam-datasets/resolve/main/M_monado_datasets/MI_valve_index/extras/previews/MIPB03_beatsaber_countryrounds_360_expert.webm" type="video/webm"/>Video tag not supported.</video></details>

@@ -100,48 +100,48 @@ join Monado's discord [server](https://discord.gg/8RkJgRJ) and ask in the
## Valve Index datasets

These datasets were recorded using a Valve Index with the `vive` driver in
Monado, and they have ground truth from 3 lighthouses tracking the headset
through the proprietary OpenVR implementation provided by SteamVR. The exact
commit used in Monado at the time of recording is
[a4e7765d](https://gitlab.freedesktop.org/mateosss/monado/-/commit/a4e7765d7219b06a0c801c7bb33f56d3ea69229d).
The datasets are in the ASL dataset format, the same as the [EuRoC
datasets](https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets).
Besides the main EuRoC format files, we provide some extra files with raw
timestamp data for exploring real-time timestamp alignment techniques.

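For example, these EuRoC-style files can be loaded with a few lines of Python.
Below is a minimal sketch assuming the layout described above, with
`cam0/data.csv` listing timestamp-filename pairs and `imu0/data.csv` holding
gyroscope and accelerometer samples; the sequence path and column names are
placeholders, so check the actual file headers first.

```python
# Minimal sketch: load one MSD sequence in the ASL/EuRoC layout.
# Path and column names are assumptions; verify them against the real files.
from pathlib import Path

import pandas as pd

seq = Path("MIPB01_beatsaber_100bills_360_normal")  # hypothetical extracted dir

cams = pd.read_csv(seq / "cam0" / "data.csv", comment="#",
                   names=["t_ns", "filename"])
imu = pd.read_csv(seq / "imu0" / "data.csv", comment="#",
                  names=["t_ns", "gx", "gy", "gz", "ax", "ay", "az"])

print(f"{len(cams)} frames, {len(imu)} IMU samples")
print("first frame file:", seq / "cam0" / "data" / cams.filename.iloc[0])
```
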
The dataset is post-processed to reduce, as much as possible, the special
treatment that SLAM systems would need: camera-IMU and ground-truth-IMU
timestamp alignment, IMU alignment and bias calibration have been applied, the
lighthouse-tracked pose has been converted to IMU pose, and so on. Most of the
post-processing was done with the Basalt
[calibration](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Calibration.md?ref_type=heads#camera-imu-mocap-calibration)
and
[alignment](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Realsense.md?ref_type=heads#generating-time-aligned-ground-truth)
tools, as well as the
[xrtslam-metrics](https://gitlab.freedesktop.org/mateosss/xrtslam-metrics)
scripts for Monado tracking. The post-processing procedure is documented in
[this video][post-processing-video], which goes through making the [MIPB08]
dataset ready for use starting from its raw version.

### Data

#### Camera samples

In the `vive` driver from Monado, we don't have direct access to the camera
device timestamps but only to V4L2 timestamps. These are not exactly hardware
timestamps and have some offset with respect to the device clock in which the
IMU samples are timestamped.

The camera frames can be found in the `camX/data` directory as PNG files with
names corresponding to their V4L2 timestamps. The `camX/data.csv` file contains
the aligned timestamp of each frame. The `camX/data.extra.csv` file also
contains the original V4L2 timestamp and the "host timestamp", which is the
time at which the host computer had the frame ready to use after USB
transmission. By separating arrival time and exposure time, algorithms can be
made more robust for real-time operation.

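As an illustration of what these extra columns enable, the per-frame
transmission latency can be estimated by comparing the host timestamp with the
aligned exposure timestamp. A small sketch with hypothetical column names
(check the CSV header for the real ones):

```python
# Sketch: estimate per-frame latency from camX/data.extra.csv.
# Column names and order are assumptions; inspect the CSV header first.
import pandas as pd

extra = pd.read_csv("cam0/data.extra.csv", comment="#",
                    names=["t_aligned_ns", "t_v4l2_ns", "t_host_ns"])
latency_ms = (extra.t_host_ns - extra.t_aligned_ns) / 1e6
print(f"median exposure-to-host latency: {latency_ms.median():.2f} ms")
```
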
The cameras of the Valve Index are global shutter, with a resolution of
960×960, streaming at 54fps and with auto-exposure enabled. While the cameras
of the Index are RGB, you will find only grayscale images in these datasets.
The original images are provided in YUYV422 format but only the luma component
is stored.
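
Since YUYV422 interleaves luma and chroma as Y0 U0 Y1 V0, the stored grayscale
image is simply the even bytes of the raw stream. A small numpy sketch (the
raw-buffer file here is hypothetical; the published datasets already ship
PNGs):

```python
# Sketch: recover the grayscale image from a raw YUYV422 buffer.
# Byte order is Y0 U0 Y1 V0, so the luma of every pixel sits at even offsets.
import numpy as np

width, height = 960, 960
yuyv = np.fromfile("frame.yuyv", dtype=np.uint8)  # hypothetical raw dump
gray = yuyv.reshape(height, width, 2)[:, :, 0]    # keep only the Y channel
```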

@@ -152,33 +152,33 @@ the dataset. The resulting trajectory is then aligned with the
[`basalt_time_alignment`](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Realsense.md?ref_type=heads#generating-time-aligned-ground-truth)
tool, which aligns the rotational velocities of the trajectory with the
gyroscope samples and returns the resulting offset in nanoseconds. That
correction is then applied to the dataset. Refer to the post-processing
walkthrough [video][post-processing-video] for more details.

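To give an intuition for this alignment step: derive angular speeds from the
ground-truth orientations, then find the time shift that maximizes their
correlation with the gyroscope magnitude. The sketch below is an illustrative
reimplementation of that idea, not the actual `basalt_time_alignment` code:

```python
# Sketch: estimate the gyro/ground-truth time offset by cross-correlation.
# gt_w and imu_w are angular-speed magnitudes resampled onto a common uniform
# time grid with step dt_ns.
import numpy as np

def time_offset_ns(gt_w: np.ndarray, imu_w: np.ndarray, dt_ns: float) -> float:
    gt_w = gt_w - gt_w.mean()
    imu_w = imu_w - imu_w.mean()
    corr = np.correlate(imu_w, gt_w, mode="full")
    shift = np.argmax(corr) - (len(gt_w) - 1)  # best lag, in samples
    return shift * dt_ns  # sign convention depends on how you define the offset
```
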
#### IMU samples

The IMU timestamps are device timestamps; they come at about 1000Hz. We provide
an `imu0/data.raw.csv` file that contains the raw measurements without any axis
scale-misalignment or bias correction. `imu0/data.csv` has the
scale-misalignment and bias corrections applied so that the SLAM system can
ignore those corrections. `imu0/data.extra.csv` contains the arrival time of
each IMU sample at the host computer, for algorithms that want to adapt
themselves to work in real time.

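For example, you can sanity-check the ~1000Hz rate directly from the device
timestamps (column names assumed, as before):

```python
# Sketch: check the IMU rate from the device timestamps in imu0/data.csv.
import numpy as np
import pandas as pd

imu = pd.read_csv("imu0/data.csv", comment="#",
                  names=["t_ns", "gx", "gy", "gz", "ax", "ay", "az"])
dt_s = np.diff(imu.t_ns.to_numpy()) / 1e9
print(f"mean rate: {1.0 / dt_s.mean():.1f} Hz, "
      f"max gap: {dt_s.max() * 1e3:.2f} ms")
```
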
#### Ground truth information

The ground truth setup consists of three Lighthouse 2.0 base stations and a
SteamVR session providing tracking data through the OpenVR API to Monado. While
not as precise as other MoCap tracking systems like OptiTrack or Vicon, it
should still provide pretty good accuracy and precision, close to the 1mm
range. There are different attempts at studying the accuracy of SteamVR
tracking that you can check out, like
[this](https://dl.acm.org/doi/pdf/10.1145/3463914.3463921),
[this](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7956487/pdf/sensors-21-01622.pdf),
or [this](http://doc-ok.org/?p=1478). When a tracking system gets closer to
millimeter accuracy, these datasets will no longer be as useful for improving
it.

The raw ground truth data is stored in `gt/data.raw.csv`. OpenVR does not
provide timestamps and, as such, the timestamps recorded are from when the host
asks OpenVR for the latest pose with a call to
[`GetDeviceToAbsoluteTrackingPose`](https://github.com/ValveSoftware/openvr/wiki/IVRSystem::GetDeviceToAbsoluteTrackingPose).
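
Once you run a tracker on a sequence, the usual way to score it against this
ground truth is the absolute trajectory error (ATE) after a rigid alignment,
which is the kind of evaluation the
[xrtslam-metrics](https://gitlab.freedesktop.org/mateosss/xrtslam-metrics)
scripts perform. Below is a generic illustrative sketch (not the project's
actual code), assuming both trajectories were already associated by timestamp
into matching Nx3 position arrays:

```python
# Sketch: RMSE ATE after rigid (rotation + translation) alignment.
# est and gt are Nx3 arrays of positions already matched by timestamp.
import numpy as np

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    U, _, Vt = np.linalg.svd((gt - mu_g).T @ (est - mu_e))
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # no reflections
    R = U @ S @ Vt  # rotation mapping est into the gt frame
    aligned = (est - mu_e) @ R.T + mu_g
    return float(np.sqrt(((aligned - gt) ** 2).sum(axis=1).mean()))
```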

@@ -193,11 +193,13 @@ instead of this headset origin.
There are multiple calibration datasets in the
[`MIC_calibration`](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIC_calibration)
directory. There are camera-focused and IMU-focused calibration datasets. See
the
[README.md](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIC_calibration/README.md)
file there for more information on what each sequence is.

In the
[`MI_valve_index/extras`](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/extras)
directory, you can find the following files (a loading sketch for
`calibration.json` follows the list):

- [`calibration.json`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/extras/calibration.json):
  Calibration file produced with the

@@ -208,18 +210,18 @@ In the [`MI_valve_index/extras`](https://huggingface.co/datasets/collabora/monad
  [`MIC04_imucalib1`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/MIC_calibration/MIC04_imucalib1.zip)
  datasets, with the camera-IMU time offset and IMU bias/misalignment info
  removed so that it works by default with all the datasets, which are fully
  post-processed and don't require those fields.
- [`calibration.extra.json`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/extras/calibration.extra.json):
  Same as `calibration.json` but with the cam-IMU time offset and IMU bias and
  misalignment information filled in.
- [`factory.json`](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MI_valve_index/extras/factory.json):
  JSON file exposed by the headset's firmware with information about the
  device. It includes camera and display calibration as well as other data
  that might be of interest. It is not used for anything but is included for
  completeness.
- [`other_calibrations/`](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/extras/other_calibrations):
  Calibration results obtained from the other calibration datasets, shown for
  comparison and to check that most of them are similar. The `MICXX_camcalibY`
  files have camera-only calibrations produced with the
  [`basalt_calibrate`](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Calibration.md?ref_type=heads#camera-calibration)
  tool, while the corresponding `MICXX_imucalibY` datasets use these as a
  starting point and have the `basalt_calibrate_imu` calibration results.
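
Here is the loading sketch referenced above. The field names follow Basalt's
calibration serialization as we understand it (a top-level `value0` object
with `intrinsics`, `resolution`, and `T_imu_cam` arrays); treat them as
assumptions and inspect the actual JSON first:

```python
# Sketch: peek into a Basalt-style calibration.json.
# Field names (value0, intrinsics, resolution) are assumptions based on
# Basalt's serialization; verify them against the real file.
import json

with open("extras/calibration.json") as f:
    calib = json.load(f)["value0"]

for i, (intr, res) in enumerate(zip(calib["intrinsics"], calib["resolution"])):
    print(f"cam{i}: model={intr['camera_type']}, resolution={res}")
    print(f"  parameters: {intr['intrinsics']}")
```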

@@ -232,26 +234,26 @@ model](https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1KannalaBr
with fish-eye distortion (also known as [OpenCV's
fish-eye](https://docs.opencv.org/3.4/db/d58/group__calib3d__fisheye.html#details)).

Calibrations with other camera models might be added later on; otherwise, you
can use the calibration sequences for custom calibrations.

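For reference, the Kannala-Brandt ("kb4") projection can be written down in a
few lines: the image radius is a polynomial in the incidence angle θ, exactly
as in OpenCV's fish-eye model. A sketch, with the parameter order (fx, fy, cx,
cy, k1-k4) assumed to match the calibration file:

```python
# Sketch: Kannala-Brandt "kb4" projection of a 3D point in the camera frame.
# Same model as OpenCV fish-eye: theta_d = t + k1*t^3 + k2*t^5 + k3*t^7 + k4*t^9.
import numpy as np

def project_kb4(p, fx, fy, cx, cy, k1, k2, k3, k4):
    x, y, z = p  # point in the camera frame, z > 0
    r = np.hypot(x, y)
    theta = np.arctan2(r, z)
    theta_d = theta * (1 + k1 * theta**2 + k2 * theta**4
                       + k3 * theta**6 + k4 * theta**8)
    scale = theta_d / r if r > 1e-9 else 1.0 / z  # small-angle limit on-axis
    return np.array([fx * scale * x + cx, fy * scale * y + cy])
```
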
##### IMU model

For the default `calibration.json`, where all parameters are zero, you can
ignore any model and just use the measurements present in `imu0/data.csv`
directly. If instead you want to use the raw measurements from
`imu0/data.raw.csv`, you will need to apply the Basalt
[accelerometer](https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1CalibAccelBias.html#details)
and
[gyroscope](https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1CalibGyroBias.html#details)
models, which use a misalignment-scale correction matrix together with a
constant initial bias. The random walk and white noise parameters were not
computed, and reasonable default values are used instead.

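In spirit, applying those models amounts to multiplying the raw sample by a
3×3 misalignment-scale matrix after subtracting a constant bias, as sketched
below. The exact parameterization and sign conventions are defined in the
linked Basalt classes, so treat this as illustrative only:

```python
# Sketch: generic misalignment-scale + bias correction for one IMU sample.
# M is a 3x3 misalignment-scale matrix, b a constant bias; the exact
# convention lives in the linked Basalt CalibAccelBias/CalibGyroBias classes.
import numpy as np

def correct(raw: np.ndarray, M: np.ndarray, b: np.ndarray) -> np.ndarray:
    return M @ (raw - b)

# The identity calibration (the default calibration.json) is a no-op:
sample = np.array([0.1, 9.8, 0.2])
assert np.allclose(correct(sample, np.eye(3), np.zeros(3)), sample)
```
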
#### Post-processing walkthrough

If you are interested in understanding the step-by-step procedure used for
post-processing the dataset, below is a video detailing the procedure for the
[MIPB08] dataset.

[![Post-processing walkthrough video](https://img.youtube.com/vi/0PX_6PNwrvQ/0.jpg)](https://www.youtube.com/watch?v=0PX_6PNwrvQ)

@@ -259,15 +261,15 @@ postprocessing of the dataset, below is a video detailing the procedure for the
### Sequences

- [MIC_calibration](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIC_calibration):
  Calibration sequences that record
  [this](https://drive.google.com/file/d/1DqKWgePodCpAKJCd_Bz-hfiEQOSnn_k0)
  calibration target from Kalibr, with the squares of the target having sides
  of 3 cm. Some sequences focus on camera calibration, covering the image
  planes of both stereo cameras, while others focus on IMU calibration,
  properly exciting all six components of the IMU.
- [MIP_playing](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing):
  Datasets in which the user is playing a particular VR game on SteamVR while
  Monado records the datasets.
  - [MIPB_beat_saber](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber):
    This contains different songs played at different speeds. The fitbeat song
    is one that requires a lot of head movement, while [MIPB08] is a long 40min

@@ -276,17 +278,17 @@ postprocessing of the dataset, below is a video detailing the procedure for the
    This is a shooting and music game; each dataset is a different level/song.
  - [MIPT_thrill_of_the_fight](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPT_thrill_of_the_fight):
    This is a boxing game.
- [MIO_others](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIO_others):
  These are other datasets that might be useful: they include play-pretend
  scenarios in which the user is supposed to be playing some particular game,
  some inspection and scanning/mapping of the room, some very short and
  lightweight datasets for quick testing, and some datasets with a lot of
  movement around the environment.

## License

This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a>

[post-processing-video]: https://youtu.be/0PX_6PNwrvQ
[MIPB08]: https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/MIP_playing/MIPB_beat_saber