SITR Dataset

This repository hosts the dataset for the Sensor-Invariant Tactile Representation (SITR) paper. The dataset supports training and evaluating models for sensor-invariant tactile representations across simulated and real-world settings. The codebase implementing SITR is available on GitHub: SITR Codebase

For more details on the underlying methods and experiments, please visit our project website and read the arXiv paper.


Dataset Overview

The SITR dataset consists of three main parts:

  1. Simulated Tactile Dataset
    A large-scale synthetic dataset generated using physics-based rendering (PBR) in Blender. This dataset spans 100 unique simulated sensor configurations with tactile signals, calibration images, and corresponding surface normal maps. It includes 10K unique contact configurations generated from 50 high-resolution 3D meshes of common household objects; rendering each contact under every sensor configuration (10K × 100) yields a pre-training dataset of 1M samples.

  2. Classification Tactile Dataset
    Data collected from 7 real sensors (including variations of GelSight Mini, GelSight Hex, GelSight Wedge, and DIGIT). For the classification task, 20 objects are pressed against each sensor at various poses and depths, accumulating 1K tactile images per object per sensor (140K images in total; 20K per sensor). Only 16 of the 20 objects are used in our classification experiments; the other 4 were deemed unsuitable, a decision made before any experimentation. The dataset is provided as separate train (80%) and test (20%) sets.

  3. Pose Estimation Tactile Dataset
    For pose estimation, tactile signals are recorded using a modified Ender-3 Pro 3D printer equipped with 3D-printed indenters. This setup provides accurate ground truth (x, y, z coordinates) for contact points. Data were collected for 6 indenters across 4 sensors, resulting in 1K samples per indenter (24K images in total; 6K per sensor). This dataset is also organized into train (80%) and test (20%) sets.


Download and Setup

Simulated Tactile Dataset

The simulated dataset is split into two parts due to its size:

  • renders_part_aa
  • renders_part_ab

Download both parts with:

wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/renders_part_aa
wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/renders_part_ab
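
If you prefer a Python workflow, the parts can also be fetched through the Hugging Face Hub client (a minimal sketch; assumes the huggingface_hub package is installed):

# Sketch: fetch both parts with the Hugging Face Hub client instead of wget.
# Assumes `pip install huggingface_hub`; files land in the local HF cache.
from huggingface_hub import hf_hub_download

for part in ("renders_part_aa", "renders_part_ab"):
    local_path = hf_hub_download(
        repo_id="hgupt3/sitr_dataset",
        repo_type="dataset",   # this repo is a dataset, not a model
        filename=part,
    )
    print(f"Downloaded {part} to {local_path}")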

To merge and unzip:

  1. Merge the parts into a single zip file:

    cat renders_part_aa renders_part_ab > renders.zip
    

    You can then remove the original parts (consider verifying the merged archive before deleting them; see the sketch below):

    rm renders_part_aa renders_part_ab
    
  2. Unzip the merged file:

    unzip renders.zip -d your_desired_directory
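
If unzip reports errors, you can sanity-check the merged archive with Python's standard zipfile module (a minimal sketch; "renders.zip" is the file produced by the cat step above):

# Sketch: verify the merged archive is a readable, uncorrupted zip.
# Standard library only.
import zipfile

with zipfile.ZipFile("renders.zip") as zf:
    bad = zf.testzip()  # returns the first corrupt member name, or None
    if bad is not None:
        raise RuntimeError(f"Corrupt member in archive: {bad}")
    print(f"Archive OK: {len(zf.namelist())} files")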
    

Real-World Datasets (Classification & Pose Estimation)

You can download the classification dataset with

wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/classification_dataset.zip

and the pose estimation dataset with

wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/pose_dataset.zip

Simply unzip them into your desired directory:

unzip classification_dataset.zip -d your_desired_directory
unzip pose_dataset.zip -d your_desired_directory

Both zip files contain two directories:

  • train_set/
  • test_set/

File Structure

Below are examples of the directory trees for each dataset type.

1. Simulated Tactile Dataset

data_root/
├── sensor_0000/
│   ├── calibration/          # Calibration images
│   │   ├── 0000.png          # Background image
│   │   ├── 0001.png
│   │   └── ...
│   ├── samples/              # Tactile sample images
│   │   ├── 0000.png
│   │   ├── 0001.png
│   │   └── ...
│   ├── dmaps/                # (Optional) Depth maps
│   │   ├── 0000.npy
│   │   └── ...
│   └── norms/                # (Optional) Surface normals
│       ├── 0000.npy
│       └── ...
├── sensor_0001/
└── ...
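
As an illustration, here is a minimal sketch of loading one simulated sample together with its calibration background and surface-normal map (the root path and indices are placeholders; assumes NumPy and Pillow are installed):

# Sketch: load one simulated sample with its calibration background and
# surface-normal map. "data_root" and the indices are placeholders.
from pathlib import Path

import numpy as np
from PIL import Image

sensor_dir = Path("data_root") / "sensor_0000"

background = Image.open(sensor_dir / "calibration" / "0000.png")  # background image
tactile = Image.open(sensor_dir / "samples" / "0000.png")         # tactile signal
normals = np.load(sensor_dir / "norms" / "0000.npy")              # surface normals

print(tactile.size, normals.shape)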

2. Classification Dataset

Each of the train_set/ and test_set/ directories follows this structure:

train_set/  (or test_set/)
├── sensor_0000/
│   ├── calibration/          # Calibration images
│   └── samples/              # Organized by class
│       ├── class_0000/
│       │   ├── 0000.png
│       │   └── ...
│       ├── class_0001/
│       │   ├── 0000.png
│       │   └── ...
│       └── ...
├── sensor_0001/
└── ...
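
A minimal sketch that indexes this layout into (image path, class label) pairs for one sensor (paths follow the tree above; standard library only):

# Sketch: collect (image_path, class_index) pairs for one sensor.
# "train_set" and "sensor_0000" follow the tree above; adjust as needed.
from pathlib import Path

samples_dir = Path("train_set") / "sensor_0000" / "samples"

pairs = []
for class_dir in sorted(samples_dir.glob("class_*")):
    label = int(class_dir.name.split("_")[1])  # "class_0003" -> 3
    for img_path in sorted(class_dir.glob("*.png")):
        pairs.append((img_path, label))

print(f"{len(pairs)} images across {len({label for _, label in pairs})} classes")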

3. Pose Estimation Dataset

Similarly, each of the train_set/ and test_set/ directories is structured as follows:

train_set/  (or test_set/)
├── sensor_0000/
│   ├── calibration/          # Calibration images
│   ├── samples/              # Tactile sample images
│   │   ├── 0000.png
│   │   ├── 0001.png
│   │   └── ...
│   └── locations/            # Pose/Location data
│       ├── 0000.npy
│       ├── 0001.npy
│       └── ...
├── sensor_0001/
└── ...
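
Samples and locations share the same index, so an image and its ground-truth contact point can be paired directly (a minimal sketch; per the description above, each .npy file is assumed to hold the x, y, z contact coordinates):

# Sketch: pair a tactile image with its ground-truth contact location.
# Per the description above, each .npy is assumed to store (x, y, z).
from pathlib import Path

import numpy as np
from PIL import Image

sensor_dir = Path("train_set") / "sensor_0000"  # placeholder sensor directory
idx = "0000"                                    # samples/ and locations/ share indices

image = Image.open(sensor_dir / "samples" / f"{idx}.png")
location = np.load(sensor_dir / "locations" / f"{idx}.npy")  # assumed [x, y, z]

print(image.size, location)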

Citation

If you use this dataset in your research, please cite:

@misc{gupta2025sensorinvarianttactilerepresentation,
    title={Sensor-Invariant Tactile Representation}, 
    author={Harsh Gupta and Yuchen Mo and Shengmiao Jin and Wenzhen Yuan},
    year={2025},
    eprint={2502.19638},
    archivePrefix={arXiv},
    primaryClass={cs.RO},
    url={https://arxiv.org/abs/2502.19638}, 
}

License

This dataset is licensed under the MIT License. See the LICENSE file for details.

If you have any questions or need further clarification, please feel free to reach out.
