---
license: apache-2.0
dataset_info:
  features:
    - name: input_ids
      sequence: int16
    - name: coords
      sequence:
        sequence: float32
    - name: forces
      sequence:
        sequence: float32
    - name: formation_energy
      dtype: float32
    - name: total_energy
      dtype: float32
    - name: has_formation_energy
      dtype: bool
    - name: length
      dtype: int64
  splits:
    - name: train
      num_bytes: 43353603080
      num_examples: 15000000
  download_size: 44763791790
  dataset_size: 43353603080
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
---

## Dataset Description

This dataset is a collection of 3D atomistic structures with per-atom force and per-system energy labels, gathered from several sources, including OC20, OC22, ODAC23, MPtrj, and SPICE (see the citations below).
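
The train split is stored as parquet shards and can be streamed with the Hugging Face `datasets` library. A minimal loading sketch, assuming the repository id is `akore/s2ef-15m` (adjust to the actual repo path):

```python
from datasets import load_dataset

# Stream the ~45 GB train split instead of downloading every shard up front.
# The repository id below is an assumption; replace it with the actual path.
ds = load_dataset("akore/s2ef-15m", split="train", streaming=True)

sample = next(iter(ds))
print(len(sample["input_ids"]), sample["total_energy"])
```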

## Dataset Structure

### Data Instances

For each instance there is a set of atomic numbers (`input_ids`), 3D coordinates (`coords`), per-atom forces (`forces`), the total and formation energy of the system (`total_energy`/`formation_energy`), and a boolean `has_formation_energy` that signals whether the sample has a valid formation energy.

```python
{'input_ids': [26, 28, 28, 28],
 'coords': [[0.0, 0.0, 0.0],
  [0.0, 0.0, 3.5395920276641846],
  [0.0, 1.7669789791107178, 1.7697960138320923],
  [1.7669789791107178, 0.0, 1.7697960138320923]],
 'forces': [[-1.999999987845058e-08, 2.999999892949745e-08, -0.0],
  [-5.99999978589949e-08, 5.99999978589949e-08, 9.99999993922529e-09],
  [-0.0014535699738189578, 0.0014535400550812483, 9.99999993922529e-09],
  [0.001453649951145053, -0.0014536300441250205, -2.999999892949745e-08]],
 'formation_energy': 0.6030612587928772,
 'total_energy': -25.20570182800293,
 'has_formation_energy': True}
```
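
Because the per-atom fields are index-aligned (one atomic number, one coordinate triple, and one force vector per atom), a record maps directly onto array-based tooling. The sketch below converts a record into numpy arrays and an `ase.Atoms` object; ASE is not mentioned in this card and is only used here as a convenient container:

```python
import numpy as np
from ase import Atoms  # ASE is an assumption; any array container works


def record_to_atoms(record):
    """Turn one dataset record into an ASE Atoms object with attached labels."""
    numbers = np.asarray(record["input_ids"], dtype=int)        # atomic numbers, shape (n_atoms,)
    positions = np.asarray(record["coords"], dtype=np.float64)  # shape (n_atoms, 3)
    forces = np.asarray(record["forces"], dtype=np.float64)     # shape (n_atoms, 3)
    assert numbers.shape[0] == positions.shape[0] == forces.shape[0]

    atoms = Atoms(numbers=numbers, positions=positions)
    atoms.set_array("forces", forces)  # per-atom training targets
    atoms.info["total_energy"] = record["total_energy"]
    atoms.info["has_formation_energy"] = record["has_formation_energy"]
    if record["has_formation_energy"]:
        atoms.info["formation_energy"] = record["formation_energy"]
    return atoms
```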

The number of atoms per sample varies across the source datasets, but the number of samples drawn from each source is balanced.
MPtrj and SPICE are upsampled 2x and 3x respectively to keep the distribution balanced. The datasets are interleaved until one source runs out of samples, at which point there are 3,160,790 systems from each dataset (the 2x-upsampled MPtrj runs out first).
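
A rough sketch of this mixing strategy with the `datasets` library is shown below. The five source splits are placeholders named after the cited datasets, and this is not claimed to be the exact pipeline used to build the released split:

```python
from datasets import Dataset, concatenate_datasets, interleave_datasets


def build_balanced_mix(oc20: Dataset, oc22: Dataset, odac23: Dataset,
                       mptrj: Dataset, spice: Dataset) -> Dataset:
    """Upsample MPtrj 2x and SPICE 3x, then interleave round-robin until
    the first source is exhausted, keeping per-source contributions equal."""
    mptrj_2x = concatenate_datasets([mptrj, mptrj])          # 2x upsampling
    spice_3x = concatenate_datasets([spice, spice, spice])   # 3x upsampling
    return interleave_datasets(
        [oc20, oc22, odac23, mptrj_2x, spice_3x],
        stopping_strategy="first_exhausted",
    )
```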

## Citation Information

```bibtex
@article{ocp_dataset,
    author = {Chanussot*, Lowik and Das*, Abhishek and Goyal*, Siddharth and Lavril*, Thibaut and Shuaibi*, Muhammed and Riviere, Morgane and Tran, Kevin and Heras-Domingo, Javier and Ho, Caleb and Hu, Weihua and Palizhati, Aini and Sriram, Anuroop and Wood, Brandon and Yoon, Junwoong and Parikh, Devi and Zitnick, C. Lawrence and Ulissi, Zachary},
    title = {Open Catalyst 2020 (OC20) Dataset and Community Challenges},
    journal = {ACS Catalysis},
    year = {2021},
    doi = {10.1021/acscatal.0c04525},
}

@article{oc22_dataset,
    author = {Tran*, Richard and Lan*, Janice and Shuaibi*, Muhammed and Wood*, Brandon and Goyal*, Siddharth and Das, Abhishek and Heras-Domingo, Javier and Kolluru, Adeesh and Rizvi, Ammar and Shoghi, Nima and Sriram, Anuroop and Ulissi, Zachary and Zitnick, C. Lawrence},
    title = {The Open Catalyst 2022 (OC22) dataset and challenges for oxide electrocatalysts},
    journal = {ACS Catalysis},
    year = {2023},
}

@article{odac23_dataset,
    author = {Anuroop Sriram and Sihoon Choi and Xiaohan Yu and Logan M. Brabson and Abhishek Das and Zachary Ulissi and Matt Uyttendaele and Andrew J. Medford and David S. Sholl},
    title = {The Open DAC 2023 Dataset and Challenges for Sorbent Discovery in Direct Air Capture},
    journal = {arXiv preprint arXiv:2311.00341},
    year = {2023},
}

@article{deng_2023_chgnet,
    author = {Deng, Bowen and Zhong, Peichen and Jun, KyuJung and Riebesell, Janosh and Han, Kevin and Bartel, Christopher J. and Ceder, Gerbrand},
    title = {CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling},
    journal = {Nature Machine Intelligence},
    year = {2023},
    doi = {10.1038/s42256-023-00716-3},
    pages = {1–11},
}

@article{eastman2023spice,
    author = {Eastman, Peter and Behara, Pavan Kumar and Dotson, David L and Galvelis, Raimondas and Herr, John E and Horton, Josh T and Mao, Yuezhi and Chodera, John D and Pritchard, Benjamin P and Wang, Yuanqing and others},
    title = {SPICE, A Dataset of Drug-like Molecules and Peptides for Training Machine Learning Potentials},
    journal = {Scientific Data},
    volume = {10},
    number = {1},
    pages = {11},
    year = {2023},
    publisher = {Nature Publishing Group UK London},
}
```