---
license: mit
tags:
  - Machine Learning Interatomic Potential
---

# Model Card for mace-universal

MACE (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher-order equivariant message passing. For more information about the MACE formalism, please see the authors' paper.

`2023-08-14-mace-universal.model` was trained on MPTrj, a dataset of Materials Project relaxation trajectories compiled by the CHGNet authors that covers 89 elements and 1.6M configurations. The checkpoint was used for materials stability prediction in Matbench Discovery and the corresponding preprint.
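The checkpoint can be fetched programmatically with `huggingface_hub`. A minimal sketch; the `filename` below assumes the checkpoint sits at the repository root, so adjust it to this repo's actual file layout:

```python
from huggingface_hub import hf_hub_download

# Download the pretrained checkpoint from this repository.
# NOTE: adjust `filename` if the file lives in a subfolder.
model_path = hf_hub_download(
    repo_id="cyrusyc/mace-universal",
    filename="2023-08-14-mace-universal.model",
)
print(model_path)  # local cached path; usable as model_paths below
```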

## Usage

1. (Optional) Install specific versions of PyTorch and ASE first if you need them (e.g., a particular CUDA build).
2. Install MACE from GitHub (not from PyPI):

   ```bash
   pip install git+https://github.com/ACEsuit/mace.git
   ```

3. Use `MACECalculator`, for example to run NVT molecular dynamics:
```python
from ase import units
from ase.build import bulk
from ase.md.npt import NPT
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution
from mace.calculators import MACECalculator

calculator = MACECalculator(
    model_paths="/path/to/pretrained.model",
    device="cuda",  # or "cpu"
)

# Any periodic ASE Atoms object works; bulk Cu is just a placeholder.
atoms = bulk("Cu", cubic=True)
atoms.calc = calculator
MaxwellBoltzmannDistribution(atoms, temperature_K=300)

# With pfactor=None, ASE's NPT integrator runs constant-volume
# Nose-Hoover (NVT) dynamics.
nvt = NPT(
    atoms=atoms,
    timestep=2 * units.fs,
    temperature_K=300,
    externalstress=0.0,   # ignored when pfactor is None
    ttime=25 * units.fs,  # thermostat coupling constant
    pfactor=None,
)
nvt.run(1000)
```
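Before starting long dynamics runs, a quick single-point calculation is a useful sanity check. A minimal sketch using the standard ASE calculator interface, with `atoms` and `calculator` as defined above:

```python
# Single-point energy and forces through the standard ASE interface.
atoms.calc = calculator
energy = atoms.get_potential_energy()  # total energy in eV
forces = atoms.get_forces()            # eV/Angstrom, shape (n_atoms, 3)
print(f"E = {energy:.3f} eV, max |F| = {abs(forces).max():.3f} eV/Angstrom")
```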

## Citing

If you use the pretrained models in this repository, please cite all the following:

```bibtex
@inproceedings{Batatia2022mace,
  title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
  author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=YPpSngE-ZU}
}

@article{riebesell2023matbench,
  title={Matbench Discovery -- An evaluation framework for machine learning crystal stability prediction},
  author={Riebesell, Janosh and Goodall, Rhys E. A. and Jain, Anubhav and Benner, Philipp and Persson, Kristin A. and Lee, Alpha A.},
  journal={arXiv preprint arXiv:2308.14920},
  year={2023}
}

@misc{yuan_chiang_2023,
  author={{Yuan Chiang}},
  title={mace-universal (Revision e5ebd9b)},
  year={2023},
  url={https://huggingface.co/cyrusyc/mace-universal},
  doi={10.57967/hf/1202},
  publisher={Hugging Face}
}

@article{deng2023chgnet,
  title={{CHGNet} as a pretrained universal neural network potential for charge-informed atomistic modelling},
  author={Deng, Bowen and Zhong, Peichen and Jun, KyuJung and Riebesell, Janosh and Han, Kevin and Bartel, Christopher J. and Ceder, Gerbrand},
  journal={Nature Machine Intelligence},
  pages={1--11},
  year={2023},
  publisher={Nature Publishing Group UK London}
}
```

## Training Details

### Training Data

The model was trained on the MPTrj dataset of Materials Project relaxation trajectories described above (89 elements, 1.6M configurations).

### Training Procedure