---
license: mit
tags:
- Machine Learning Interatomic Potential
---

# Model Card for mace-universal

[MACE](https://github.com/ACEsuit/mace) (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher order equivariant message passing. For more details on the MACE formalism, please see the authors' [paper](https://arxiv.org/abs/2206.07697).


[2023-08-14-mace-universal.model](https://huggingface.co/cyrusyc/mace-universal/blob/main/2023-08-14-mace-universal.model) was trained on MPTrj, the [Materials Project](https://materialsproject.org) relaxation trajectories compiled by the [CHGNet](https://arxiv.org/abs/2302.14231) authors, covering 89 elements and 1.6M configurations. The checkpoint was used for materials stability prediction on [Matbench Discovery](https://matbench-discovery.materialsproject.org/) and in the associated [preprint](https://arxiv.org/abs/2308.14920).

# Usage

1. (optional) Install PyTorch and [ASE](https://wiki.fysik.dtu.dk/ase/) first if you need specific versions of these prerequisites
2. Install [MACE](https://github.com/ACEsuit/mace) from GitHub (not from PyPI)

```shell
pip install git+https://github.com/ACEsuit/mace.git
```
3. Use MACECalculator

```python
from mace.calculators import MACECalculator
from ase import units
from ase.build import bulk
from ase.md.npt import NPT

# Rock-salt NaCl conventional cell (a ≈ 5.64 Å)
atoms = bulk("NaCl", crystalstructure="rocksalt", a=5.64, cubic=True)

calculator = MACECalculator(
    model_paths="/path/to/pretrained.model",  # e.g. 2023-08-14-mace-universal.model
    device="cuda",                            # or "cpu"
    default_dtype="float64",                  # "float32" is faster, "float64" more accurate
)

atoms.calc = calculator

# Example NPT molecular dynamics settings; tune these for your system
dyn = NPT(
    atoms=atoms,
    timestep=2 * units.fs,
    temperature_K=300,
    externalstress=0.0,
    ttime=25 * units.fs,
    pfactor=(75 * units.fs) ** 2 * units.GPa,
)

dyn.run(1000)  # number of MD steps
```
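
For a quick sanity check, here is a minimal sketch (an illustration, not part of the official MACE instructions) that fetches the checkpoint from this repository with `huggingface_hub` and runs a single-point energy and force evaluation; installing `huggingface_hub` and running on CPU in double precision are assumptions you may want to change.

```python
# Minimal sketch (assumes `pip install huggingface_hub`): download the checkpoint
# from this repository and run a single-point evaluation with MACECalculator.
from huggingface_hub import hf_hub_download
from mace.calculators import MACECalculator
from ase.build import bulk

# Download 2023-08-14-mace-universal.model to the local Hugging Face cache
model_path = hf_hub_download(
    repo_id="cyrusyc/mace-universal",
    filename="2023-08-14-mace-universal.model",
)

calculator = MACECalculator(model_paths=model_path, device="cpu", default_dtype="float64")

atoms = bulk("NaCl", crystalstructure="rocksalt", a=5.64, cubic=True)
atoms.calc = calculator

print(atoms.get_potential_energy())  # eV
print(atoms.get_forces())            # eV/Å
```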

# Citing

If you use the pretrained models in this repository, please cite all the following:

```
@inproceedings{Batatia2022mace,
  title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
  author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=YPpSngE-ZU}
}

@article{riebesell2023matbench,
  title={Matbench Discovery--An evaluation framework for machine learning crystal stability prediction},
  author={Riebesell, Janosh and Goodall, Rhys EA and Jain, Anubhav and Benner, Philipp and Persson, Kristin A and Lee, Alpha A},
  journal={arXiv preprint arXiv:2308.14920},
  year={2023}
}


@misc{yuan_chiang_2023,
  author       = {Yuan Chiang and Philipp Benner},
  title        = {mace-universal (Revision e5ebd9b)},
  year         = 2023,
  url          = {https://huggingface.co/cyrusyc/mace-universal},
  doi          = {10.57967/hf/1202},
  publisher    = {Hugging Face}
}

@article{deng2023chgnet,
  title={CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling},
  author={Deng, Bowen and Zhong, Peichen and Jun, KyuJung and Riebesell, Janosh and Han, Kevin and Bartel, Christopher J and Ceder, Gerbrand},
  journal={Nature Machine Intelligence},
  pages={1--11},
  year={2023},
  publisher={Nature Publishing Group UK London}
}
```

# Training Guide

## Training Data

<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

For now, please download the MPTrj dataset from [figshare](https://figshare.com/articles/dataset/Materials_Project_Trjectory_MPtrj_Dataset/23713842). We may upload it to Hugging Face Datasets in the future.
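
If you want to prepare the trajectories for MACE-style training, the sketch below is one possible route, not the exact pipeline used for this checkpoint. It assumes the figshare download is a single JSON file keyed by Materials Project ID, with per-frame pymatgen `structure` dicts plus energy and force labels; the filename and field names such as `uncorrected_total_energy` and `force` are assumptions, so inspect your copy of the dataset before using it.

```python
# Hedged sketch: convert MPTrj JSON frames into an extxyz file for MACE training.
# The filename, JSON layout, and label keys are assumptions -- inspect your
# download and adjust the field names accordingly.
import json

import numpy as np
from ase.io import write
from pymatgen.core import Structure
from pymatgen.io.ase import AseAtomsAdaptor

with open("MPtrj_2022.9_full.json") as f:  # hypothetical filename
    mptrj = json.load(f)

frames = []
for mp_id, trajectory in mptrj.items():
    for frame_id, frame in trajectory.items():
        atoms = AseAtomsAdaptor.get_atoms(Structure.from_dict(frame["structure"]))
        # Store labels under keys matching your MACE --energy_key / --forces_key settings
        atoms.info["REF_energy"] = frame["uncorrected_total_energy"]
        atoms.set_array("REF_forces", np.array(frame["force"]))
        frames.append(atoms)

write("mptrj.extxyz", frames)
```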

## Fine-tuning

<!-- This should link to a Training Procedure Card, perhaps with a short stub of information on what the training procedure is all about as well as documentation related to hyperparameters or additional training details. -->

We provide an example multi-GPU training script, [2023-08-14-mace-universal.sbatch](https://huggingface.co/cyrusyc/mace-universal/blob/main/2023-08-14-mace-universal.sbatch), which uses 40 A100 GPUs on NERSC Perlmutter. Please see the MACE `multi-gpu` branch for more detailed instructions.