---
license: mit
tags:
- Machine Learning Interatomic Potential
---

# Model Card for mace-universal

[MACE](https://github.com/ACEsuit/mace) (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher-order equivariant message passing. For more information about the MACE formalism, please see the authors' [paper](https://arxiv.org/abs/2206.07697).


[2023-08-14-mace-universal.model](https://huggingface.co/cyrusyc/mace-universal/blob/main/2023-08-14-mace-universal.model) was trained on MPTrj, a dataset of [Materials Project](https://materialsproject.org) relaxation trajectories compiled by the [CHGNet](https://arxiv.org/abs/2302.14231) authors, covering 89 elements and 1.6M configurations. The checkpoint was used for materials stability prediction in [Matbench Discovery](https://matbench-discovery.materialsproject.org/) and the corresponding [preprint](https://arxiv.org/abs/2308.14920).
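
The checkpoint can also be fetched programmatically. A minimal sketch using [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub); the `repo_id` and `filename` match the link above:

```python
from huggingface_hub import hf_hub_download

# Download the pretrained checkpoint from the Hugging Face Hub
model_path = hf_hub_download(
    repo_id="cyrusyc/mace-universal",
    filename="2023-08-14-mace-universal.model",
)
```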

# Usage

1. (optional) Install [PyTorch](https://pytorch.org) and [ASE](https://wiki.fysik.dtu.dk/ase/) first if you need specific versions of these prerequisites, as shown below
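
For example (pin or adjust versions as needed):

```shell
pip install torch ase
```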
2. Install [MACE](https://github.com/ACEsuit/mace) from GitHub (not from PyPI)

```shell
pip install git+https://github.com/ACEsuit/mace.git
```
3. Use the `MACECalculator` with ASE, as in the sketch below

```python
from ase import units
from ase.build import bulk
from ase.md.npt import NPT
from mace.calculators import MACECalculator

calculator = MACECalculator(
    model_paths="/path/to/pretrained.model",  # e.g. 2023-08-14-mace-universal.model
    device="cuda",  # or "cpu"
)

atoms = bulk("Cu", cubic=True) * (3, 3, 3)  # any ASE Atoms object
atoms.calc = calculator  # attach the MACE calculator

# Constant-temperature MD with the Nose-Hoover thermostat
# (pfactor=None keeps the cell fixed, i.e. NVT dynamics)
dyn = NPT(
    atoms=atoms,
    timestep=2 * units.fs,
    temperature_K=300,
    externalstress=0.0,
    ttime=25 * units.fs,
    pfactor=None,
)
dyn.run(1000)
```
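
The same calculator also handles static, single-point evaluations through the standard ASE interface:

```python
energy = atoms.get_potential_energy()  # potential energy in eV
forces = atoms.get_forces()            # forces in eV/Å
```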

# Citing

If you use the pretrained models in this repository, please cite all of the following:

```
@inproceedings{Batatia2022mace,
  title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
  author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=YPpSngE-ZU}
}

@article{riebesell2023matbench,
  title={Matbench Discovery--An evaluation framework for machine learning crystal stability prediction},
  author={Riebesell, Janosh and Goodall, Rhys EA and Jain, Anubhav and Benner, Philipp and Persson, Kristin A and Lee, Alpha A},
  journal={arXiv preprint arXiv:2308.14920},
  year={2023}
}


@misc{yuan_chiang_2023,
  author    = {Yuan Chiang},
  title     = {mace-universal (Revision e5ebd9b)},
  year      = {2023},
  url       = {https://huggingface.co/cyrusyc/mace-universal},
  doi       = {10.57967/hf/1202},
  publisher = {Hugging Face}
}

@article{deng2023chgnet,
  title={CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling},
  author={Deng, Bowen and Zhong, Peichen and Jun, KyuJung and Riebesell, Janosh and Han, Kevin and Bartel, Christopher J and Ceder, Gerbrand},
  journal={Nature Machine Intelligence},
  pages={1--11},
  year={2023},
  publisher={Nature Publishing Group UK London}
}
```

# Training Details

## Training Data

The model was trained on the MPTrj dataset: [Materials Project](https://materialsproject.org) relaxation trajectories compiled by the [CHGNet](https://arxiv.org/abs/2302.14231) authors, covering 89 elements and 1.6M configurations.


## Training Procedure