cyrusyc committed on
Commit e3d724f
1 Parent(s): e5ebd9b

Update README.md

Files changed (1)
  1. README.md +48 -3
README.md CHANGED
@@ -1,8 +1,53 @@
  ---
  license: mit
  ---
 
- [MACE](https://github.com/ACEsuit/mace) (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher order equivariant message passing. For more information about MACE formalism, please see authors' [paper](https://arxiv.org/abs/2206.07697). If you use the pretrained models in this repository, please cite all the following:
 
- - Batatia, Ilyes, et al. "MACE: Higher order equivariant message passing neural networks for fast and accurate force fields." Advances in Neural Information Processing Systems 35 (2022): 11423-11436.
- - Riebesell, Janosh, et al. "Matbench Discovery--An evaluation framework for machine learning crystal stability prediction." arXiv preprint arXiv:2308.14920 (2023).
  ---
  license: mit
+ tags:
+ - Machine Learning Interatomic Potential
  ---
 
+ # Model Card for mace-universal
 
+ [MACE](https://github.com/ACEsuit/mace) (Multiple Atomic Cluster Expansion) is a machine learning interatomic potential (MLIP) with higher-order equivariant message passing. For more information about the MACE formalism, please see the authors' [paper](https://arxiv.org/abs/2206.07697).
+
+
+ [2023-08-14-mace-universal.model](https://huggingface.co/cyrusyc/mace-universal/blob/main/2023-08-14-mace-universal.model) was trained on MPTrj, a dataset of [Materials Project](https://materialsproject.org) relaxation trajectories covering 89 elements and 1.6M configurations. The checkpoint was used for materials stability prediction in [Matbench Discovery](https://matbench-discovery.materialsproject.org/) and the corresponding [preprint](https://arXiv.org/abs/2308.14920), as sketched in the example below.
+
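+ The sketch below is a minimal, illustrative example of loading this checkpoint as an ASE calculator. It assumes the `mace`, `ase`, and `huggingface_hub` packages are installed; the `MACECalculator` keyword names follow the `mace` package's ASE interface, and bulk silicon serves only as a throwaway test structure.
+
+ ```python
+ from ase.build import bulk
+ from huggingface_hub import hf_hub_download
+ from mace.calculators import MACECalculator
+
+ # Download the pretrained checkpoint from this repository.
+ model_path = hf_hub_download(
+     repo_id="cyrusyc/mace-universal",
+     filename="2023-08-14-mace-universal.model",
+ )
+
+ # Wrap the checkpoint as an ASE calculator
+ # (keyword names assumed from the mace package's ASE interface).
+ calc = MACECalculator(model_paths=model_path, device="cpu", default_dtype="float64")
+
+ # Quick smoke test: single-point energy and forces for bulk silicon.
+ atoms = bulk("Si", "diamond", a=5.43)
+ atoms.calc = calc
+ print("Energy (eV):", atoms.get_potential_energy())
+ print("Forces (eV/Å):", atoms.get_forces())
+ ```
+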
+ # Citation
+
+ If you use the pretrained models in this repository, please cite all the following:
+
+ ```
+ @inproceedings{Batatia2022mace,
+   title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
+   author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
+   booktitle={Advances in Neural Information Processing Systems},
+   editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
+   year={2022},
+   url={https://openreview.net/forum?id=YPpSngE-ZU}
+ }
+
+ @article{riebesell2023matbench,
+   title={Matbench Discovery--An evaluation framework for machine learning crystal stability prediction},
+   author={Riebesell, Janosh and Goodall, Rhys EA and Jain, Anubhav and Benner, Philipp and Persson, Kristin A and Lee, Alpha A},
+   journal={arXiv preprint arXiv:2308.14920},
+   year={2023}
+ }
+
+ @misc{yuan_chiang_2023,
+   author={{Yuan Chiang}},
+   title={mace-universal (Revision e5ebd9b)},
+   year={2023},
+   url={https://huggingface.co/cyrusyc/mace-universal},
+   doi={10.57967/hf/1202},
+   publisher={Hugging Face}
+ }
+ ```
+
+ # Training Details
+
+ ## Training Data
+
+ <!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+
+ ## Training Procedure