lhallee committed · Commit ce7dbf9 (verified) · 1 Parent(s): 182b3e7

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -8,7 +8,7 @@ tags: []
 
 FastESM is a Huggingface compatible plug in version of ESM2-650M rewritten with a newer PyTorch Attention implementation.
 
-To enhance the weights with longer context and better fp16 support, we trained ESM2-650 50000 additional steps in fp16 mixed precision on [OMGprot50](tattabio/OMG_prot50) up to sequence length of **2048**.
+To enhance the weights with longer context and better fp16 support, we trained ESM2-650 50000 additional steps with a traditional MLM objective (20% masking) in fp16 mixed precision on [OMGprot50](tattabio/OMG_prot50) up to sequence length of **2048**.
 
 Outputting attentions and predicting contacts are not possible from SDPA. Various other optimizations also make the base implementation slightly different than the one in transformers.
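The added wording in the `+` line says the extra training steps used a traditional masked-language-modeling (MLM) objective with a 20% masking rate. As a rough illustration of what that masking step looks like, here is a minimal sketch; the token ids, shapes, and the choice to replace every selected position with the mask token are assumptions for illustration, not the authors' actual training code.

```python
import torch

# Hypothetical ESM2-like special-token ids (assumptions for this sketch).
vocab_size, mask_token_id = 33, 32

torch.manual_seed(0)
# Fake batch of token ids drawn from the "regular" vocabulary range.
input_ids = torch.randint(4, vocab_size - 1, (2, 16))

# Traditional MLM: select ~20% of positions, train the model to recover them.
mask_prob = 0.20
mask = torch.rand(input_ids.shape) < mask_prob

labels = input_ids.clone()
labels[~mask] = -100            # ignored by cross-entropy (PyTorch convention)
input_ids[mask] = mask_token_id # corrupt the input at the masked positions
```

The `labels` tensor then feeds a cross-entropy loss so that only the masked ~20% of positions contribute to the gradient.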
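The README's note that "outputting attentions and predicting contacts are not possible from SDPA" follows from how PyTorch's fused `scaled_dot_product_attention` works: it returns only the attention output and never materializes the attention-weight matrix. A minimal sketch contrasting it with the eager computation (tensor shapes are arbitrary; this is not the model's own code):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
# (batch, heads, seq_len, head_dim) — arbitrary illustrative shapes.
q = torch.randn(1, 8, 16, 64)
k = torch.randn(1, 8, 16, 64)
v = torch.randn(1, 8, 16, 64)

# Fused SDPA: returns the attended values only; the softmax(QK^T) weights
# that contact prediction would need are never exposed to the caller.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 16, 64])

# Eager equivalent that *does* materialize the attention weights.
weights = torch.softmax(q @ k.transpose(-2, -1) / (64 ** 0.5), dim=-1)
eager = weights @ v
print(torch.allclose(out, eager, atol=1e-4))  # True
```

Both paths compute the same result, but only the eager path yields `weights`, which is why an SDPA-based rewrite trades away `output_attentions` for speed and memory.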