ssmits committed (verified) · commit 03e87f8 · parent 4282a0f

Update README.md

Files changed (1): README.md (+2 -1)

README.md CHANGED
@@ -16,7 +16,8 @@ language:
  Falcon-11B is still undertrained, as can be seen by this graph:
  ![image/png](https://cdn-uploads.huggingface.co/production/uploads/660c0a02cf274b3ab77dd6b7/QeaL9bOrPskustzFpjMUP.png)
  This is why the choice was made to prune 50% of the layers.
- Note that ~1B tokens of continued pre-training (~1M rows of 1k tokens) is still required to restore the perplexity of this model.
+ Note that ~1B tokens of continued pre-training (~1M rows of 1k tokens) is still required to restore the perplexity of this model in the desired language.
+ I'm planning on doing that for certain languages, depending on how much compute will be available.

  # sliced
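As a rough illustration of what pruning 50% of the layers means, the sketch below computes which decoder-layer indices survive the cut. This is a hypothetical helper, not the method used in this commit: the README does not state which layers were dropped or what tooling produced the "sliced" model, and keeping a contiguous front slice is only one common choice.

```python
# Hypothetical sketch: choose which decoder layers to keep when pruning
# a model down to 50% of its depth. A real slicing workflow would copy
# the chosen layers' weights into a smaller model; here we only compute
# the indices, assuming a contiguous front slice is retained.

def layers_to_keep(num_layers: int, keep_fraction: float = 0.5) -> list[int]:
    """Return the indices of the decoder layers retained after pruning."""
    n_keep = int(num_layers * keep_fraction)
    return list(range(n_keep))

# e.g. a 60-layer decoder pruned at 50% keeps layers 0..29
print(layers_to_keep(60))
```

After such a cut, the surviving layers' outputs no longer line up with what the following layers expect, which is why the README notes that continued pre-training (~1M rows of ~1k tokens, i.e. ~1B tokens) is needed to restore perplexity.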