ssmits committed 13633e2 (verified) · Parent(s): e599f82

Update README.md

Files changed (1): README.md (+8 −0)
@@ -22,6 +22,14 @@ language:
  - it
  - cs
  ---
+ ## Why prune?
+
+ Falcon-11B is still undertrained, as can be seen in this graph:
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/660c0a02cf274b3ab77dd6b7/QeaL9bOrPskustzFpjMUP.png)
+ This is why the choice was made to prune 50% of the layers.
+ Note that ~1B tokens of continued pre-training (~1M rows of 1k tokens) are still required to restore the perplexity of this model in the desired language.
+ I'm planning on doing that for certain languages, depending on how much compute will be available.
+
  # sliced

  This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
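A layer-pruned merge like this is typically expressed in mergekit as a `passthrough` merge over `slices`. The sketch below is illustrative only: the diff does not include the actual config, and the `layer_range` values (keeping the first half of Falcon-11B's 60 layers) are an assumption, not the author's recipe.

```yaml
# Hypothetical mergekit config: prune ~50% of layers via passthrough slicing.
# layer_range values are illustrative; the real config is not shown in this diff.
slices:
  - sources:
      - model: tiiuae/falcon-11B
        layer_range: [0, 30]   # keep layers 0-29, drop the remaining half
merge_method: passthrough
dtype: bfloat16
```

Running `mergekit-yaml` on such a config produces the sliced checkpoint, which then needs the continued pre-training described above to recover perplexity.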