tim-lawson committed
Commit 2d6934f
1 Parent(s): 10fd11e

Push model using huggingface_hub.

Files changed (1):
  1. README.md +6 -2
README.md CHANGED
@@ -6,16 +6,20 @@ tags:
 - arxiv:2409.04185
 - model_hub_mixin
 - pytorch_model_hub_mixin
-base_model: EleutherAI/pythia-410m-deduped
 ---
 
-# Model Card for
+# Model Card for tim-lawson/sae-pythia-410m-deduped-x64-k32-tfm-layers-20
 
 A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream activation
 vectors from [EleutherAI/pythia-410m-deduped](https://huggingface.co/EleutherAI/pythia-410m-deduped) with an
 expansion factor of R = 64 and sparsity k = 32, over 1 billion
 tokens from [monology/pile-uncopyrighted](https://huggingface.co/datasets/monology/pile-uncopyrighted).
 
+
+This model is a PyTorch Lightning MLSAETransformer module, which includes the underlying
+transformer.
+
+
 ### Model Sources
 
 - **Repository:** <https://github.com/tim-lawson/mlsae>
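
Since the card carries the `pytorch_model_hub_mixin` tag and this commit pushes the model with `huggingface_hub`, the checkpoint should be loadable through the mixin's `from_pretrained` method. The sketch below assumes the `mlsae` package from the linked repository is installed and exposes `MLSAETransformer` at `mlsae.model`; that import path is an assumption, not something stated in this commit.

```python
# Minimal loading sketch. Assumes the mlsae package from
# https://github.com/tim-lawson/mlsae is installed and exposes
# MLSAETransformer at mlsae.model (the import path is an assumption).
from mlsae.model import MLSAETransformer

# from_pretrained is provided by huggingface_hub's PyTorchModelHubMixin,
# which the model card tags indicate this class uses.
model = MLSAETransformer.from_pretrained(
    "tim-lawson/sae-pythia-410m-deduped-x64-k32-tfm-layers-20"
)
model.eval()  # the module wraps both the SAE and the underlying transformer
```

Because the module includes the underlying Pythia transformer, loading it pulls in the full transformer weights alongside the autoencoder.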