Commit 8be3d3a (verified) by norabelrose · Parent(s): 41c942e · Create README.md
---
license: mit
datasets:
- togethercomputer/RedPajama-Data-V2
language:
- en
library_name: transformers
---

This is a set of sparse autoencoders (SAEs) trained on the residual stream of [Llama 3.1 8B](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B) using the 10B sample of the [RedPajama v2 corpus](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-V2), which comes out to roughly 8.5B tokens with the Llama 3 tokenizer. The SAEs are organized by hookpoint and can be loaded with the EleutherAI [`sae` library](https://github.com/EleutherAI/sae).

With the `sae` library installed, you can load an SAE like this:
```python
from sae import Sae

sae = Sae.load_from_hub("EleutherAI/sae-llama-3-8b-32x-v2", hookpoint="layers.23.mlp")
```
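For intuition, a top-k SAE encodes an activation vector into a sparse, overcomplete latent code and then reconstructs the original vector from it. Below is a minimal NumPy sketch of that forward pass; the shapes, names, and initialization are illustrative assumptions for this example, not the `sae` library's internals:

```python
import numpy as np

def topk_sae(x, W_enc, b_enc, W_dec, b_dec, k):
    """Illustrative top-k SAE forward pass (a sketch, not the library's code).

    x:     (d_model,) activation vector, e.g. from the residual stream
    W_enc: (d_sae, d_model) encoder weights; W_dec: (d_model, d_sae) decoder
    d_sae is much larger than d_model (overcomplete dictionary)
    """
    # Encode: affine map, then keep only the k largest pre-activations
    pre = W_enc @ (x - b_dec) + b_enc
    z = np.zeros_like(pre)
    top = np.argsort(pre)[-k:]          # indices of the k largest latents
    z[top] = np.maximum(pre[top], 0.0)  # ReLU on the surviving latents
    # Decode: map the sparse code back to the model's activation space
    x_hat = W_dec @ z + b_dec
    return z, x_hat

rng = np.random.default_rng(0)
d_model, d_sae, k = 16, 16 * 32, 4      # "32x" expansion, as in the repo name
x = rng.standard_normal(d_model)
W_enc = rng.standard_normal((d_sae, d_model)) / np.sqrt(d_model)
W_dec = rng.standard_normal((d_model, d_sae)) / np.sqrt(d_sae)
b_enc = np.zeros(d_sae)
b_dec = np.zeros(d_model)

z, x_hat = topk_sae(x, W_enc, b_enc, W_dec, b_dec, k)
print((z != 0).sum())  # at most k latents are active
```

The nonzero entries of `z` play the role of the SAE's "features"; the reconstruction `x_hat` lives back in the model's `d_model`-dimensional activation space.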