
This is a set of sparse autoencoders (SAEs) trained on the residual stream of Llama 3 8B using the RedPajama corpus. The SAEs are organized by layer and can be loaded with the EleutherAI sae library.
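For reference, a minimal loading sketch is shown below. It assumes the Sae.load_from_hub and Sae.load_many entry points described in the sae library's README; the layer-10 hookpoint is an arbitrary example, and the library's current documentation should be consulted for the exact API.

```python
# Minimal sketch: loading these SAEs with the EleutherAI sae library.
# Assumes the Sae.load_from_hub / Sae.load_many interface from the library's
# README; names and signatures may differ in newer releases.
from sae import Sae

# Load the SAE trained on one residual-stream hookpoint
# ("layers.10" is chosen here purely for illustration).
sae = Sae.load_from_hub("EleutherAI/sae-llama-3-8b-32x", hookpoint="layers.10")

# Or load all per-layer SAEs in the repository at once, keyed by hookpoint.
saes = Sae.load_many("EleutherAI/sae-llama-3-8b-32x")
print(list(saes.keys()))
```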

