# CLIP Transcoder Checkpoint
This model is a transcoder trained on CLIP's internal representations; the architecture, training, and sparsity details are listed below.
## Model Details
### Architecture
- **Layer**: 11
- **Layer Type**: ln2.hook_normalized
- **Model**: laion/CLIP-ViT-B-32-DataComp.XL-s13B-b90K
- **Dictionary Size**: 49152
- **Input Dimension**: 768
- **Expansion Factor**: 64
- **CLS Token Only**: False
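
For orientation, the hyperparameters above correspond to a 768-dimensional input expanded into a 49152-feature dictionary (768 × 64). A minimal sketch of such a transcoder is shown below; the class name, ReLU nonlinearity, and bias handling are assumptions for illustration, not taken from the training code used for this checkpoint.

```python
import torch
import torch.nn as nn


class Transcoder(nn.Module):
    """Minimal transcoder sketch: maps a 768-d activation (ln2.hook_normalized
    at layer 11) through a 49152-feature sparse bottleneck back to 768 dims."""

    def __init__(self, d_model: int = 768, expansion_factor: int = 64):
        super().__init__()
        d_dict = d_model * expansion_factor  # 768 * 64 = 49152 features
        self.W_enc = nn.Linear(d_model, d_dict)
        self.W_dec = nn.Linear(d_dict, d_model)

    def forward(self, x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        feats = torch.relu(self.W_enc(x))  # sparse feature activations
        return self.W_dec(feats), feats    # reconstruction + features
```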
### Training
- **Learning Rate**: 0.0012530554819595529
- **Batch Size**: 4096
- **Context Size**: 50
### Sparsity
- **L0 (Active Features)**: 497
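
L0 here is the average number of features active per token. A hypothetical way to compute it, assuming `feats` are the transcoder's feature activations as in the sketch above:

```python
# Average count of nonzero features per token (the L0 statistic above),
# assuming `feats` has shape [n_tokens, 49152].
l0 = (feats > 0).float().sum(dim=-1).mean()
```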
## Additional Information
- **Wandb Run**: https://wandb.ai/perceptual-alignment/openclip-transcoders/runs/oud9jpdn/
- **Random Seed**: 42