This is a tiny, dummy version of Jamba, intended for debugging and experimentation with the Jamba architecture.
It has 128M parameters (instead of 52B), is initialized with random weights, and has not undergone any training.
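
A minimal usage sketch for exercising the checkpoint with Transformers is shown below. The repository id is a placeholder assumption (substitute the actual model id), and any generated text will be meaningless since the weights are random.

```python
# Minimal sketch: load the tiny debug checkpoint and run a forward pass.
# NOTE: the repository id below is an assumed placeholder for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-tiny-random"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The weights are random and untrained, so the output is gibberish;
# the point is only to exercise the Jamba architecture end to end.
inputs = tokenizer("Hello, Jamba!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```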