togethercomputer/evo-1-131k-base
Tags: Text Generation · Transformers · Safetensors · stripedhyena · long context · deep signal processing · hybrid · biology · genomics · custom_code
arXiv: 7 papers
License: apache-2.0
evo-1-131k-base · 4 contributors · History: 19 commits
Latest commit: 29194e9 (verified), by pragaash, 9 months ago: "Add tokenizer import reference to auto_map in config.json."
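The latest commit adds a tokenizer entry to `auto_map` in config.json. This mapping is what lets `transformers` locate the custom classes shipped inside the repository (hence the `custom_code` tag) when loading with `trust_remote_code=True`. As a hedged sketch only, such an entry generally looks like the fragment below; the exact class names are assumptions inferred from the file names in this repo (configuration_hyena.py, modeling_hyena.py, tokenizer.py), not a copy of the repo's actual config.json.

```json
{
  "auto_map": {
    "AutoConfig": "configuration_hyena.StripedHyenaConfig",
    "AutoModelForCausalLM": "modeling_hyena.StripedHyenaModelForCausalLM",
    "AutoTokenizer": ["tokenizer.ByteTokenizer", null]
  }
}
```

When `AutoTokenizer.from_pretrained(...)` is called with `trust_remote_code=True`, it resolves this mapping to the custom tokenizer class defined in the repo's own tokenizer.py rather than to a built-in `transformers` class.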
| File | Size | Last commit | Age |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 9 months ago |
| README.md | 4.08 kB | Update README.md | 9 months ago |
| cache.py | 1.38 kB | init | 9 months ago |
| config.json | 1.7 kB | Add tokenizer import reference to auto_map in config.json. | 9 months ago |
| configuration_hyena.py | 3.13 kB | init | 9 months ago |
| engine.py | 13.5 kB | init | 9 months ago |
| generation_config.json | 69 Bytes | Upload model | 9 months ago |
| layers.py | 5.39 kB | init | 9 months ago |
| model-00001-of-00003.safetensors | 4.98 GB (LFS) | Upload model | 9 months ago |
| model-00002-of-00003.safetensors | 4.93 GB (LFS) | Upload model | 9 months ago |
| model-00003-of-00003.safetensors | 3 GB (LFS) | Upload model | 9 months ago |
| model.py | 19.4 kB | init | 9 months ago |
| model.safetensors.index.json | 34.9 kB | Upload model | 9 months ago |
| modeling_hyena.py | 5.55 kB | init | 9 months ago |
| positional_embeddings.py | 4.94 kB | init | 9 months ago |
| special_tokens_map.json | 3 Bytes | Update byte tokenizer to be compatible with auto tokenizer and clean-up. | 9 months ago |
| streamer.py | 3.94 kB | init | 9 months ago |
| tokenizer.py | 4.37 kB | Remove tokenizer.json and replace tokenizer.py with correct version. | 9 months ago |
| tokenizer_config.json | 243 Bytes | Update byte tokenizer to be compatible with auto tokenizer and clean-up. | 9 months ago |
| utils.py | 2.87 kB | init | 9 months ago |
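The commit messages above refer to a byte tokenizer (note that special_tokens_map.json is only 3 bytes, and tokenizer.py was rewritten for compatibility with `AutoTokenizer`, so there is no vocab or merges file in the listing). As an illustration of the general technique only, a minimal byte-level encode/decode for genomic text could look like the sketch below; the class name and method signatures are assumptions for this example, not the actual API of this repo's tokenizer.py.

```python
# Minimal sketch of a byte-level tokenizer for DNA/genomic text.
# NOTE: illustrative only -- the real tokenizer.py in this repo may
# differ in names and behavior.

class ByteTokenizer:
    """Maps each UTF-8 byte of the input string to its integer value.

    The vocabulary is the 256 possible byte values, so no vocab or
    merges files are needed -- consistent with the tiny
    special_tokens_map.json in the file listing above.
    """

    vocab_size = 256

    def encode(self, text: str) -> list[int]:
        # One character of an ACGT sequence is one byte, i.e. one token.
        return list(text.encode("utf-8"))

    def decode(self, token_ids: list[int]) -> str:
        return bytes(token_ids).decode("utf-8")


tok = ByteTokenizer()
ids = tok.encode("ACGT")
print(ids)              # -> [65, 67, 71, 84]
print(tok.decode(ids))  # -> ACGT
```

A byte-level scheme sidesteps out-of-vocabulary issues entirely, which suits long nucleotide sequences where the alphabet is tiny but inputs can span the model's 131k-token context.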