---
license: apache-2.0
language:
- en
pipeline_tag: text-classification
---
# Monarch Mixer-BERT
The 80M-parameter checkpoint for M2-BERT-base from the paper [Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture](https://arxiv.org/abs/2310.12109).
This model was pretrained with a maximum sequence length of 8192 and has been fine-tuned for retrieval.
This model was trained by Dan Fu, Jon Saad-Falcon, and Simran Arora.
Check out our [GitHub](https://github.com/HazyResearch/m2/tree/main) for instructions on how to download and fine-tune it!
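
As a quick sketch of how a checkpoint like this is typically loaded for embedding generation (the repo id, the use of the `bert-base-uncased` tokenizer, and the `sentence_embedding` output key are assumptions here; the GitHub repo above has the authoritative instructions):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical repo id -- substitute the actual Hugging Face path for this checkpoint.
model_id = "togethercomputer/m2-bert-80M-8k-retrieval"

# trust_remote_code is needed because M2-BERT ships custom modeling code.
model = AutoModelForSequenceClassification.from_pretrained(
    model_id,
    trust_remote_code=True,
)

# Assumes the standard BERT uncased tokenizer, as M2-BERT follows the BERT vocabulary.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

inputs = tokenizer(
    "Monarch Mixer is a simple sub-quadratic architecture.",
    return_tensors="pt",
    padding="max_length",
    max_length=8192,  # matches the pretraining sequence length
)

with torch.no_grad():
    outputs = model(**inputs)

# Assumes the remote modeling code returns the pooled embedding under this key.
embedding = outputs["sentence_embedding"]
print(embedding.shape)  # (1, embedding_dim)
```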