Model Summary

GritLM is a generative representational instruction-tuned language model. It unifies text representation (embedding) and text generation in a single model, achieving state-of-the-art performance on both types of tasks.

Model Description
GritLM 7B: Mistral 7B finetuned using GRIT
GritLM 8x7B: Mixtral 8x7B finetuned using GRIT

Use

Usage of the model is documented here.
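The linked documentation covers the official API; as an illustrative sketch only (not GritLM's actual implementation), the embedding side of a GRIT-style model typically mean-pools the final hidden states over non-padding tokens and compares the resulting vectors with cosine similarity. The toy hidden states and helper names below are hypothetical:

```python
import numpy as np

def masked_mean_pool(hidden, mask):
    """Mean-pool token vectors, ignoring padding positions (mask == 0)."""
    mask = mask[..., None].astype(hidden.dtype)     # (batch, seq, 1)
    summed = (hidden * mask).sum(axis=1)            # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid division by zero
    return summed / counts

def cosine_sim(a, b):
    """Pairwise cosine similarity between two batches of vectors."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

# Toy stand-in for final-layer hidden states: 2 sequences, 3 tokens, 4 dims.
rng = np.random.default_rng(0)
hidden = rng.normal(size=(2, 3, 4))
mask = np.array([[1, 1, 0], [1, 1, 1]])  # first sequence has one padding token

emb = masked_mean_pool(hidden, mask)     # (2, 4) sequence embeddings
sims = cosine_sim(emb, emb)              # (2, 2) similarity matrix
```

In the real model the `hidden` tensor would come from a forward pass in embedding mode; generation uses the same weights with the usual autoregressive decoding path.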

Citation

@misc{muennighoff2024generative,
      title={Generative Representational Instruction Tuning}, 
      author={Niklas Muennighoff and Hongjin Su and Liang Wang and Nan Yang and Furu Wei and Tao Yu and Amanpreet Singh and Douwe Kiela},
      year={2024},
      eprint={2402.09906},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
Model size: 7.24B parameters (safetensors, BF16)
