---
license: apache-2.0
pipeline_tag: text-generation
tags:
- pretrained
- mistral
- 7b
inference: true
widget:
- messages:
  - role: user
    content: What is your favorite condiment?
---

# Model Card for Mistral-7B-v0.2

Mistral-7B-v0.2 is the base model from which the Mistral-7B-Instruct-v0.2 Large Language Model (LLM) was fine-tuned.

Mistral-7B-v0.2 has the following changes compared to Mistral-7B-v0.1:
- 32k context window (vs 8k context in v0.1)
- Rope-theta = 1e6
- No Sliding-Window Attention
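The rope-theta change is what supports the longer context: it is the base of the rotary position embedding (RoPE) frequencies. As a minimal sketch (the `rope_frequencies` helper, the head dimension of 128, and the v0.1 base of 1e4 are assumptions, not taken from this card), raising the base from 1e4 to 1e6 lowers the slowest rotation frequencies, so positions remain distinguishable across a 32k window:

```python
def rope_frequencies(head_dim: int, theta: float) -> list[float]:
    # RoPE base frequencies: theta^(-2i/d) for each pair of dimensions i.
    return [theta ** (-2 * i / head_dim) for i in range(head_dim // 2)]

# Assumed values: head_dim=128 and a v0.1 base of 1e4 (the common default).
f_v01 = rope_frequencies(128, 1e4)  # v0.1-style frequencies
f_v02 = rope_frequencies(128, 1e6)  # v0.2 uses rope-theta = 1e6

# The slowest (last) frequency drops, stretching positional resolution
# over the longer 32k context.
assert f_v02[-1] < f_v01[-1]
```

This is only an illustration of the parameter's role; the model's actual implementation lives in its released weights and inference code.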

For full details of this model please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/la-plateforme/).