---
title: README
emoji: 👋
colorFrom: gray
colorTo: green
sdk: static
pinned: false
---
<div align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/j4NWHo9opdcv3jV69T91k.png" style="width: 65%">
</div>
Online Demo | GitHub | Paper | Discord
# Welcome to Yi! 👋
The Yi model family is a series of language and multimodal models. It builds on 6B and 34B pretrained base language models, which are extended to **chat** models, 200K **long-context** models, depth-upscaled models (9B), and **vision**-language models.
# ⭐️ Highlights
- **Strong performance**: Yi-1.5-34B is on par with or surpasses GPT-3.5 in commonsense reasoning, college exams, math, coding, reading comprehension, and human preference win-rate on multiple evaluation benchmarks.
- **Cost-effective**: The 6B, 9B, and 34B models can run inference on consumer-grade hardware (such as an RTX 4090). The 34B model is also large enough to exhibit complex reasoning and emergent abilities, offering a good performance-cost balance; see the inference sketch after this list.
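As a concrete illustration of the consumer-grade inference claim, here is a minimal sketch using Hugging Face `transformers`. The repo id `01-ai/Yi-6B-Chat`, the prompt, and the generation settings are illustrative assumptions, not prescribed by this README:

```python
# Minimal sketch: chat inference with a Yi model via transformers.
# Assumes a CUDA GPU with enough VRAM for bfloat16 weights at 6B.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-6B-Chat"  # assumed checkpoint; swap in the variant you use

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~2 bytes/param; a 6B model fits on one RTX 4090
    device_map="auto",           # place weights on the available GPU(s)
)

# Build a chat-formatted prompt and generate a reply.
messages = [{"role": "user", "content": "Explain long-context models in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)

# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

At 6B, bfloat16 weights take roughly 12 GB of VRAM, which is why a single 24 GB card suffices; the 34B model generally needs quantization (e.g., 4-bit) to fit on the same hardware.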
# 📊 Benchmarks
TBD
<div align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/kVHWz7yEY3UJlcRD2nwf2.png" style="width: 65%">
</div>
# 📰 News
- 2024-03-16: The Yi-9B-200K is open-sourced and available to the public.
- 2024-03-08: Yi Tech Report is published!
- 2024-03-07: The long-context capability of Yi-34B-200K has been enhanced.
For the complete news history, see [News](xx.md).