|
--- |
|
title: README |
|
emoji: 👋
|
colorFrom: gray |
|
colorTo: green |
|
sdk: static |
|
pinned: false |
|
--- |
|
|
|
<div align="center"> |
|
<img src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/AI0k5c4O_Y2tidEBTxMdw.jpeg" style="width: 10%"> |
|
</div> |
|
|
|
<div align="center"> |
|
<img src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/xcy1rwFGbrVZ1N68LQcEI.gif" style="width: 65%"> |
|
</div> |
|
|
|
<p style="margin-top: 0px;" align="center"> |
|
<a rel="nofollow" href="https://discord.gg/hYUwWddeAu"> |
|
<img style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;" alt="Discord Logo" src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png"> |
|
<span class="link-text">Discord</span> |
|
</a> •
|
<a rel="nofollow" href="https://arxiv.org/abs/2403.04652"> |
|
<img style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;" alt="ArXiv Logo" src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true"> |
|
<span class="link-text">Paper</span> |
|
</a> |
|
</p> |
|
|
|
|
|
<div class="grid lg:grid-cols-3 gap-x-4 gap-y-7"> |
|
<a href="https://www.01.ai/" class="block overflow-hidden group"> |
|
<div |
|
class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center bg-[#FFFFFF]" |
|
> |
|
<img alt="" src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/ADRH_f61dt8uVWBsAehkG.gif" class="w-40" /> |
|
</div> |
|
<div align="center">Base Models<br/> Yi-6B/9B/34B</div> |
|
</a> |
|
<a |
|
href="https://www.01.ai/" |
|
class="block overflow-hidden" |
|
> |
|
<div class="flex items-center h-40 bg-[#FFFFFF] rounded-lg px-4 mb-2"> |
|
<img alt="" src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/JuiI5Zun1XD5BuHCK0L1I.gif" class="w-40" /> |
|
|
</div> |
|
<div align="center">Chat Models <br/> Yi-6B/9B/34B Chat</div> |
|
</a> |
|
<a |
|
href="https://www.01.ai/" |
|
class="block overflow-hidden group" |
|
> |
|
<div class="flex items-center h-40 bg-[#FFFFFF] rounded-lg px-4 mb-2"> |
|
<img alt="" src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/7D6SjExHLIO1cH0tmQyxh.gif" class="w-40" /> |
|
</div> |
|
<div align="center" class="underline">Multimodal Models <br/> Yi-VL-6B/34B</div> |
|
</a> |
|
<div class="lg:col-span-3"> |
|
<p class="mb-4"> |
|
</p> |
|
|
|
Welcome to Yi! 👋
|
|
|
The Yi model family is a series of language and multimodal models. It starts from 6B and 34B pretrained language models, which are then extended to **chat** models, 200K **long-context** models, depth-upscaled 9B models, and **vision**-language models.
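
As a quick taste of the chat models, here is a minimal inference sketch using the Hugging Face `transformers` library. The `01-ai/Yi-1.5-6B-Chat` checkpoint is assumed as an example; any other Yi chat checkpoint can be swapped in.

```python
# Minimal chat inference sketch (illustrative, not an official quickstart).
# Assumes `transformers` and `accelerate` are installed and a GPU is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-1.5-6B-Chat"  # example checkpoint; 9B/34B work the same way

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps memory use modest
    device_map="auto",           # place weights on the available GPU(s)
)

# Chat checkpoints expect the conversation rendered with their chat template.
messages = [{"role": "user", "content": "What is a 200K context window useful for?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```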
|
|
|
# ⭐️ Highlights
|
|
|
- **Strong performance**: Yi-1.5-34B is on par with or surpasses GPT-3.5 in commonsense reasoning, college exams, math, coding, reading comprehension, and human-preference win rate on multiple evaluation benchmarks.
|
|
|
- **Cost-effective**: The 6B, 9B, and 34B models can run inference on consumer-grade hardware (such as an RTX 4090). The 34B model is also large enough to show complex reasoning and emergent abilities, striking a good performance-cost balance; see the quantized-loading sketch below.
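
To make the consumer-hardware claim concrete, here is a hedged sketch of loading the 34B chat model with 4-bit quantization so the weights fit in roughly 24 GB of VRAM (e.g. a single RTX 4090). It assumes the `01-ai/Yi-34B-Chat` checkpoint and the `bitsandbytes` library; actual headroom depends on context length and batch size.

```python
# 4-bit quantized loading sketch (illustrative assumptions, not an official recipe).
# Requires `transformers`, `accelerate`, and `bitsandbytes`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "01-ai/Yi-34B-Chat"  # example checkpoint

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # store weights in 4 bits
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for quality/speed
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # spread layers across available devices
)
```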
|
|
|
# 📊 Benchmarks
|
|
|
TBD |
|
|
|
<div align="center"> |
|
<img src="https://cdn-uploads.huggingface.co/production/uploads/656d9adce8bf55919aca7c3f/kVHWz7yEY3UJlcRD2nwf2.png" style="width: 65%"> |
|
</div> |
|
|
|
|
|
# 📰 News
|
|
|
- 2024-03-16: Yi-9B-200K is open-sourced and available to the public.
|
|
|
- 2024-03-08: The Yi Tech Report is published!
|
|
|
- 2024-03-07: The long-text capability of Yi-34B-200K has been enhanced.
|
|
|
For the complete news history, see [News](xx.md).
|
|