---
library_name: transformers
model_name: Vikhr-2-VL-2b-Instruct-experimental
base_model:
- Qwen/Qwen2-VL-2B
language:
- ru
- en
license: apache-2.0
---
# 💨👁️ Vikhr-2-VL-2b-Instruct-experimental
**Vikhr-2-VL-2b-Instruct-experimental** is a compact VLM based on [Qwen/Qwen2-VL-2B](https://huggingface.co/Qwen/Qwen2-VL-2B), trained on a translated **LLaVA-150K** dataset and specifically fine-tuned for processing Russian-language input.
The fine-tuned model is experimental and will not always behave as expected (especially for OCR). For feedback, use [Vikhr Models](https://t.me/vikhrlabs).
## Try now:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/18n9_Aylc87EviAgZeQjlGTLGz-FQ2Q4l?usp=sharing)
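Outside the Colab, the model can be loaded with the standard Qwen2-VL classes in `transformers`, since it shares the base model's architecture. The sketch below follows that generic pattern and is not taken from this card: the `Vikhrmodels/...` repo id and the `run` helper are assumptions for illustration.

```python
# Minimal inference sketch, following the standard Qwen2-VL usage pattern
# from the transformers documentation.
# NOTE: the repo id below is an assumption, not confirmed by this card.
from PIL import Image

MODEL_ID = "Vikhrmodels/Vikhr-2-VL-2b-Instruct-experimental"  # assumed repo id


def build_messages(question: str) -> list:
    """Chat messages in the Qwen2-VL multimodal format: one image placeholder plus a text question."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image"},
                {"type": "text", "text": question},
            ],
        }
    ]


def run(image_path: str, question: str) -> str:
    """Illustrative helper: load the model, run one image+text turn, return the decoded answer."""
    # Heavy imports kept local so build_messages stays importable without transformers.
    import torch
    from transformers import AutoProcessor, Qwen2VLForConditionalGeneration

    model = Qwen2VLForConditionalGeneration.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    processor = AutoProcessor.from_pretrained(MODEL_ID)

    messages = build_messages(question)
    prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
    image = Image.open(image_path)
    inputs = processor(text=[prompt], images=[image], return_tensors="pt").to(model.device)

    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=256)
    # Strip the prompt tokens before decoding the generated answer.
    trimmed = out[:, inputs["input_ids"].shape[1]:]
    return processor.batch_decode(trimmed, skip_special_tokens=True)[0]
```

Usage would be e.g. `run("photo.jpg", "Что изображено на картинке?")`; given the experimental status noted above, answers (OCR in particular) should be checked rather than trusted blindly.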
### Authors
- Nikolay Kompanets, [LakoMoor](https://t.me/lakomoordev), [Vikhr Team](https://t.me/vikhrlabs)
- Sergey Bratchikov, [NlpWonder](https://t.me/nlpwanderer), [Vikhr Team](https://t.me/vikhrlabs)
- Konstantin Korolev, [underground](https://t.me/mlunderground), [Vikhr Team](https://t.me/vikhrlabs)
- Aleksandr Nikolich, [Vikhr Team](https://t.me/vikhrlabs)
```
@inproceedings{nikolich2024vikhr,
  title={Vikhr: Constructing a State-of-the-art Bilingual Open-Source Instruction-Following Large Language Model for {Russian}},
  author={Aleksandr Nikolich and Konstantin Korolev and Sergei Bratchikov and Nikolay Kompanets and Igor Kiselev and Artem Shelmanov},
  booktitle={Proceedings of the 4th Workshop on Multilingual Representation Learning (MRL) @ EMNLP-2024},
  year={2024},
  publisher={Association for Computational Linguistics},
  url={https://arxiv.org/pdf/2405.13929}
}
```