---
license: mit
pipeline_tag: image-text-to-text
library_name: transformers
base_model:
  - internlm/internlm2-chat-1_8b
base_model_relation: merge
language:
  - multilingual
tags:
  - internvl
  - vision
  - ocr
  - custom_code
  - moe
---

# Mono-InternVL-2B

This repository contains the instruction-tuned Mono-InternVL-2B model, which has 1.8B activated parameters (3B in total). It is built upon [internlm2-chat-1_8b](https://huggingface.co/internlm/internlm2-chat-1_8b).

Please refer to our [**paper**](https://huggingface.co/papers/2410.08202), [**project page**](https://internvl.github.io/blog/2024-10-10-Mono-InternVL/), and [**GitHub repository**](https://github.com/OpenGVLab/mono-internvl) for an introduction and usage details.



## Citation

If you find this project useful in your research, please consider citing:

```BibTeX
@article{luo2024mono,
  title={Mono-InternVL: Pushing the Boundaries of Monolithic Multimodal Large Language Models with Endogenous Visual Pre-training},
  author={Luo, Gen and Yang, Xue and Dou, Wenhan and Wang, Zhaokai and Liu, Jiawen and Dai, Jifeng and Qiao, Yu and Zhu, Xizhou},
  journal={arXiv preprint arXiv:2410.08202},
  year={2024}
}
```