---
language:
- en
- de
- fr
- fa
- ar
- tr
- es
- it
metrics:
- accuracy
pipeline_tag: document-question-answering
tags:
- text-generation-inference
---
This model is a 4-bit quantized version of glm-4v-9b, with some errors fixed so it runs on Google Colab.
It produces strong results (multimodal, multilingual) with less than 10 GB of VRAM.
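Below is a minimal loading sketch following the usage pattern of the upstream glm-4v-9b model card (transformers with `trust_remote_code`). The repo id and image path are placeholders, not this model's actual identifiers; adjust them to your setup.

```python
# Minimal sketch, assuming this quantized checkpoint loads the same way as
# upstream glm-4v-9b (trust_remote_code + apply_chat_template with an image).
import torch
from PIL import Image
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-username/glm-4v-9b-4bit"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
    trust_remote_code=True,
    device_map="auto",
).eval()

# Ask a question about an image; the prompt can be in any supported language.
image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "image": image, "content": "Describe this image."}],
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
    return_dict=True,
).to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```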