Update README.md
README.md
CHANGED
@@ -1,3 +1,19 @@
+---
+language:
+- en
+library_name: openvino
+pipeline_tag: text-generation
+base_model: tiiuae/Falcon3-7B-Instruct
+tags:
+- openvino
+- optimized
+- int4
+- awq
+- falcon
+- falcon3
+- instruction-tuned
+---
+
 # Falcon3-7B-Instruct OpenVINO INT4
 
 This repository contains the [tiiuae/Falcon3-7B-Instruct](https://huggingface.co/tiiuae/Falcon3-7B-Instruct) model optimized for inference with Intel's OpenVINO runtime. The model has been quantized to INT4 using the AWQ quantization scheme for improved performance while maintaining quality.
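
For reference, a minimal inference sketch using Hugging Face Optimum Intel, which can load OpenVINO IR weights directly from a hub repository. This assumes `optimum[openvino]` and `transformers` are installed; the `MODEL_ID` below is a placeholder, not this repository's confirmed id.

```python
# Minimal sketch (assumptions: optimum[openvino] and transformers installed;
# MODEL_ID is a hypothetical placeholder for this OpenVINO INT4 repository).
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

MODEL_ID = "your-org/Falcon3-7B-Instruct-int4-ov"  # hypothetical id; replace with this repo's actual id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# The repository already contains the INT4 OpenVINO IR, so no export or
# re-quantization step is needed at load time.
model = OVModelForCausalLM.from_pretrained(MODEL_ID)

# Falcon3-7B-Instruct is instruction-tuned, so apply its chat template.
messages = [{"role": "user", "content": "Give me a one-sentence summary of OpenVINO."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```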