Update README.md
README.md
CHANGED
@@ -11,7 +11,7 @@ FalconLite2 evolves from [FalconLite](https://huggingface.co/amazon/FalconLite),
 |Model|Fine-tuned on long contexts| Quantization | Max context length| RotaryEmbedding adaptation| Inference framework|
 |----------|-------------:|-------------:|------------:|-----------:|-----------:|
 | FalconLite | No | 4-bit GPTQ |12K | [dNTK](https://www.reddit.com/r/LocalLLaMA/comments/14mrgpr/dynamically_scaled_rope_further_increases/) | TGI 0.9.2 |
-| FalconLite2 | Yes | 4-bit GPTQ |24K | rope_theta = 1000000 | TGI 1.0.3 |
+| FalconLite2 | Yes | 4-bit GPTQ |24K | rope_theta = 1000000 | TGI 1.0.3 & TGI 1.1.0 |
 
 ## Model Details
 
@@ -38,6 +38,15 @@ cd falconlite-dev/falconlite2
 ./docker_build.sh
 ./start_falconlite.sh
 ```
+### Start TGI server-1.1.0
+```bash
+git clone https://github.com/awslabs/extending-the-context-length-of-open-source-llms.git falconlite-dev
+cd falconlite-dev/falconlite2-tgi1.1.0
+# this may take a while to build updated vLLM CUDA kernels
+./docker_build_rebuild_vllm_rope-theta.sh
+./start_falconlite.sh
+```
+
 ### Perform inference
 ```bash
 # after FalconLite has been completely started