puneeshkhanna committed
Commit c984855 · verified · 1 Parent(s): addb1af

Update README.md

Files changed (1): README.md (+4 −3)
README.md CHANGED
@@ -20,7 +20,7 @@ This repository contains the **Falcon3-3B-Base**. It achieves strong results on
 Falcon3-3B-Base supports 4 languages (English, French, Spanish, Portuguese) and a context length of up to 8K.
 Falcon3-3B-Base, pruned (depth + width) from Falcon3-7B-Base, was efficiently trained on only 100 GT using a knowledge distillation objective.
 
-⚠️ **This is a raw, pretrained model, which should be further finetuned for most use cases.**
+⚠️ **This is a raw, pretrained model, which should be further finetuned using SFT, RLHF, continued pretraining, etc. for most use cases.**
 
 ## Model Details
 - Architecture
@@ -29,8 +29,9 @@ Falcon3-3B-Base, pruned (depth + width) from Falcon3-7B-Base, was efficiently tra
 - Grouped query attention (GQA) for faster inference: 12 query heads and 4 KV heads
 - Wider head dimension: 256
 - High RoPE value to support long-context understanding: 1000042
-- 8k context length
-- 131k vocab size
+- Uses SwiGLU and RMSNorm
+- 8K context length
+- 131K vocab size
 - Pruned and healed from Falcon3-7B-Base on only 100 gigatokens of data comprising web, code, STEM, high-quality, and multilingual data, using 2048 H100 GPUs
 - Supports EN, FR, ES, PT
 - Developed by [Technology Innovation Institute](https://www.tii.ae)
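
The architecture bullets above map onto standard config fields, so they can be sanity-checked directly. The following is a minimal sketch, not part of the commit: it assumes the standard `transformers` `AutoConfig` API and that the checkpoint is published under a repo id like `tiiuae/Falcon3-3B-Base` (an assumption based on the TII link above).

```python
# Not part of the commit: a minimal sketch for cross-checking the architecture
# bullets above. Assumes the standard transformers AutoConfig API and an
# assumed repo id of "tiiuae/Falcon3-3B-Base".
from transformers import AutoConfig

config = AutoConfig.from_pretrained("tiiuae/Falcon3-3B-Base")

print(config.num_attention_heads)      # expected: 12 query heads (GQA)
print(config.num_key_value_heads)      # expected: 4 KV heads (GQA)
print(config.rope_theta)               # expected: 1000042 (high RoPE base)
print(config.max_position_embeddings)  # expected: 8192, i.e. the 8K context
print(config.vocab_size)               # expected: ~131K tokens

# head_dim is not an explicit field in every config version; derive it if absent.
head_dim = getattr(config, "head_dim", None) or config.hidden_size // config.num_attention_heads
print(head_dim)                        # expected: 256 (wider head dimension)
```

Field names can vary slightly across `transformers` versions, so treat the prints as a sanity check rather than a contract.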
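The new warning line stresses that this is a raw, pretrained model. A hedged sketch of plain text continuation, under the same repo-id assumption, shows what that means in practice: the base model continues text rather than following instructions.

```python
# Not part of the commit: a hedged sketch of plain text continuation with the
# base model, illustrating why further finetuning (SFT, RLHF, etc.) is advised
# before assistant-style use. Repo id is assumed as above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon3-3B-Base"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# A base model continues the prompt; it does not reliably follow instructions.
inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```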