---
license: apache-2.0
datasets:
- pico-lm/pretokenized-paloma
language:
- en
metrics:
- pico-lm/perplexity
pipeline_tag: text-generation
---

# Pico Decoder Tiny

**pico-decoder-tiny** is the smallest (11M) model in the `pico-decoder` suite: a lightweight, LLaMA-style decoder-only transformer trained from scratch using [`pico-train`](https://github.com/pico-lm/pico-train). It is designed for transparent and reproducible research into the learning dynamics of language models, and is fully compatible with the `pico-analyze` toolkit for detailed interpretability analysis.
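
A minimal usage sketch with the Hugging Face `transformers` API is shown below. The repository id `pico-lm/pico-decoder-tiny` is inferred from the model name, and `trust_remote_code=True` is included only as a precaution in case the checkpoint ships custom modeling code; drop it if the model loads as a standard architecture.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "pico-lm/pico-decoder-tiny"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("Language models learn in stages:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```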

## 🔧 Model Details

| Field                 | Value                                  |
|-----------------------|----------------------------------------|
| **Architecture**      | Decoder-only transformer (LLaMA-style) |
| **Parameters**        | 11M                                    |
| **Layers**            | 12                                     |
| **Hidden Size**       | 96                                     |
| **Feed Forward Size** | 384                                    |
| **Attention Heads**   | 12                                     |
| **Key/Value Heads**   | 4                                      |
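
The head counts above imply a grouped-query attention layout: a hidden size of 96 split across 12 query heads gives 8-dimensional heads, and each of the 4 key/value heads is shared by 3 query heads. A shape-only sketch of that sharing in PyTorch follows; the projection and module names inside `pico-train` may differ, this only illustrates the dimensions in the table.

```python
import torch

hidden_size, n_heads, n_kv_heads = 96, 12, 4
head_dim = hidden_size // n_heads      # 96 / 12 = 8
group_size = n_heads // n_kv_heads     # each key/value head serves 3 query heads

x = torch.randn(1, 2048, hidden_size)  # (batch, seq_len, hidden_size)

q_proj = torch.nn.Linear(hidden_size, n_heads * head_dim, bias=False)
k_proj = torch.nn.Linear(hidden_size, n_kv_heads * head_dim, bias=False)
v_proj = torch.nn.Linear(hidden_size, n_kv_heads * head_dim, bias=False)

q = q_proj(x).view(1, -1, n_heads, head_dim)     # (1, 2048, 12, 8)
k = k_proj(x).view(1, -1, n_kv_heads, head_dim)  # (1, 2048, 4, 8)
v = v_proj(x).view(1, -1, n_kv_heads, head_dim)

# Grouped-query attention: each key/value head is repeated so that it
# lines up with its group of query heads before attention is computed.
k = k.repeat_interleave(group_size, dim=2)       # (1, 2048, 12, 8)
v = v.repeat_interleave(group_size, dim=2)
print(q.shape, k.shape, v.shape)
```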

## 📚 Training

- **Dataset**: [`pretokenized-dolma`](https://huggingface.co/datasets/pico-lm/pretokenized-dolma), English-only
- **Training steps**: 200,000
- **Batch size**: 1024
- **Sequence length**: 2048
- **Optimizer**: AdamW
- **Learning rate schedule**: Linear decay with warmup
- **Compute**: 16 A100-SXM4-80GB GPUs
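
For a sense of scale, these hyperparameters imply roughly 0.4 trillion training tokens, assuming the batch size counts sequences per optimizer step (a back-of-the-envelope figure, not an officially reported number):

```python
# Rough token budget implied by the training hyperparameters above.
# Assumption: "batch size" counts sequences (not tokens) per optimizer step.
steps, batch_size, seq_len = 200_000, 1024, 2048
total_tokens = steps * batch_size * seq_len
print(f"{total_tokens:,} tokens")  # 419,430,400,000 (~0.42T)
```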

## 📈 Evaluation and Analysis

This model supports fine-grained analysis using [`pico-analyze`](https://github.com/pico-lm/pico-analyze). The toolkit lets researchers trace how learning unfolds over the course of training, even at very small scales.

We also evaluate the model's perplexity on the [`pretokenized-paloma-tinsy`](https://huggingface.co/datasets/pico-lm/pretokenized-paloma-tinsy) dataset.
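
A rough sketch of how a perplexity evaluation over this dataset could be run with the Hugging Face stack is shown below. The repository ids, the split name, and the `input_ids` column are assumptions, so adjust them to match the actual dataset schema.

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM

# Assumed repository id; trust_remote_code is needed only if the
# checkpoint ships custom modeling code.
model = AutoModelForCausalLM.from_pretrained("pico-lm/pico-decoder-tiny", trust_remote_code=True)
model.eval()

dataset = load_dataset("pico-lm/pretokenized-paloma-tinsy", split="train")  # split name assumed

total_nll, total_tokens = 0.0, 0
with torch.no_grad():
    for example in dataset.select(range(100)):  # small sample, for illustration only
        input_ids = torch.tensor(example["input_ids"]).unsqueeze(0)  # column name assumed
        loss = model(input_ids=input_ids, labels=input_ids).loss     # mean next-token NLL
        total_nll += loss.item() * input_ids.numel()
        total_tokens += input_ids.numel()

print("perplexity:", torch.exp(torch.tensor(total_nll / total_tokens)).item())
```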

## 📄 Citation

If you use `pico-decoder-tiny` or any other `pico-decoder` model in your research, please cite:

```bibtex
@software{pico2025,
    author = {Diehl Martinez, Richard},
    title = {Pico: A Lightweight Framework for Studying Language Model Learning Dynamics},
    year = {2025},
    url = {https://github.com/pico-lm}
}
```