---
license: mit
---
This is the 110M-parameter Llama 2-architecture model trained on the TinyStories dataset.
The weights were converted from
[karpathy/tinyllamas](https://huggingface.co/karpathy/tinyllamas).
See the [llama2.c](https://github.com/karpathy/llama2.c) project for more details.
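
A minimal usage sketch, assuming the converted weights in this repo are in the standard Hugging Face `transformers` format and that the repo id is `nickypro/tinyllama-110M` (inferred from this page, not stated in the card):

```python
# Minimal sketch: load the converted checkpoint with transformers and
# generate a short TinyStories-style continuation.
# Assumption: the repo id "nickypro/tinyllama-110M" hosts HF-format weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "nickypro/tinyllama-110M"  # hypothetical/inferred repo id
tok = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tok("Once upon a time", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=30)
print(tok.decode(out[0], skip_special_tokens=True))
```

The model is small enough to run on CPU, which makes it convenient for quick experiments with the TinyStories setup.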