---
pipeline_tag: text-generation
datasets:
  - cerebras/SlimPajama-627B
language:
  - en
---

This repo contains the trained 1.3-billion-parameter LLaMA-2-architecture model checkpoints for the work *Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining*.