NGME-LLama 264M

  • Trained on 4× NVIDIA A6000 GPUs for ~4 days
  • Trained on ~4.9 billion tokens (4 × 16 × 768 × 100,000, presumably GPUs × per-device batch size × sequence length × training steps)
  • Trained on the C4 corpus
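
For reference, a minimal loading sketch using the Hugging Face `transformers` Auto classes. The `trust_remote_code=True` flag is an assumption here, since NGME models typically ship custom modeling code rather than using a stock architecture; the prompt text is illustrative only.

```python
# Minimal usage sketch (assumes the standard transformers AutoModel API;
# trust_remote_code=True is an assumption for custom NGME modeling code).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PatrickHaller/ngme-llama-264M"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation from a prompt.
inputs = tokenizer("The C4 corpus is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```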