SmolLM2 Fine-tuned Model

This is a fine-tuned version of SmolLM2-135M, trained for 5050 steps on custom text data.

Model Details

  • Base model: SmolLM2-135M
  • Training steps: 5050
  • Parameters: 135M
  • Context length: 2048 tokens
  • Training data: custom text data
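
Usage

Below is a minimal loading and generation sketch using the Hugging Face transformers library. The repository id jatingocodeo/SmolLM2 is assumed from the Space reference on this page; adjust the id, generation settings, and device placement to your setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; replace it if the checkpoint lives elsewhere.
model_id = "jatingocodeo/SmolLM2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a short prompt and generate a continuation.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,   # stays well within the 2048-token context
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the base model has only 135M parameters, this snippet runs comfortably on CPU.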