Model Card for distilbert-anli

Model Details

Model Description

A fine-tuned version of distilbert/distilbert-base-uncased, trained on the facebook/anli dataset.
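For reference, inference with this model can be sketched via the standard transformers text-classification pipeline. This is a minimal sketch, not the card author's own example: the label names the model returns depend on its config and are an assumption here.

```python
def classify(premise: str, hypothesis: str, model_id: str = "kweinmeister/distilbert-anli"):
    """Classify an NLI premise/hypothesis pair with the fine-tuned model.

    A minimal sketch: the label names returned (e.g. LABEL_0/1/2 for
    entailment/neutral/contradiction) depend on the model's config and
    are an assumption here.
    """
    # Imported lazily so the function can be defined without transformers installed.
    from transformers import pipeline

    classifier = pipeline("text-classification", model=model_id)
    # NLI models take the premise and hypothesis as a sentence pair.
    return classifier({"text": premise, "text_pair": hypothesis})

# Example usage (downloads the model weights on first call):
# classify("A man is playing a guitar.", "A person is making music.")
```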

  • Developed by: Karl Weinmeister
  • Language(s) (NLP): en
  • License: apache-2.0
  • Finetuned from model: distilbert/distilbert-base-uncased

Training Hyperparameters

  • Training regime: The model was trained for 5 epochs with a batch size of 128.
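The facebook/anli dataset uses three labels (0 = entailment, 1 = neutral, 2 = contradiction). A minimal sketch of mapping the model's raw class logits to one of these labels, using only the standard library; the helper name and the example logits are illustrative, not real model output:

```python
import math

# Label order in the facebook/anli dataset: 0=entailment, 1=neutral, 2=contradiction.
ANLI_LABELS = ["entailment", "neutral", "contradiction"]

def logits_to_label(logits):
    """Softmax over the three class logits, then pick the argmax label."""
    exps = [math.exp(x - max(logits)) for x in logits]  # subtract max for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return ANLI_LABELS[best], probs[best]

# Illustrative logits, not real model output:
label, prob = logits_to_label([2.0, 0.5, -1.0])  # highest logit -> "entailment"
```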
  • Model size: 67M parameters (F32, Safetensors format)
