# Model Card for Model ID
This model has been created with Argilla and trained with Transformers.
## Model training
Training the model using the `ArgillaTrainer`:
```python
from argilla.feedback import ArgillaTrainer, FeedbackDataset, TrainingTask

# Load the dataset from Argilla (requires an active Argilla connection):
dataset = FeedbackDataset.from_argilla("...")

# Create the training task, mapping each record to a (text, label) pair:
def formatting_func(sample):
    text = sample["text"]
    label = sample["label"][0]["value"]
    return (text, label)

task = TrainingTask.for_text_classification(formatting_func=formatting_func)

# Create the ArgillaTrainer:
trainer = ArgillaTrainer(
    dataset=dataset,
    task=task,
    framework="transformers",
    model="bert-base-cased",
)
trainer.update_config({
    "evaluation_strategy": "epoch",
    "logging_dir": "./logs",
    "logging_steps": 1,
    "num_train_epochs": 1,
    "output_dir": "textcat_model_transformers",
    "use_mps_device": True,
})
trainer.train(output_dir="textcat_model_transformers")
```
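As an alternative to writing a `formatting_func`, the unified `TrainingTask` API can also be pointed at a dataset field and a label question directly. The sketch below assumes this dataset has a text field named `text` and a label question named `label`; adjust the names to match your dataset's schema.

```python
# Alternative task definition: reference a field and a label question directly
# instead of supplying a formatting_func. The names "text" and "label" are
# assumptions about this dataset's schema.
task = TrainingTask.for_text_classification(
    text=dataset.field_by_name("text"),
    label=dataset.question_by_name("label"),
)
```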
You can test the model's predictions like so:

```python
trainer.predict("This is awesome!")
```
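Because the fine-tuned model is a standard Transformers checkpoint, you can also load it for inference with the `transformers` pipeline. This is a minimal sketch that assumes the trained weights were saved to the `textcat_model_transformers` output directory configured above.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the output directory used during training.
# The path is an assumption based on the output_dir configured above.
classifier = pipeline("text-classification", model="textcat_model_transformers")
classifier("This is awesome!")
```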
## Model Details
### Model Description
- Developed by: [More Information Needed]
- Shared by [optional]: [More Information Needed]
- Model type: [More Information Needed]
- Language(s) (NLP): [More Information Needed]
- License: [More Information Needed]
- Finetuned from model [optional]: [More Information Needed]
## Technical Specifications [optional]
### Framework Versions
- Python: 3.9.17
- Argilla: 1.21.0-dev
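To check whether your local environment matches these versions, a quick sketch (assuming both `argilla` and `transformers` are installed) is:

```python
import argilla
import transformers

# Print installed versions to compare against the ones listed above.
print("Argilla:", argilla.__version__)
print("Transformers:", transformers.__version__)
```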