This is a smaller version of the google/mt5-base model, keeping only the embeddings for Spanish tokens.

  • The original model has 582M parameters, 237M of which are input and output embeddings.
  • After shrinking the sentencepiece vocabulary from 250K to 25K tokens (the 25K most frequent Spanish tokens), the model was reduced to 237M parameters and its size dropped from 2.2GB to 0.9GB, about 41% of the original.
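The shrinking step above amounts to keeping only the embedding rows for the most frequent tokens and remapping token ids. A minimal toy sketch of that idea (with made-up sizes and frequencies; the actual script, corpus, and counts are not shown in this card):

```python
import numpy as np

# Toy stand-ins: 1000 tokens instead of 250K, hidden size 8 instead of 768.
rng = np.random.default_rng(0)
vocab_size, hidden = 1000, 8
embeddings = rng.normal(size=(vocab_size, hidden))

# Hypothetical per-token frequencies counted on a Spanish corpus.
freqs = rng.integers(0, 10_000, size=vocab_size)

# Keep the top-k most frequent tokens (stand-in for the 25K kept here),
# preserving their original relative order.
top_k = 100
kept_ids = np.sort(np.argsort(freqs)[::-1][:top_k])

# Slice out the smaller embedding matrix and build an old-id -> new-id map.
small_embeddings = embeddings[kept_ids]
old_to_new = {int(old): new for new, old in enumerate(kept_ids)}

print(small_embeddings.shape)  # (100, 8)
```

In the real model the same row-selection is applied to both the input embedding matrix and the output (lm_head) projection, which is what removes most of the 237M embedding parameters.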
