GysBERT v1

This model is a historical language model for Dutch, developed within the MacBERTh project.

The architecture is based on BERT base uncased, and the model was pre-trained with the original BERT pre-training codebase. The training material comes mostly from the DBNL and the Delpher newspaper dump. Details can be found in the accompanying publication: Non-Parametric Word Sense Disambiguation for Historical Languages.
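
Since the architecture follows BERT base, the checkpoint can be loaded with the Hugging Face `transformers` library. The snippet below is a minimal sketch, assuming the usual `AutoTokenizer`/`AutoModel` API and the repository id `emanjavacas/GysBERT` shown on this page; the example sentence is invented.

```python
from transformers import AutoModel, AutoTokenizer

# Load the checkpoint from the Hub (repository id taken from this page).
tokenizer = AutoTokenizer.from_pretrained("emanjavacas/GysBERT")
model = AutoModel.from_pretrained("emanjavacas/GysBERT")

# Encode a (made-up) Dutch sentence and inspect the contextual embeddings.
inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for BERT base
```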

The model has been successfully tested on word sense disambiguation tasks, as discussed in the paper referenced above.
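
The paper's exact evaluation setup is not reproduced here, but the sketch below illustrates the general non-parametric idea: assign a new occurrence of an ambiguous word the sense of its nearest labelled occurrence in the model's contextual embedding space. The helper `embed_target`, the example sentences, and the sense labels are all invented for illustration.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("emanjavacas/GysBERT")
model = AutoModel.from_pretrained("emanjavacas/GysBERT")
model.eval()

def embed_target(sentence: str, target: str) -> torch.Tensor:
    """Mean-pool the hidden states of the subword span covering `target`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, hidden_size)
    span = tokenizer(target, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(span) + 1):
        if ids[i:i + len(span)] == span:
            return hidden[i:i + len(span)].mean(dim=0)
    raise ValueError(f"{target!r} not found in tokenization of {sentence!r}")

# A few labelled occurrences of the ambiguous word "bank" (invented data).
labelled = [
    ("Hij bracht het geld naar de bank.", "geldinstelling"),
    ("Zij rustte op de houten bank in de tuin.", "zitmeubel"),
]
vectors = torch.stack([embed_target(s, "bank") for s, _ in labelled])

# Label a new occurrence with the sense of its nearest labelled neighbour.
query = embed_target("De bank leende hem honderd gulden.", "bank")
sims = F.cosine_similarity(query.unsqueeze(0), vectors)
print(labelled[int(sims.argmax())][1])  # sense of the nearest neighbour
```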

An updated version with an enlarged pre-training dataset is due soon.
