|
--- |
|
license: apache-2.0 |
|
tags: |
|
- generated_from_trainer |
|
metrics: |
|
- rouge |
|
model-index: |
|
- name: DaMedSum-small |
|
results: [] |
|
language: |
|
- da |
|
--- |
|
|
|
|
|
``` |
|
_____ ______ __ __ ______ _____ ______ __ __ __ __ |
|
/\ __-. /\ __ \ /\ "-./ \ /\ ___\ /\ __-. /\ ___\ /\ \/\ \ /\ "-./ \ |
|
\ \ \/\ \\ \ __ \\ \ \-./\ \\ \ __\ \ \ \/\ \\ \___ \\ \ \_\ \\ \ \-./\ \ |
|
\ \____- \ \_\ \_\\ \_\ \ \_\\ \_____\\ \____- \/\_____\\ \_____\\ \_\ \ \_\ |
|
\/____/ \/_/\/_/ \/_/ \/_/ \/_____/ \/____/ \/_____/ \/_____/ \/_/ \/_/ |
|
|
|
``` |
|
|
|
## DaMedSum |
|
|
|
This repository contains a model for Danish abstractive summarisation of medical text.
|
|
|
This model is a fine-tuned version of DanSumT5-small, trained on a Danish medical text dataset.
|
|
|
The model was trained on the LUMI supercomputer using a single AMD MI250X GPU.
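
## Usage

A minimal usage sketch with the 🤗 Transformers library, assuming the model is available on the Hugging Face Hub (replace `<org>/DaMedSum-small` with the actual repository id):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repository id -- substitute the real Hub path for this model.
model_id = "<org>/DaMedSum-small"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example Danish medical text (placeholder).
text = "Patienten blev indlagt med brystsmerter og åndenød ..."

inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    **inputs,
    max_length=128,    # cap the summary length
    num_beams=4,       # beam search for more fluent output
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```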
|
|
|
## Authors |
|
- Nicolaj Larsen
- Mikkel Kildeberg
- Emil Schledermann
|
|
|
### Framework versions |
|
|
|
- Transformers 4.30.2 |
|
- PyTorch 1.12.1+git7548e2f
|
- Datasets 2.13.2 |
|
- Tokenizers 0.13.3 |