---
license: apache-2.0
datasets:
- simecek/wikipedie_20230601
language:
- cs
---
This is a Mistral-7B model fine-tuned with QLoRA on Czech Wikipedia data. The model is primarily intended as a base for further fine-tuning on Czech-specific NLP tasks, such as summarization and question answering. This adaptation improves performance on tasks that require an understanding of the Czech language and its context.