---
license: apache-2.0
datasets:
- simecek/wikipedie_20230601
language:
- cs
---
This is a Mistral-7B model fine-tuned with QLoRA on Czech Wikipedia data. The model is primarily intended as a base for further fine-tuning on Czech-specific NLP tasks, such as summarization and question answering. This adaptation improves performance on tasks that require an understanding of the Czech language and context.
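
As a starting point for further work, the checkpoint can be loaded with the 🤗 Transformers and PEFT libraries. The sketch below is a minimal example under two assumptions: that the weights are published as a LoRA adapter on top of `mistralai/Mistral-7B-v0.1`, and that the adapter repo id shown (`simecek/mistral7b-cs-wikipedia`) is a placeholder to be replaced with this model's actual id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"          # assumed base checkpoint
adapter_id = "simecek/mistral7b-cs-wikipedia"  # placeholder adapter repo id

# Load the base model in 4-bit (NF4), mirroring a typical QLoRA setup.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)

# Attach the Czech-Wikipedia LoRA adapter on top of the quantized base model.
model = PeftModel.from_pretrained(model, adapter_id)

# Quick generation check with a Czech prompt.
prompt = "Praha je hlavní město"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

From here, the model can be fine-tuned further on a downstream Czech dataset, e.g. by wrapping it in a standard PEFT/TRL training loop.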