---
library_name: transformers
license: apache-2.0
language:
- sk
base_model:
- mistralai/Mistral-7B-v0.1
---

# Model Card for mistral-sk-7b

**mistral-sk-7b** is a Slovak language version of the Mistral-7B-v0.1 large language model with 7 billion parameters.

## Model Details

**mistral-sk-7b** is a Slovak language model obtained by full-parameter finetuning of the Mistral-7B-v0.1 large language model on data from the Araneum Slovacum VII Maximum web corpus.

The model was developed in collaboration between the Department of Cybernetics and Artificial Intelligence (Faculty of Electrical Engineering and Informatics, Technical University of Košice), the Centre of Social and Psychological Sciences of the Slovak Academy of Sciences, and the Ľ. Štúr Institute of Linguistics of the Slovak Academy of Sciences.

This is a base pre-trained model that can be further finetuned for downstream tasks in the Slovak language. Note that the model does not have any moderation mechanisms.
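As a base model, it can be loaded for plain text continuation with the standard Transformers API. The sketch below is illustrative only: the repository id is a placeholder (use the model's actual Hub id), and the dtype and sampling settings should be adapted to your hardware and task.

```python
# Minimal usage sketch with Hugging Face Transformers.
# NOTE: "mistral-sk-7b" is a placeholder -- replace it with the full
# Hub repository id of this model before running.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistral-sk-7b"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory use vs. float32
    device_map="auto",           # place weights on available devices
)

# A Slovak prompt; the base model simply continues the text
# (there is no chat template or instruction format).
prompt = "Najvyšší vrch Slovenska je"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model rather than an instruction-tuned one, it is best suited to text continuation or as a starting point for task-specific finetuning.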

- **Language:** Slovak
- **License:** Apache License 2.0
- **Finetuned from model:** Mistral-7B-v0.1
- **Authors:**
  - Peter Bednár, Department of Cybernetics and Artificial Intelligence, Faculty of Electrical Engineering and Informatics, Technical University of Košice
  - Marek Dobeš, Centre of Social and Psychological Sciences of the Slovak Academy of Sciences and ČZ o.z.
  - Radovan Garabík, Ľ. Štúr Institute of Linguistics, Slovak Academy of Sciences, supported by DiusAI a. s.

## Supported by

- Part of the research results was obtained using the high-performance computing resources operated by CINECA, awarded within the National Leonardo access call 2023 by the Centre of Operations of the Slovak Academy of Sciences and the Slovak National Supercomputing Centre.