---
library_name: transformers
license: apache-2.0
language:
- sk
base_model:
- mistralai/Mistral-7B-v0.1
---
|
|
# Model Card for Mistral-sk-7B-v0.1
|
|
Mistral-sk-7B-v0.1 is a Slovak language version of the Mistral-7B-v0.1 generative large language model with 7 billion parameters.
|
|
## Model Details
|
|
Mistral-sk-7B-v0.1 is a Slovak language model obtained by full-parameter finetuning of the Mistral-7B-v0.1 large language model on data from the Araneum Slovacum VII Maximum web corpus.

The model was developed in a collaboration between the Department of Cybernetics and Artificial Intelligence, Faculty of Electrical Engineering and Informatics, Technical University of Košice; the Centre of Social and Psychological Sciences of the Slovak Academy of Sciences; and the Ľ. Štúr Institute of Linguistics, Slovak Academy of Sciences.

This is a base pre-trained model, obtained by continued training of Mistral-7B-v0.1 on Slovak-language text, and it is intended as a starting point for further finetuning; minimal usage and finetuning sketches are shown after the list below.

|
|
- **Developed by:** Department of Cybernetics and Artificial Intelligence, Faculty of Electrical Engineering and Informatics, Technical University of Košice; Centre of Social and Psychological Sciences of the Slovak Academy of Sciences and Ľ. Štúr Institute of Linguistics, Slovak Academy of Sciences
- **Language:** Slovak
- **License:** Apache License 2.0
- **Finetuned from model:** Mistral-7B-v0.1
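
The model can be loaded and used for text generation with the Transformers library like any other causal language model. The snippet below is a minimal sketch only: the repository id is a placeholder (replace it with the actual Hugging Face Hub id of this model), and the dtype and sampling settings are illustrative choices rather than recommendations. Because this is a base model, it continues the prompt rather than following instructions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/Mistral-sk-7B-v0.1"  # placeholder: actual Hugging Face Hub id of this model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision so the 7B model fits in typical GPU memory
    device_map="auto",
)

# As a base model, it continues the prompt rather than answering it.
prompt = "Slovensko je krajina v strednej Európe,"  # "Slovakia is a country in Central Europe,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```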
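
For further finetuning, any standard causal-language-modeling training setup can be used. The sketch below uses parameter-efficient LoRA adapters via the PEFT library as one possible approach (the released model itself was produced by full-parameter finetuning); the dataset path, hyperparameters, and repository id are placeholders, not the procedure used by the authors.

```python
# Illustrative LoRA finetuning sketch; all names marked "placeholder" are assumptions.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "path/to/Mistral-sk-7B-v0.1"  # placeholder: actual Hub id of this model
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token  # the Mistral tokenizer defines no pad token

model = AutoModelForCausalLM.from_pretrained(model_id)
# Wrap the base model with small trainable LoRA adapters on the attention projections.
model = get_peft_model(
    model,
    LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"),
)

# Placeholder corpus: any plain-text Slovak dataset with one document per line.
dataset = load_dataset("text", data_files={"train": "slovak_corpus.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="mistral-sk-7b-finetuned",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```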
|
|
## Supported by
|
|
- Part of the research results was obtained using the computational resources procured in the national project National Competence Centre for High Performance Computing (project code: 311070AKF2), funded by the European Regional Development Fund, EU Structural Funds Informatization of Society, Operational Program Integrated Infrastructure.
- DiusAI a. s.
|