This repo contains YugoGPT, the best open-source base 7B LLM for the BCS (Bosnian, Croatian, Serbian) languages, developed by Aleksa Gordić.

More powerful iterations of YugoGPT are already available through the recently announced RunaAI API platform!

Serbian LLM eval results compared to Mistral 7B, LLaMA 2 7B, and GPT2-orao (also see this LinkedIn post):

[Figure: Serbian LLM eval results]

Eval was computed using https://github.com/gordicaleksa/serbian-llm-eval

It was trained on tens of billions of BCS tokens and is based on Mistral 7B.

Notes

  1. YugoGPT is a base model and therefore does not have any moderation mechanisms.

  2. Since it's a base model, it won't follow your instructions; it's essentially a powerful autocomplete engine.

  3. If you want access to much more powerful BCS LLMs (some of which power yugochat), you can access the models through RunaAI's API.
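Because YugoGPT is a base model, you prompt it with raw text to continue rather than with instructions. A minimal sketch of this with Hugging Face transformers is below; the generation settings (sampling, temperature, token budget) are illustrative assumptions, not official recommendations, and the `build_prompt`/`generate` helpers are hypothetical names introduced here for clarity.

```python
def build_prompt(text: str) -> str:
    # Base model: no chat template and no instruction format --
    # just raw text that the model will try to continue.
    return text.strip()


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    # Imports are kept local so the lightweight helper above can be
    # used without pulling in the heavy dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "gordicaleksa/YugoGPT"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16  # weights are published in BF16
    )
    inputs = tokenizer(build_prompt(prompt), return_tensors="pt")
    # Sampling parameters here are assumptions; tune them for your use case.
    output = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    # The model will autocomplete the sentence, e.g. continuing with "Beograd".
    print(generate("Najveći grad u Srbiji je"))
```

Since there is no instruction tuning, phrasing the prompt as a sentence to be completed (rather than a question or command) generally gives better results.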

Credits

The data for the project was obtained with the help of Nikola Ljubešić, CLARIN.SI, and CLASSLA. Thank you!

Project Sponsors

A big thank you to the project sponsors!

Platinum sponsors 🌟

Gold sponsors 🟡

Silver sponsors ⚪

psk.rs, OmniStreak, Luka Važić, Miloš Durković, Marjan Radeski, Marjan Stankovic, Nikola Stojiljkovic, Mihailo Tomić, Bojan Jevtic, Jelena Jovanović, Nenad Davidović, Mika Tasich, TRENCH-NS, Nemanja Grujičić, tim011

Also a big thank you to the following individuals:

Citation

@misc{YugoGPT,
  author       = "Gordić Aleksa",
  title        = "YugoGPT - an open-source LLM for Serbian, Bosnian, and Croatian languages",
  year         = "2024",
  howpublished = {\url{https://huggingface.co/gordicaleksa/YugoGPT}},
}
Model size: 7.24B params (Safetensors) · Tensor type: BF16