---
library_name: transformers
license: apache-2.0
language:
  - sl
  - en
tags:
  - llama2
  - Mixtral
  - Slovenian
---

# AntaresAI

We introduce Antares-7b-slovenian, an instruction-tuned and aligned model based on Mixtral-8x7B-v0.1 and Llama-2-70b-hf, fine-tuned for the Slovenian language.

Please refer to the evaluation results table for details.

## Instruction Fine-tuning Strategy

We utilize state-of-the-art instruction fine-tuning methods, including supervised fine-tuning (SFT) and direct preference optimization (DPO).
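The card does not publish its training code. As a minimal illustrative sketch, the standard per-pair DPO objective — a log-sigmoid over the difference between the policy's and a frozen reference model's log-probability margins — can be written as below; the function name and the `beta` value of 0.1 are assumptions for illustration, not details from this model's training run.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Standard DPO loss for one preference pair (hypothetical sketch).

    Each argument is a summed token log-probability of the chosen or
    rejected response under the policy or the frozen reference model.
    """
    chosen_margin = policy_chosen_logp - ref_chosen_logp
    rejected_margin = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_margin - rejected_margin)
    # -log(sigmoid(logits)), written as the numerically simple softplus form
    return math.log(1.0 + math.exp(-logits))

# When the policy matches the reference, the margins cancel and the
# loss sits at log(2); preferring the chosen response drives it lower.
loss = dpo_loss(-1.0, -3.0, -2.0, -2.0)
```

In practice this objective is typically optimized over batches with a library such as TRL rather than hand-rolled.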

## Data Contamination Test Results

Results will be updated soon.

## Evaluation Results

Results will be updated soon.

## Contact Us

Questions and suggestions are welcome in the Discussion tab.