
Model Card for roberta-large-kazqad-informatics_kaz

This model is a fine-tuned version of nur-dev/roberta-large-kazqad, adapted for question-answering tasks in the Kazakh language using the Kundyzka/informatics_kaz dataset.

Model Details

Model Description

This model has been optimized for question answering in Kazakh within the domain of informatics. It is based on the RoBERTa-large architecture and was fine-tuned to answer technical and academic questions in this field.

  • Developed by: Tleubayeva Arailym, Saparbek Makhambet, Bassanova Nurgul, Shomanov Aday, Sabitkhanov Askhat
  • Model type: Transformer-based (RoBERTa)
  • Language(s) (NLP): Kazakh (kk)
  • License: apache-2.0
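
How to Use

Below is a minimal usage sketch, assuming the checkpoint is compatible with the Hugging Face transformers question-answering pipeline; the example question and context are illustrative only and not taken from the training data.

```python
# Minimal sketch: load the fine-tuned checkpoint with the transformers
# question-answering pipeline (assumes standard RoBERTa QA compatibility).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="Arailym-aitu/roberta-large-kazqad-informatics_kaz",
)

# Illustrative Kazakh informatics question and context.
result = qa(
    question="Алгоритм дегеніміз не?",  # "What is an algorithm?"
    context=(
        "Алгоритм — белгілі бір есепті шешуге арналған нақты, "
        "реттелген қадамдар тізбегі."
    ),
)
print(result["answer"], result["score"])
```

The pipeline returns the most likely answer span extracted from the supplied context together with a confidence score.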

Evaluation

Results

  • Exact Match (EM): 56.69 (+316.2%)
  • F1: 69.70 (+220.8%)
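
For reference, the sketch below illustrates what these metrics measure: a simplified, assumed version of SQuAD-style Exact Match and token-level F1. It is not the evaluation script used for the numbers above, and it omits the answer normalization (punctuation, casing rules) that official implementations apply.

```python
# Simplified illustration of Exact Match and token-overlap F1 for
# extractive QA; real evaluation scripts add answer normalization.
from collections import Counter

def exact_match(prediction: str, reference: str) -> float:
    # 1.0 only if the predicted span matches the reference exactly.
    return float(prediction.strip().lower() == reference.strip().lower())

def f1_score(prediction: str, reference: str) -> float:
    # Token-level overlap between prediction and reference.
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("екі", "екі"))                                    # 1.0
print(round(f1_score("нақты қадамдар тізбегі", "қадамдар тізбегі"), 2))  # 0.8
```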

Model Card Authors

Tleubayeva Arailym

Saparbek Makhambet

Bassanova Nurgul

Sabitkhanov Askhat

Model size: 354M parameters (Safetensors, F32)

Model tree for Arailym-aitu/roberta-large-kazqad-informatics_kaz

  • Base model: nur-dev/roberta-large-kazqad
  • This model: Arailym-aitu/roberta-large-kazqad-informatics_kaz (fine-tuned)

Dataset used to train Arailym-aitu/roberta-large-kazqad-informatics_kaz

  • Kundyzka/informatics_kaz