MahaBERT-SQuAD

MahaBERT-SQuAD is a MahaBERT model fine-tuned on L3Cube-MahaSQuAD, a translated Marathi question-answering dataset ([dataset link](https://github.com/l3cube-pune/MarathiNLP)).

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2404.13364).
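The fine-tuned model can be used for extractive question answering via the Hugging Face `transformers` pipeline. The sketch below is a minimal illustration only: the repository id, context, and question are assumed placeholders, not values taken from this model card or from MahaSQuAD.

```python
from transformers import pipeline

# Hypothetical Hugging Face repo id; replace with the actual identifier
# under which MahaBERT-SQuAD is published (an l3cube-pune repository).
MODEL_ID = "l3cube-pune/marathi-question-answering-squad-bert"  # assumption

# Extractive question-answering pipeline built on the fine-tuned model.
qa_pipeline = pipeline("question-answering", model=MODEL_ID, tokenizer=MODEL_ID)

# Illustrative Marathi context and question (placeholders, not from the dataset).
context = "पुणे हे महाराष्ट्र राज्यातील एक प्रमुख शहर आहे."
question = "पुणे कोणत्या राज्यात आहे?"

result = qa_pipeline(question=question, context=context)
print(result["answer"], result["score"])
```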

```bibtex
@article{ghatage2024mahasquad,
  title={MahaSQuAD: Bridging Linguistic Divides in Marathi Question-Answering},
  author={Ghatage, Ruturaj and Kulkarni, Aditya and Patil, Rajlaxmi and Endait, Sharvi and Joshi, Raviraj},
  journal={arXiv preprint arXiv:2404.13364},
  year={2024}
}
```
Model size: 237M parameters (F32 weights, Safetensors format)