---
license: cc-by-4.0
language: mr
---
|
|
|
## MahaGemma
|
MahaGemma is a Marathi Gemma model. It is a Gemma 7B (google/gemma-7b) model LoRA fine-tuned on translated Marathi instruction-tuning datasets.
|
[dataset link](https://github.com/l3cube-pune/MarathiNLP)
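The snippet below is a rough sketch of the kind of LoRA adapter setup described above, using the `peft` library on top of google/gemma-7b. The rank, scaling factor, target modules, and dropout are illustrative assumptions, not the configuration actually used for MahaGemma, and the translated instruction data from the linked repository would still need to be tokenized and passed to a trainer.

```python
# Illustrative LoRA adapter setup on top of google/gemma-7b using peft.
# All hyperparameters below are assumptions for demonstration only.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("google/gemma-7b")

lora_config = LoraConfig(
    r=16,             # assumed adapter rank
    lora_alpha=32,    # assumed scaling factor
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the low-rank adapter weights are trainable
```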
|
|
|
This is part of the MahaNLP initiative. More details coming soon. <br>
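Until fuller documentation is available, the following is a minimal inference sketch using the Hugging Face `transformers` library. The repository id shown is a placeholder assumption for where this model is hosted on the Hub, and the generation settings are illustrative rather than recommended values.

```python
# Minimal inference sketch (the model id is a placeholder; adjust as needed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "l3cube-pune/MahaGemma"  # placeholder Hub id, an assumption

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # Gemma 7B weights are commonly loaded in bfloat16
    device_map="auto",
)

# A Marathi instruction prompt ("What is the capital of Maharashtra?")
prompt = "महाराष्ट्राची राजधानी कोणती आहे?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```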
|
|
|
## Citing
|
```
@article{joshi2022l3cube,
  title={L3cube-mahanlp: Marathi natural language processing datasets, models, and library},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2205.14728},
  year={2022}
}
```