MahaGemma-7B

MahaGemma-7B is a Marathi Gemma model: a Gemma 7B (google/gemma-7b) model LoRA fine-tuned on translated Marathi datasets ([dataset link](https://github.com/l3cube-pune/MarathiNLP)).

This is part of the MahaNLP initiative. More details coming soon.

Prompt format:

<bos>\n### Instruction:\nमहाराष्ट्राची राजधानी काय आहे?\n\n### Input:\n\n\n### Response:\nमहाराष्ट्राची राजधानी मुंबई आहे

(The example instruction asks "What is the capital of Maharashtra?"; the response is "The capital of Maharashtra is Mumbai.")
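Below is a minimal usage sketch with the Hugging Face transformers library, assuming the merged weights are loadable from the repository id shown on this page (l3cube-pune/marathi-gpt-gemma-7b). The build_prompt helper is illustrative, not part of the released code; it simply reproduces the prompt format above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "l3cube-pune/marathi-gpt-gemma-7b"  # repo id from this page

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # the published tensors are BF16
    device_map="auto",
)

def build_prompt(instruction: str, input_text: str = "") -> str:
    """Reproduce the Alpaca-style prompt format documented above (illustrative helper)."""
    return (
        "<bos>\n### Instruction:\n" + instruction
        + "\n\n### Input:\n" + input_text
        + "\n\n### Response:\n"
    )

prompt = build_prompt("महाराष्ट्राची राजधानी काय आहे?")
# <bos> is already written into the prompt string, so skip automatic special tokens.
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```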

Citing:

@article{joshi2022l3cube,
  title={L3cube-mahanlp: Marathi natural language processing datasets, models, and library},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2205.14728},
  year={2022}
}

Model Family:
MahaGemma-2B
MahaGemma-7B

Model size: 8.54B params (Safetensors, BF16)