HingGPT-Devanagari

HingGPT-Devanagari is a Hindi-English code-mixed GPT model trained on Devanagari text. It is a GPT2 model trained on L3Cube-HingCorpus.
[dataset link](https://github.com/l3cube-pune/code-mixed-nlp)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.08398).
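A minimal usage sketch with the Hugging Face `transformers` library. The Hub id `l3cube-pune/hing-gpt-devanagari`, the prompt, and the generation settings below are assumptions for illustration, not details confirmed by this card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face Hub id for this model.
model_name = "l3cube-pune/hing-gpt-devanagari"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Devanagari code-mixed prompt (example text, not taken from the corpus).
prompt = "मुझे लगता है कि"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation from the language model.
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
generated = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated)
```

Since this is a causal (GPT2-style) model, `AutoModelForCausalLM` is the appropriate head for free-form text generation.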

Other models from the HingBERT family:
- HingBERT
- HingMBERT
- HingBERT-Mixed
- HingBERT-Mixed-v2
- HingRoBERTa
- HingRoBERTa-Mixed
- HingGPT
- HingGPT-Devanagari
- HingBERT-LID

If you use this model, please cite:

@inproceedings{nayak-joshi-2022-l3cube,
    title = "{L}3{C}ube-{H}ing{C}orpus and {H}ing{BERT}: A Code Mixed {H}indi-{E}nglish Dataset and {BERT} Language Models",
    author = "Nayak, Ravindra  and Joshi, Raviraj",
    booktitle = "Proceedings of the WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.wildre-1.2",
    pages = "7--12",
}
Model size: 137M parameters (Safetensors; tensor types F32 and U8).