Tags: Text Generation · Safetensors · English · Tamil · gpt2 · QnQ

QnQGPT Model

This is a custom GPT model based on the GPT-2 architecture.

Model Details

  • Model Type: GPT-2
  • Base Model: gpt2 (its configuration can be inspected with the snippet below)
  • Training Data: [Describe your training data]
  • Use Cases: [Describe intended use cases]
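
As referenced in the Base Model entry, the architecture hyperparameters and parameter count can be confirmed directly from the hosted checkpoint. This is a minimal sketch: the repo id is taken from the Usage section, and the GPT-2 small values noted in the comments (12 layers, 12 heads, 768 hidden size, about 124M parameters) are expectations based on the card's metadata, not values verified here.

from transformers import AutoConfig, AutoModelForCausalLM

# Read the architecture hyperparameters without loading the weights
config = AutoConfig.from_pretrained("karthikqnq/qnqgpt")
print(config.model_type, config.n_layer, config.n_head, config.n_embd)  # expected: gpt2 12 12 768

# Count parameters by loading the model (GPT-2 small is roughly 124M)
model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.0f}M parameters")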

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
tokenizer = AutoTokenizer.from_pretrained("karthikqnq/qnqgpt")

# Generate text from a short prompt
text = "Hello, how are"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=50)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
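
The call above uses greedy decoding with max_length=50, which can get repetitive on open-ended prompts. Continuing from the same model, tokenizer, and inputs objects, a sampled-generation variant is sketched below; the temperature and top_p values are illustrative choices, not settings recommended by this card.

# Sampled generation (illustrative settings)
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))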

Training Details

[Add your training details here]

Limitations

[Add model limitations here]

License

This model is released under the MIT License.

Model Size

124M parameters, F32 tensors, stored in Safetensors format.
