Quantization made by Richard Erkhov.
fine-tuned-gpt-neo - bnb 8bits
- Model creator: https://huggingface.co/Torrchy/
- Original model: https://huggingface.co/Torrchy/fine-tuned-gpt-neo/
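The card does not include loading instructions for the 8-bit variant. A minimal sketch, assuming an equivalent bitsandbytes 8-bit load of the original checkpoint (the quantized repo id is not given here, so the original model id is used as a stand-in; the 8-bit settings are an assumption rather than this card's exact recipe, and bitsandbytes plus accelerate must be installed):

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumption: quantize the original checkpoint to 8-bit on load with bitsandbytes;
# the pre-quantized weights published with this card may have been produced differently.
quant_config = BitsAndBytesConfig(load_in_8bit=True)
model = AutoModelForCausalLM.from_pretrained(
    "Torrchy/fine-tuned-gpt-neo",
    quantization_config=quant_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Torrchy/fine-tuned-gpt-neo")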
Original model description:
license: mit
language:
- en
base_model:
- EleutherAI/gpt-neo-1.3B
library_name: transformers
Fine-tuned GPT-Neo Model
This is a fine-tuned version of EleutherAI's GPT-Neo 1.3B, adapted for specific downstream tasks.
Model Details
- Model Type: GPT-Neo
- Fine-tuned for: [Specify tasks or datasets]
Usage
To load the model and tokenizer, run the following code:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("Torrchy/fine-tuned-gpt-neo")
tokenizer = AutoTokenizer.from_pretrained("Torrchy/fine-tuned-gpt-neo")
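Once loaded, text can be generated in the usual causal-LM way. The prompt and decoding parameters below are illustrative, not taken from the original card:

# Illustrative generation example; prompt and decoding settings are assumptions.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))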