---
library_name: transformers
tags:
  - transformers
  - text-generation
  - conversational
license: apache-2.0
datasets:
  - marmikpandya/mental-health
---

# Gemma-2-2B-IT (Fine-Tuned)

This model is a fine-tune of Google's Gemma-2-2B-IT, adapted for conversational text generation. It is compatible with Hugging Face's `transformers` library.

## Fine-Tuning Details

- **Dataset:** [marmikpandya/mental-health](https://huggingface.co/datasets/marmikpandya/mental-health), a mental health / therapist conversation dataset
- **Method:** LoRA (Low-Rank Adaptation), a parameter-efficient technique suited to low-resource fine-tuning
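To illustrate why LoRA is cheap to train: instead of updating the full weight matrix `W`, it learns a low-rank pair `B` and `A` whose product is added to the frozen weights. A minimal NumPy sketch (dimensions and rank are illustrative, not this model's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, r = 64, 64, 4          # layer dimensions and LoRA rank (r << d, k)
alpha = 8                    # LoRA scaling factor

W = rng.normal(size=(d, k))  # frozen pretrained weight (not trained)
A = rng.normal(size=(r, k))  # trainable down-projection
B = np.zeros((d, r))         # trainable up-projection, zero-initialized
                             # so the update starts at zero

delta = (alpha / r) * (B @ A)  # low-rank update, rank at most r
W_adapted = W + delta          # effective weight after adaptation

# Only A and B are trained: r*(d+k) parameters instead of d*k
print(d * k, r * (d + k))      # full vs. LoRA trainable parameter counts
```

Here only 512 of the 4096 weight-matrix parameters would be trained, which is what makes LoRA practical on modest hardware.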

## Usage Example

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Replace "YourModelRepoName" with this model's Hub repository id
model = AutoModelForCausalLM.from_pretrained("YourModelRepoName")
tokenizer = AutoTokenizer.from_pretrained("YourModelRepoName")

input_text = "What is the meaning of life?"
inputs = tokenizer(input_text, return_tensors="pt")

# Cap generation length; the default is very short
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
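Since this is an instruction-tuned, conversational model, prompts should follow Gemma's turn format rather than being passed as raw text. In practice `tokenizer.apply_chat_template` handles this; the sketch below builds the markup by hand purely to show its shape (the `build_gemma_prompt` helper is illustrative, not part of any library):

```python
# Illustrative helper: render chat messages into Gemma's turn markup.
# Prefer tokenizer.apply_chat_template in real code, which produces this
# format (plus special tokens) for Gemma-IT checkpoints.

def build_gemma_prompt(messages):
    """Render a list of {"role", "content"} dicts into Gemma's turn format."""
    prompt = ""
    for m in messages:
        prompt += f"<start_of_turn>{m['role']}\n{m['content']}<end_of_turn>\n"
    prompt += "<start_of_turn>model\n"  # cue the model to begin its reply
    return prompt

messages = [{"role": "user", "content": "I have been feeling anxious lately."}]
print(build_gemma_prompt(messages))
```

The resulting string can be tokenized and passed to `model.generate` exactly as in the example above.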