Fine-tuned Gemma 2 2b on Persian

This is a Gemma 2 2b model fine-tuned on Persian instruction datasets found on Hugging Face. Thanks to its small size, it can be used for various general tasks.

Inference

This is a fine-tuned version of the Gemma 2 2b model, which means it can be used in the same manner as the base model:

# pip install accelerate
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("cnababaie/gemma2-2b-fa-ft")
model = AutoModelForCausalLM.from_pretrained(
    "cnababaie/gemma2-2b-fa-ft",
    device_map="auto",
    torch_dtype=torch.bfloat16,  # roughly halves memory vs. the default F32 weights
)

# Persian: "Translate this text to English: The soil turned to mud from the heavy rain."
input_text = "این متنو به انگلیسی ترجمه کن: خاک از شدت باران به گل تبدیل شد."
input_ids = tokenizer(input_text, return_tensors="pt").to(model.device)

outputs = model.generate(**input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))

This correctly translates the sentence as "The soil turned to mud from the heavy rain.", whereas the original Gemma 2 2b model gives a wrong, repetitive answer.
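For instruction-style prompts, Gemma 2 models use a specific chat turn format, which `tokenizer.apply_chat_template` produces automatically. As a minimal sketch, assuming this fine-tune kept the standard Gemma 2 control tokens, the single-turn prompt can also be built by hand:

```python
def build_gemma_prompt(user_message: str) -> str:
    """Build a single-turn prompt in the standard Gemma 2 chat format.

    Assumption: the fine-tune kept Gemma's control tokens; in practice,
    prefer tokenizer.apply_chat_template, which handles this for you.
    """
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = build_gemma_prompt("این متنو به انگلیسی ترجمه کن: خاک از شدت باران به گل تبدیل شد.")
print(prompt)
```

Feeding such a prompt (or the output of `apply_chat_template`) into `model.generate` generally yields answers closer to the instruction-tuning distribution than a raw, untemplated string.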

Model size: 2.61B params (F32, Safetensors)
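Weight memory scales linearly with parameter count and dtype width, which is why loading in bfloat16 matters for a checkpoint stored in F32. A rough sizing sketch (weights only, ignoring activations and KV cache):

```python
# Approximate weight memory for the 2.61B-parameter checkpoint by dtype.
PARAMS = 2.61e9
BYTES_PER_PARAM = {"float32": 4, "bfloat16": 2, "int8": 1}

def weight_gib(dtype: str, params: float = PARAMS) -> float:
    """Weights-only memory estimate in GiB (excludes activations and KV cache)."""
    return params * BYTES_PER_PARAM[dtype] / 2**30

for dtype in BYTES_PER_PARAM:
    print(f"{dtype}: ~{weight_gib(dtype):.1f} GiB")
```

By this estimate the F32 weights need close to 10 GiB, while bfloat16 fits comfortably on common consumer GPUs.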

Model tree for cnababaie/gemma2-2b-fa-ft

Base model: google/gemma-2-2b
Quantizations: 2 models
