Model Card for khachiya/llm-jp-3-13b-finetune

Instruction-tuned version of llm-jp/llm-jp-3-13b.

Model Details

Model Description

  • Developed by: Koutaro Hachiya
  • Language(s) (NLP): Japanese
  • License: apache-2.0
  • Finetuned from model: llm-jp/llm-jp-3-13b

How to Get Started with the Model

from peft import PeftModel
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the base model and tokenizer, then attach the fine-tuned adapter.
tokenizer = AutoTokenizer.from_pretrained("llm-jp/llm-jp-3-13b")
model = AutoModelForCausalLM.from_pretrained("llm-jp/llm-jp-3-13b", device_map="auto", torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(model, "khachiya/llm-jp-3-13b-finetune")

# Tokenize a prompt and generate a response.
text = "自然言語処理とは何か"  # "What is natural language processing?"
tokenized_input = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        tokenized_input,
        max_new_tokens=100,
        do_sample=True,
        top_p=0.95,
        temperature=0.7,
        repetition_penalty=1.05,
    )[0]
print(tokenizer.decode(output))
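
Optionally, if you want a standalone checkpoint rather than loading the adapter on top of the base model at every run, a LoRA-style PEFT adapter can be folded into the base weights with merge_and_unload. This is a minimal sketch continuing from the snippet above; the output directory name is only an example.

# Optional: merge the adapter weights into the base model and save a
# standalone checkpoint. The output directory name is just an example.
merged_model = model.merge_and_unload()
merged_model.save_pretrained("llm-jp-3-13b-finetune-merged")
tokenizer.save_pretrained("llm-jp-3-13b-finetune-merged")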

Training Details

Training Data

The following file from the ichikara-instruction dataset ("https://liat-aip.sakura.ne.jp/wp/llmのための日本語インストラクションデータ作成/llmのための日本語インストラクションデータ-公開/") was used for fine-tuning:

ichikara-instruction-003-001-1.json
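
The exact preprocessing and prompt template used for fine-tuning are not documented here. As a rough illustration only, the sketch below shows one way such a JSON file could be loaded into instruction/response pairs; the field names ("text" for the instruction, "output" for the response) are assumptions about the ichikara-instruction format, not something stated in this card.

import json

# Rough illustration only: load the instruction data into simple
# instruction/response pairs. The field names "text" and "output" are
# assumptions about the ichikara-instruction JSON format.
with open("ichikara-instruction-003-001-1.json", encoding="utf-8") as f:
    records = json.load(f)

pairs = [{"instruction": r["text"], "response": r["output"]} for r in records]
print(len(pairs))
print(pairs[0]["instruction"][:80])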

Model Card Contact

k.hachiya@gmail.com
