---
metrics:
- rouge
- bleu
- bleurt
model-index:
- name: ibleducation/ibl-tutoring-chat-7B
results:
- task:
name: truthfulqa_gen
type: text-generation
dataset:
type: truthful_qa
name: Truthful QA
metrics:
- type: bleurt
name: bleurt_max
value: -0.5572
- type: bleurt
name: bleurt_acc
value: 0.4321
- type: bleurt
name: bleurt_diff
value: -0.0725
- type: bleu
name: bleu_max
value: 22.5935
- type: bleu
name: bleu_acc
value: 0.3758
- type: bleu
name: bleu_diff
value: -2.5541
- type: rouge
name: rouge1_max
value: 50.0851
- type: rouge
name: rouge1_acc
value: 0.3978
- type: rouge
name: rouge1_diff
value: -3.5142
- type: rouge
name: rouge2_max
value: 34.7473
- type: rouge
name: rouge2_acc
value: 0.339
- type: rouge
name: rouge2_diff
value: -4.5082
- type: rouge
name: rougeL_max
value: 46.1054
- type: rouge
name: rougeL_acc
value: 0.3745
- type: rouge
name: rougeL_diff
value: -4.0046
datasets:
- ibleducation/ibl-best-practices-instructor-dataset
library_name: transformers
---
# ibleducation/ibl-tutoring-chat-7B

ibleducation/ibl-tutoring-chat-7B is a model finetuned on top of openchat/openchat_3.5.
This model is finetuned to respond in the manner of a professional teacher, exhibiting characteristics and virtues such as compassion, encouragement, and friendliness.
## Example Conversations

Question: I'm having trouble understanding this art movement. Can you help me?

Answer: Yes, I can assist you; that is what I am here for. We can start by analyzing the artists' techniques and motivations, and find creative ways to connect with the movement. Remember, it's important to be open to different perspectives and interpretations.
Question: I am thinking of stopping school. Do you think it is right?

Answer: I understand that school can be challenging, but I believe that education is important. I'm here for you. Wherever you have trouble, I will be happy to help.
## Model Details

- Developed by: IBL Education
- Model type: Mistral-7B-v0.1
- Base Model: OpenChat 3.5
- Language: English
- Finetuned from weights: OpenChat 3.5
- Finetuned on data:
  - ibleducation/ibl-best-practices-instructor-dataset
- Model License: Apache 2.0
## How to Use ibl-tutoring-chat-7B Model from Python Code (HuggingFace transformers)

Install the necessary packages. Requires: transformers 4.35.0 or later, and accelerate 0.23.0 or later.

```shell
pip install transformers==4.35.0
pip install accelerate==0.23.0
```
You can then try the following example code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-tutoring-chat-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

prompt = "<s>What makes a good teacher?</s>"
response = pipeline(prompt)
# The pipeline returns a list with one dict per generated sequence.
print(response[0]["generated_text"])
```
Important - Use the prompt template below for ibl-tutoring-chat-7B:

```
<s>{prompt}</s>
```
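A minimal sketch of applying this template before calling the pipeline; `format_prompt` is a hypothetical helper name for illustration, not part of the model's tooling:

```python
def format_prompt(user_message: str) -> str:
    """Wrap a raw user message in the <s>...</s> template expected by the model."""
    return f"<s>{user_message}</s>"

# Build the prompt used in the snippet above.
prompt = format_prompt("What makes a good teacher?")
print(prompt)  # <s>What makes a good teacher?</s>
```

Keeping the template in one place like this avoids subtle mismatches (e.g. a missing closing `</s>`) between different call sites.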