Update README.md
README.md CHANGED
@@ -64,5 +64,87 @@ model-index:
library_name: transformers
---

# ibleducation/ibl-tutoring-llm-openchat
ibleducation/ibl-tutoring-llm-openchat is a model finetuned on top of [openchat/openchat_3.5](https://huggingface.co/openchat/openchat_3.5).

This model is finetuned to give responses in a way befitting of a professional teacher.

## Example Conversations
1. Question: I'm having trouble understanding this art movement. Can you help me? \
Answer:
```
Yes I can assist you, that is what I am here for. We can start by analyzing the artists' techniques, their motivations, and find creative ways to connect with the movement. Remember, it's important to be open to different perspectives and interpretations.
```

2. Question: I am thinking of stopping school. Do you think it is right? \
Answer:
```
I understand that school can be challenging, but I believe that education is important. I'm here for you. Where you have troubles, I will be happy to help.
```

## Motivation for Developing the ibl-tutoring-llm-openchat Model

Students today use LLMs in their learning and research. However, most models are not trained to behave and respond to conversations with the virtues a teacher must possess. The ibl-tutoring-llm-openchat model is finetuned
on top of openchat/openchat_3.5 to alter its behaviour so that it converses the way a teacher should.

## Model Details

- **Developed by:** [IBL Education](https://ibl.ai)
- **Model type:** [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- **Base Model:** [OpenChat 3.5](https://huggingface.co/openchat/openchat_3.5)
- **Language:** English
- **Finetuned from weights:** [OpenChat 3.5](https://huggingface.co/openchat/openchat_3.5)
- **Finetuned on data:**
  - IBL-tutoring-dataset (private)
- **Model License:** Apache 2.0

## How to Use the ibl-tutoring-llm-openchat Model from Python Code (HuggingFace transformers)

### Install the necessary packages

Requires: [transformers](https://pypi.org/project/transformers/) 4.34.0 or later, [flash-attn](https://pypi.org/project/flash-attn/) 2.3.1.post1 or later,
and [accelerate](https://pypi.org/project/accelerate/) 0.23.0 or later.

```shell
pip install transformers==4.34.0
pip install accelerate==0.23.0
pip install flash-attn==2.3.1.post1
```

### You can then try the following example code

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-tutoring-llm-openchat"

# Load the tokenizer and model; device_map="auto" places the weights on the available device(s).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
)

# Build a text-generation pipeline around the loaded model.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

# The question is wrapped in the model's prompt template (see below).
prompt = "<s>What makes a good teacher?</s>"

sequences = pipeline(
    prompt,
    max_new_tokens=400,
    do_sample=False,
    return_full_text=False,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"{seq['generated_text']}")
```

**Important** - Use the prompt template below for ibleducation/ibl-tutoring-llm-openchat:

```
<s>{prompt}</s>
```
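
For instance, any question can be wrapped in this template before being passed to the pipeline. The sketch below assumes the `pipeline` and `tokenizer` objects created in the example above; the `build_prompt` helper and the sample question are illustrative and not part of the released model card.

```python
# Minimal sketch: apply the <s>{prompt}</s> template to an arbitrary question.
# Assumes `pipeline` and `tokenizer` were created as in the example above;
# `build_prompt` and the sample question are illustrative.
def build_prompt(question: str) -> str:
    return f"<s>{question}</s>"

question = "How can I prepare for my history exam?"
sequences = pipeline(
    build_prompt(question),
    max_new_tokens=400,
    do_sample=False,
    return_full_text=False,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
print(sequences[0]["generated_text"])
```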