emrgnt-cmplxty committed "Update README.md" • commit 751aff5 (parent: 9b3c127)
SciPhi-AI is available via a free hosted API, though the exposed model can vary. Currently, SciPhi-Self-RAG-Mistral-7B-32k is available. More details can be found in the docs [here](https://sciphi.readthedocs.io/en/latest/setup/quickstart.html).

## Recommended Chat Formatting

```python
def get_chat_completion(
    self, conversation: list[dict], generation_config: GenerationConfig
) -> str:
    self._check_stop_token(generation_config.stop_token)
    prompt = ""
    added_system_prompt = False
    for message in conversation:
        if message["role"] == "system":
            prompt += f"### System:\n{SciPhiLLMInterface.ALPACA_CHAT_SYSTEM_PROMPT}. Further, the assistant is given the following additional instructions - {message['content']}\n\n"
            added_system_prompt = True
        elif message["role"] == "user":
            last_user_message = message["content"]
            prompt += f"### Instruction:\n{last_user_message}\n\n"
        elif message["role"] == "assistant":
            prompt += f"### Response:\n{message['content']}\n\n"

    if not added_system_prompt:
        prompt = f"### System:\n{SciPhiLLMInterface.ALPACA_CHAT_SYSTEM_PROMPT}.\n\n{prompt}"

    context = self.rag_interface.get_contexts([last_user_message])[0]
    prompt += f"### Response:\n{SciPhiFormatter.RETRIEVAL_TOKEN} {SciPhiFormatter.INIT_PARAGRAPH_TOKEN}{context}{SciPhiFormatter.END_PARAGRAPH_TOKEN}"
    latest_completion = self.model.get_instruct_completion(
        prompt, generation_config
    ).strip()

    return SciPhiFormatter.remove_cruft(latest_completion)
```
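For readers who want to reproduce the prompt layout outside the SciPhi codebase, the following is a minimal, self-contained sketch of the same Alpaca-style assembly. It omits the class context, stop-token check, and RAG retrieval step, and `SYSTEM_PROMPT` is a hypothetical stand-in for `SciPhiLLMInterface.ALPACA_CHAT_SYSTEM_PROMPT`:

```python
# Hypothetical stand-in for SciPhiLLMInterface.ALPACA_CHAT_SYSTEM_PROMPT.
SYSTEM_PROMPT = "Below is an instruction that describes a task"


def format_alpaca_prompt(conversation: list[dict]) -> str:
    """Assemble an Alpaca-style prompt from an OpenAI-style message list."""
    prompt = ""
    added_system_prompt = False
    for message in conversation:
        if message["role"] == "system":
            # System messages are folded into the "### System:" header.
            prompt += (
                f"### System:\n{SYSTEM_PROMPT}. Further, the assistant is given "
                f"the following additional instructions - {message['content']}\n\n"
            )
            added_system_prompt = True
        elif message["role"] == "user":
            prompt += f"### Instruction:\n{message['content']}\n\n"
        elif message["role"] == "assistant":
            prompt += f"### Response:\n{message['content']}\n\n"
    if not added_system_prompt:
        # Fall back to the default system prompt when none was supplied.
        prompt = f"### System:\n{SYSTEM_PROMPT}.\n\n{prompt}"
    # The model generates its answer after the final response header.
    return prompt + "### Response:\n"


print(format_alpaca_prompt([{"role": "user", "content": "What is RAG?"}]))
```

Note that the method above additionally appends the retrieved context between `SciPhiFormatter.INIT_PARAGRAPH_TOKEN` and `SciPhiFormatter.END_PARAGRAPH_TOKEN` after the final response header, which this sketch leaves out.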
## Model Architecture

Base Model: Mistral-7B-v0.1