Diana-7B
This is Diana-7B, rated 93.56/100 by GPT-4 on a collection of 30 synthetic prompts that were themselves generated by GPT-4.
Diana stands for Deep Insight and Analytical Narrative Assistant. It is a merge of the following models, built with mergekit:
- mlabonne/AlphaMonarch-7B: This model has impressive conversational abilities, formal and sophisticated style, and strong reasoning skills.
- sethuiyer/Aika-7b: A merge of SanjiWatsuki/Silicon-Maid-7B, Guilherme34/Samantha-v2, jan-hq/stealth-v1.3, and senseable/WestLake-7B-v2, Aika-7b is designed for natural and human-like interactions, accurate information delivery, comprehensive analysis, emotional intelligence, clarity, and structure.
- SanjiWatsuki/Silicon-Maid-7B: This model is known for its excellent multi-turn conversational skills and logical coherence.
- sethuiyer/Nandine-7b: A merge of senseable/Westlake-7B, Guilherme34/Samantha-v2, and uukuguy/speechless-mistral-six-in-one-7b, Nandine-7b excels in narrative skill, empathetic interaction, intellectual depth, and eloquent communication.
By combining these models, Diana-7B offers a balanced blend of capabilities, making it a well-rounded AI companion for creative writing, thoughtful discussion, problem-solving, and general assistance.
OpenLLM Benchmark
Model | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
---|---|---|---|---|---|---|---|
sethuiyer/Diana-7B | 70.6 | 68.34 | 86.73 | 64.58 | 60.55 | 80.19 | 63.23 |
Nous Benchmark
Model | AGIEval | GPT4All | TruthfulQA | Bigbench | Average |
---|---|---|---|---|---|
Diana-7B | 44.38 | 75.1 | 60.55 | 44.58 | 56.09 |
Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: mlabonne/AlphaMonarch-7B
dtype: bfloat16
merge_method: dare_ties
models:
  - model: mlabonne/AlphaMonarch-7B
  - model: sethuiyer/Aika-7B
    parameters:
      density: 0.85
      weight: 0.30
  - model: SanjiWatsuki/Silicon-Maid-7B
    parameters:
      density: 0.85
      weight: 0.50
  - model: sethuiyer/Nandine-7b
    parameters:
      density: 0.85
      weight: 0.30
parameters:
  int8_mask: true
```
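To reproduce the merge, the config above can be saved to a file and passed to mergekit's `mergekit-yaml` command. The sketch below is a minimal example of doing that from Python; the config filename and output directory are illustrative assumptions, not part of the original card.

```python
# Minimal sketch: run the merge above via mergekit's mergekit-yaml CLI.
# Assumes mergekit is installed (pip install mergekit) and the YAML config
# is saved as diana.yaml; the output directory name is arbitrary.
import subprocess

subprocess.run(
    ["mergekit-yaml", "diana.yaml", "./Diana-7B-merged"],
    check=True,
)
```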
Prompt Template
```
{bos}user
{ .Prompt }{eos}
{bos}assistant
```
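For a quick local test with Hugging Face transformers, a sketch like the following can be used. It assumes the repository's tokenizer ships a chat template matching the format above; the example prompt and sampling parameters are illustrative, not taken from the card.

```python
# Minimal sketch: generate a reply from Diana-7B with transformers.
# Assumes the tokenizer's chat template matches the prompt format above;
# the prompt and sampling parameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sethuiyer/Diana-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Outline a short story about a lighthouse keeper."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```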
GGUF
GGUF files are available at Diana-7B-GGUF.
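To run the quantized weights locally, a llama-cpp-python sketch is shown below. The GGUF filename is a hypothetical example; substitute the file you actually download from the GGUF repository.

```python
# Minimal sketch: run a quantized Diana-7B GGUF file with llama-cpp-python.
# The model_path below is a hypothetical filename; use the file you downloaded.
from llama_cpp import Llama

llm = Llama(model_path="./diana-7b.Q4_K_M.gguf", n_ctx=4096)
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me three journal prompts for reflection."}],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])
```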
Ollama
Diana is now available on Ollama. You can run it with `ollama run stuehieyr/diana` in your terminal. If you have limited computing resources, check out this video to learn how to run it on a Google Colab backend.
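Beyond the CLI, the same model can be called programmatically. The minimal sketch below assumes the `ollama` Python package is installed and a local Ollama server is already serving the model; neither is mentioned in the original card.

```python
# Minimal sketch: call the Ollama-hosted model programmatically.
# Assumes the `ollama` Python package is installed (pip install ollama)
# and a local Ollama server has pulled stuehieyr/diana.
import ollama

response = ollama.chat(
    model="stuehieyr/diana",
    messages=[{"role": "user", "content": "Suggest a title for an essay on quiet ambition."}],
)
print(response["message"]["content"])
```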
Evaluation results
All scores below are from the Open LLM Leaderboard:
- AI2 Reasoning Challenge (25-shot, test set): 68.34 (normalized accuracy)
- HellaSwag (10-shot, validation set): 86.73 (normalized accuracy)
- MMLU (5-shot, test set): 64.58 (accuracy)
- TruthfulQA (0-shot, validation set): 60.55 (mc2)
- Winogrande (5-shot, validation set): 80.19 (accuracy)
- GSM8K (5-shot, test set): 63.23 (accuracy)