---
license: other
license_name: qwen-research
license_link: https://huggingface.co/Qwen/Qwen2.5-3B/blob/main/LICENSE
datasets:
- OpenCoder-LLM/opc-sft-stage1
- OpenCoder-LLM/opc-sft-stage2
- microsoft/orca-agentinstruct-1M-v1
- microsoft/orca-math-word-problems-200k
- NousResearch/hermes-function-calling-v1
- AI-MO/NuminaMath-CoT
- AI-MO/NuminaMath-TIR
- allenai/tulu-3-sft-mixture
- cognitivecomputations/dolphin-coder
- HuggingFaceTB/smoltalk
- cognitivecomputations/samantha-data
- m-a-p/CodeFeedback-Filtered-Instruction
- m-a-p/Code-Feedback
language:
- en
base_model: cognitivecomputations/Dolphin3.0-Qwen2.5-3b
tags:
- llama-cpp
---

# IntelligentEstate/Dolphin3.0_QwenStar-3B-Q8-GGUF

![dolphin qstar.png](https://cdn-uploads.huggingface.co/production/uploads/6593502ca2607099284523db/_-42RD6RGPB-BZ51evsNc.png)


## GPT4All Usage

### System Message
You are a helpful assistant who answers in two parts. Part 1: evaluate the query and identify (separately) all essential parts of the question, along with the equation or process you will need to answer it. Part 2: use a step-by-step approach to answer the question as best you can, using tools if needed.
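
The system message above is meant to be pasted into GPT4All's model settings. If you would rather drive the GGUF file from Python instead of the GPT4All GUI, a minimal sketch with llama-cpp-python might look like the following; the local filename and the context size are assumptions, not values from this card.

```python
# Minimal sketch, assuming llama-cpp-python is installed and the Q8_0 GGUF file
# has already been downloaded; the filename and settings below are assumptions.
from llama_cpp import Llama

SYSTEM_PROMPT = (
    "You are a helpful assistant who answers in two parts. "
    "Part 1: evaluate the query and identify (separately) all essential parts "
    "of the question, along with the equation or process you will need to "
    "answer it. Part 2: use a step-by-step approach to answer the question "
    "as best you can, using tools if needed."
)

llm = Llama(
    model_path="Dolphin3.0_QwenStar-3B-Q8_0.gguf",  # assumed local filename
    n_ctx=4096,                                     # context window, adjust to taste
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "What is 15% of 240?"},
    ],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```

Recent llama-cpp-python builds should pick up the chat template stored in the GGUF metadata, so the two-part answering style here comes entirely from the system prompt.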

### Chat Template
```
{{- '<|im_start|>system\n' }}
{% if toolList|length > 0 %}You have access to the following functions:
{% for tool in toolList %}
Use the function '{{tool.function}}' to: '{{tool.description}}'
{% if tool.parameters|length > 0 %}
parameters:
{% for info in tool.parameters %}
  {{info.name}}:
    type: {{info.type}}
    description: {{info.description}}
    required: {{info.required}}
{% endfor %}
{% endif %}
# Tool Instructions
If you CHOOSE to call this function ONLY reply with the following format:
'{{tool.symbolicFormat}}'
Here is an example. If the user says, '{{tool.examplePrompt}}', then you reply
'{{tool.exampleCall}}'
After the result you might reply with, '{{tool.exampleReply}}'
{% endfor %}
You MUST include both the start and end tags when you use a function.

You are a helpful and considerate AI assistant from Intelligent Estate who uses the functions to break down, analyze, perform, and verify complex reasoning tasks. You SHOULD try to verify your answers using the functions where possible.
{% endif %}
{{- '<|im_end|>\n' }}
{% for message in messages %}
{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}
{% endfor %}
{% if add_generation_prompt %}
{{ '<|im_start|>assistant\n' }}
{% endif %}
```
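
To sanity-check what this template produces, you can render it with the `jinja2` package (an illustration only; GPT4All applies the template internally). The sketch below uses an abbreviated copy of the template with the tool branch elided.

```python
# Sketch: render an abbreviated copy of the chat template with jinja2 to inspect
# the prompt string it produces. For illustration only; GPT4All applies the
# template itself when chatting with the model.
from jinja2 import Template

CHAT_TEMPLATE = """{{- '<|im_start|>system\n' }}
{% if toolList|length > 0 %}You have access to the following functions: ...{% endif %}
{{- '<|im_end|>\n' }}
{% for message in messages %}
{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}
{% endfor %}
{% if add_generation_prompt %}
{{ '<|im_start|>assistant\n' }}
{% endif %}"""

prompt = Template(CHAT_TEMPLATE).render(
    toolList=[],  # no tools registered in this example
    messages=[
        {"role": "user", "content": "What is 15% of 240?"},
    ],
    add_generation_prompt=True,
)
# The template does not trim block whitespace, so the output may contain extra
# blank lines between the special tokens.
print(prompt)
```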

This model was converted to GGUF format from [`cognitivecomputations/Dolphin3.0-Qwen2.5-3b`](https://huggingface.co/cognitivecomputations/Dolphin3.0-Qwen2.5-3b) using llama.cpp. Refer to the [original model card](https://huggingface.co/cognitivecomputations/Dolphin3.0-Qwen2.5-3b) for more details on the model.
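
If you prefer to fetch the quantized file programmatically rather than through the Hub UI, a sketch with `huggingface_hub` might look like this; the exact `.gguf` filename inside the repository is an assumption and may differ.

```python
# Sketch, assuming huggingface_hub is installed; the .gguf filename below is an
# assumption and may not match the actual file in the repository.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="IntelligentEstate/Dolphin3.0_QwenStar-3B-Q8-GGUF",
    filename="Dolphin3.0_QwenStar-3B-Q8_0.gguf",  # assumed filename
)
print(local_path)  # point llama.cpp or GPT4All at this file
```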