---
license: llama3
tags:
- gguf
- llama3
- ollama
- dolphin
base_model: cognitivecomputations/dolphin-2.9-llama3-70b
---

# Dolphin 2.9 Llama 3 70b 🐬 (GGUF)

Quantized and converted to GGUF format for use with Ollama.

## Example Modelfile

```
FROM ./dolphin-2.9-llama3-70b-Q6_K.gguf

TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}<|im_end|>
"""

SYSTEM """You are Dolphin, a helpful AI assistant.
"""

PARAMETER num_ctx 8192
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
```
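
Once the combined GGUF is in place (see "Combine multi-part files into one GGUF" below), a minimal sketch of building and running a model with Ollama, assuming the example above is saved as `Modelfile` next to the GGUF; the model name `dolphin-2.9-llama3-70b` is just an example, use whatever name you prefer:

```
# Build an Ollama model from the Modelfile in the current directory
ollama create dolphin-2.9-llama3-70b -f Modelfile

# Start an interactive chat with the newly created model
ollama run dolphin-2.9-llama3-70b
```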

## Provided files

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [dolphin-2.9-llama3-70b-Q6_K.gguf-part00000](https://huggingface.co/TrabEsrever/dolphin-2.9-llama3-70b-GGUF/blob/main/dolphin-2.9-llama3-70b-Q6_K.gguf-part00000)<br>[dolphin-2.9-llama3-70b-Q6_K.gguf-part00001](https://huggingface.co/TrabEsrever/dolphin-2.9-llama3-70b-GGUF/blob/main/dolphin-2.9-llama3-70b-Q6_K.gguf-part00001) | Q6_K | 54G |
| [dolphin-2.9-llama3-70b-Q8_0.gguf-part00000](https://huggingface.co/TrabEsrever/dolphin-2.9-llama3-70b-GGUF/blob/main/dolphin-2.9-llama3-70b-Q8_0.gguf-part00000)<br>[dolphin-2.9-llama3-70b-Q8_0.gguf-part00001](https://huggingface.co/TrabEsrever/dolphin-2.9-llama3-70b-GGUF/blob/main/dolphin-2.9-llama3-70b-Q8_0.gguf-part00001) | Q8_0 | 70G |
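
If you prefer to fetch the split files from the command line, here is a sketch using `huggingface-cli` (assumes `pip install -U huggingface_hub`; shown for the Q6_K parts, adjust the file names for Q8_0):

```
# Download both Q6_K parts into the current directory
huggingface-cli download TrabEsrever/dolphin-2.9-llama3-70b-GGUF \
  dolphin-2.9-llama3-70b-Q6_K.gguf-part00000 \
  dolphin-2.9-llama3-70b-Q6_K.gguf-part00001 \
  --local-dir .
```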

## Combine multi-part files into one GGUF

```
cat dolphin-2.9-llama3-70b-Q6_K.gguf-part00000 dolphin-2.9-llama3-70b-Q6_K.gguf-part00001 > dolphin-2.9-llama3-70b-Q6_K.gguf
```
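
As a quick sanity check (not a checksum verification), the combined file's size should roughly match the table above, after which the part files can be removed:

```
# The combined Q6_K file should come out to roughly 54G (see the table above)
ls -lh dolphin-2.9-llama3-70b-Q6_K.gguf

# Optional: delete the parts once the combined GGUF is confirmed working
# rm dolphin-2.9-llama3-70b-Q6_K.gguf-part00000 dolphin-2.9-llama3-70b-Q6_K.gguf-part00001
```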