---
license: llama3
pipeline_tag: text-generation
tags:
- cortex.cpp
---
## Overview

**Nous Research** developed and released [Hermes 3](https://huggingface.co/NousResearch/Hermes-3-Llama-3.2-3B), a state-of-the-art instruction-tuned language model built on Llama-3.2-3B. This 3-billion-parameter fine-tune represents a leap forward in reasoning, multi-turn conversation, and structured outputs. It incorporates advanced role-playing capabilities, reliable function calling, and improved coherence over long contexts, making it a versatile assistant for a wide range of applications.

Hermes 3 was trained on high-quality data, leveraging fine-tuning on H100 GPUs via the LambdaLabs GPU Cloud. The model excels at both general-purpose and specialized tasks, including code generation, reasoning, and advanced conversational abilities. With support for ChatML prompt formatting, Hermes 3 is compatible with OpenAI-style endpoints and enables structured, steerable interactions for end users.
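For reference, ChatML wraps each conversation turn in `<|im_start|>`/`<|im_end|>` tokens. A minimal prompt for Hermes 3 might look like the following sketch (the system and user messages are illustrative placeholders):

```
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
What is the capital of France?<|im_end|>
<|im_start|>assistant
```

The model generates its reply after the final `<|im_start|>assistant` header; tools that speak the OpenAI chat-completions format apply this template automatically.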

## Variants

| No | Variant | Cortex CLI command |
| --- | --- | --- |
| 1 | [Hermes3-3b](https://huggingface.co/cortexso/hermes3/tree/main) | `cortex run hermes3:3b` |

## Use it with Jan (UI)

1. Install **Jan** using [Quickstart](https://jan.ai/docs/quickstart)
2. In the Jan model Hub, search for:
    ```bash
    cortexso/hermes3
    ```

## Use it with Cortex (CLI)

1. Install **Cortex** using [Quickstart](https://cortex.jan.ai/docs/quickstart)
2. Run the model with the following command:
    ```bash
    cortex run hermes3
    ```

## Credits

- **Author:** Nous Research
- **Converter:** [Homebrew](https://www.homebrew.ltd/)
- **Original License:** [License](https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE)
- **Papers:** [Hermes 3 Technical Report](https://arxiv.org/pdf/2408.11857)