---
license: mit
pipeline_tag: text-generation
tags:
- cortex.cpp
---
## Overview

Phi-4 is a state-of-the-art 14B-parameter Transformer designed for advanced reasoning, conversational AI, and high-quality text generation. It was trained on a mix of synthetic datasets, filtered public-domain content, academic books, and Q&A datasets, with a strong emphasis on data quality and alignment. The model supports a 16K-token context length and was trained on 9.8T tokens over 21 days using 1,920 H100-80GB GPUs. Phi-4 underwent rigorous fine-tuning and preference optimization to improve instruction adherence and safety. Released on December 12, 2024, it is a static model with a data cutoff of June 2024, suitable for a wide range of research and dialogue-system applications.

## Variants

| No | Variant | Cortex CLI command |
| --- | --- | --- |
| 1 | [Phi-4-14b](https://huggingface.co/cortexso/phi-4/tree/14b) | `cortex run phi-4:14b` |

## Use it with Jan (UI)

1. Install **Jan** by following the [Quickstart](https://jan.ai/docs/quickstart)
2. In the Jan Model Hub, search for the model ID:
    ```text
    cortexso/phi-4
    ```

## Use it with Cortex (CLI)

1. Install **Cortex** by following the [Quickstart](https://cortex.jan.ai/docs/quickstart)
2. Run the model with the following command (an example API call is shown after these steps):
    ```bash
    cortex run phi-4
    ```
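Once the model is running, Cortex typically serves it through a local OpenAI-compatible HTTP API. The host, port (`39281`), and endpoint path below reflect common Cortex defaults and are assumptions; check your Cortex configuration or the Cortex documentation if the request fails.

```bash
# Example request to the local Cortex server (OpenAI-compatible API).
# Port 39281 is an assumed default; adjust host/port to match your setup.
curl http://127.0.0.1:39281/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "phi-4:14b",
    "messages": [
      {"role": "user", "content": "Explain what makes Phi-4 suited to reasoning tasks."}
    ]
  }'
```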

## Credits

- **Author:** Microsoft Research
- **Converter:** [Homebrew](https://www.homebrew.ltd/)
- **Original License:** [License](https://huggingface.co/microsoft/phi-4/blob/main/LICENSE)
- **Paper:** [Phi-4 Technical Report](https://arxiv.org/pdf/2412.08905)