WebLLM Phi 3.5 Chat
This Space lets you chat with Phi 3.5 models directly in your browser, powered by WebLLM. The model is downloaded once and runs fully locally; nothing is sent to a server.
Step 1: Configure And Download Model
Quantization: q4f16 | q4f32
Context Window: 1k | 2k | 4k | 8k | 16k | 32k | 64k | 128k
Temperature: 1.00
Top-p: 1.00
Presence Penalty: 0.00
Frequency Penalty: 0.00
Download
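Under the hood, the Download step amounts to loading a prebuilt model into a WebLLM engine. A minimal sketch, assuming the `@mlc-ai/web-llm` package and the model ID `Phi-3.5-mini-instruct-q4f16_1-MLC` from WebLLM's prebuilt model list (the q4f32 choice would map to a `q4f32_1` variant, and the context-window size to a chat option):

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Model ID is an assumption based on WebLLM's prebuilt model list;
// picking q4f32 in the UI would correspond to a q4f32_1 build.
const engine = await CreateMLCEngine(
  "Phi-3.5-mini-instruct-q4f16_1-MLC",
  {
    // Reports download/compile progress while the weights are fetched.
    initProgressCallback: (report) => console.log(report.text),
  },
  {
    // Assumed mapping of the UI's "8k" choice to a token count.
    context_window_size: 8192,
  },
);
```

This must run in a WebGPU-capable browser; the weights are cached by the browser, so subsequent loads skip the download.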
Step 2: Chat
Send
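Pressing Send issues an OpenAI-style chat completion against the loaded engine. A self-contained sketch, again assuming the `Phi-3.5-mini-instruct-q4f16_1-MLC` prebuilt model ID; the sampling options mirror the sliders above:

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Assumed prebuilt model ID; see the WebLLM model list for exact names.
const engine = await CreateMLCEngine("Phi-3.5-mini-instruct-q4f16_1-MLC");

// OpenAI-compatible request; temperature/top_p/penalties match the UI defaults.
const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Hello!" }],
  temperature: 1.0,
  top_p: 1.0,
  presence_penalty: 0.0,
  frequency_penalty: 0.0,
});
console.log(reply.choices[0].message.content);
```

Passing `stream: true` instead returns an async iterable of chunks, which is how the UI can render tokens as they are generated.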