---
license: llama3.1
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
pipeline_tag: text-generation
tags:
  - text-generation-inference
---

At the time of this release, llama.cpp did not support the RoPE scaling required for the full context window, so usable context is limited to 8192 tokens. This will be updated for full 128K functionality once that support lands.
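Until then, a minimal sketch of loading a GGUF quantization of this model with llama-cpp-python while keeping the context capped at 8192 (the filename and quantization level are placeholders, not files shipped in this repo):

```python
# Sketch, assuming llama-cpp-python (pip install llama-cpp-python).
from llama_cpp import Llama

# Hypothetical filename for a GGUF quantization of this model.
llm = Llama(
    model_path="Meta-Llama-3.1-8B-Instruct.Q4_K_M.gguf",
    n_ctx=8192,  # cap at 8192 until llama.cpp's RoPE-scaling support lands
)

output = llm(
    "Explain RoPE scaling in one sentence.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```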


## Prompt Template

```
<|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|>
<|start_header_id|>user<|end_header_id|>

{user_input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{assistant_response}
```
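For illustration, a small helper (not part of this repo) that assembles a prompt in the format above; the argument names mirror the template's `{system_prompt}` and `{user_input}` slots:

```python
# Hypothetical helper that fills in the prompt template shown above.
def build_prompt(system_prompt: str, user_input: str) -> str:
    return (
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_prompt}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_input}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(build_prompt("You are a helpful assistant.", "What is RoPE scaling?"))
```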

## 128K Context Length

"llama.context_length": 131072