This repo contains a GPTQ quantization of TRAC-MTRY/traclm-v2-7b-instruct for running the model on low-resource hardware.
Read more about GPTQ quantization here.
Read more about the unquantized model here.
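As a rough illustration, the snippet below sketches how a GPTQ checkpoint like this one can be loaded through the transformers library (assuming the optimum and auto-gptq backends are installed); the repo id shown is a placeholder, not necessarily this repository's exact id.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute this repository's actual id.
model_id = "TRAC-MTRY/traclm-v2-7b-instruct-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# transformers dispatches GPTQ weights automatically when optimum/auto-gptq are installed.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```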
This model was fine-tuned with the Alpaca prompt format. It is highly recommended that you use the same format for any interaction with the model; failing to do so will significantly degrade performance. A short Python sketch of how these templates can be assembled follows the two variants below.
Standard Alpaca Format:
### System:\nBelow is an instruction that describes a task. Write a response that appropriately completes the request.\n\n\n\n### Instruction:\n{prompt}\n\n### Response:\n
Input Field Variant:
### System:\nBelow is an instruction that describes a task. Write a response that appropriately completes the request.\n\n\n\n### Instruction:\n{prompt}\n\n### Input:\n{input}\n\n### Response:\n
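The sketch below shows one way to build these prompt strings and run generation; the helper name, example instruction, and generation settings are illustrative assumptions, and `model`/`tokenizer` are the objects from the loading sketch above.

```python
# Illustrative helper that reproduces the Alpaca-style templates shown above.
SYSTEM = (
    "### System:\nBelow is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n\n\n"
)

def build_prompt(instruction, input_text=None):
    # Input-field variant when extra context is supplied, standard variant otherwise.
    if input_text:
        return (f"{SYSTEM}### Instruction:\n{instruction}\n\n"
                f"### Input:\n{input_text}\n\n### Response:\n")
    return f"{SYSTEM}### Instruction:\n{instruction}\n\n### Response:\n"

# Example generation call; the instruction and max_new_tokens value are arbitrary.
prompt = build_prompt("Briefly explain what GPTQ quantization does.")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt itself.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```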