opt-125m-gptq-2bitQuantization / special_tokens_map.json
{
  "additional_special_tokens": [
    "### End",
    "### Instruction:",
    "### Response:\n"
  ],
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "pad_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
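In practice this file is read automatically by Hugging Face `AutoTokenizer.from_pretrained`, which merges it into the tokenizer's special-token configuration. As a minimal sketch of what the file encodes, the map can be parsed with Python's standard library alone; the token values below are copied verbatim from the file above, and note that the `\n` escape inside `"### Response:\n"` becomes a real newline after JSON parsing:

```python
import json

# Contents of special_tokens_map.json, copied verbatim from the file above.
# A raw string keeps the \n escape intact so json.loads can decode it.
SPECIAL_TOKENS_MAP = r"""
{
  "additional_special_tokens": [
    "### End",
    "### Instruction:",
    "### Response:\n"
  ],
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "pad_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
"""

config = json.loads(SPECIAL_TOKENS_MAP)

# All four core roles (bos/eos/pad/unk) reuse the same <|endoftext|> token.
assert all(
    config[k] == "<|endoftext|>"
    for k in ("bos_token", "eos_token", "pad_token", "unk_token")
)

# The \n escape in the file is a real newline after parsing.
print(repr(config["additional_special_tokens"][2]))  # → '### Response:\n'
```

The three `additional_special_tokens` are instruction-formatting markers, so the tokenizer treats each of them as a single atomic token rather than splitting them into subwords.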