---
license: apache-2.0
---
To use this model with LocalAI, here is an example model definition:
```yaml
name: WizardLM2
backend: transformers
parameters:
  model: fakezeta/Not-WizardLM-2-7B-ov-int4
context_size: 8192
threads: 6
f16: true
type: OVModelForCausalLM
prompt_cache_path: "cache"
prompt_cache_all: true
template:
  use_tokenizer_template: true
```
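
Once the model definition is loaded, LocalAI exposes it through its OpenAI-compatible API. As a minimal sketch (assuming LocalAI is running locally on its default port 8080 and the definition above is named `WizardLM2`), you could query it with the `openai` Python client:

```python
from openai import OpenAI

# Point the client at the local LocalAI endpoint; the API key is not checked
# by default, so any placeholder value works here.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="WizardLM2",  # must match the `name` field in the model definition
    messages=[{"role": "user", "content": "Explain what OpenVINO int4 quantization does."}],
)

print(response.choices[0].message.content)
```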