Thank you!
#1 opened by DevElCuy
I tried multiple GGUF models, including other Phi-1.5 ones, but none seemed to work. Your model is a gem: small size, great quality, and fast inference even on a low-budget/old GPU.
I'm using it with guidance, like this:
```python
from guidance import models, select

# Load the GGUF model, offloading all layers to the GPU.
llm = models.LlamaCpp('models/TKDKid1000--phi-1_5-Q6_K.gguf', n_gpu_layers=-1)

query = "SUPR"
options = ["SuperDapp", "Silver Port"]

# select() constrains the model's output to one of the given options.
lm = llm + f'''\
What is the name of token symbol "{query}"?
{select(options, name='answer')}'''

print(lm["answer"])
```
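For anyone curious why `select()` always returns a valid option, here is a minimal sketch of the idea behind it: at each step the decoder is only allowed to emit characters that keep the output a prefix of some option, so even a weak model cannot produce an invalid answer. The `toy_score` function is a hypothetical stand-in for real token logits, not part of guidance.

```python
def constrained_select(options, score):
    """Greedily build an answer, masking characters not allowed by any option."""
    out = ""
    while out not in options:
        # Characters that keep `out` a prefix of at least one option.
        allowed = {o[len(out)] for o in options
                   if o.startswith(out) and len(o) > len(out)}
        # Pick the allowed character the (stub) model scores highest.
        out += max(allowed, key=score)
    return out

# Hypothetical scoring stub standing in for real model logits.
def toy_score(ch):
    return -ord(ch)

print(constrained_select(["SuperDapp", "Silver Port"], toy_score))
# → Silver Port
```

The real library works on tokens and logits rather than characters, but the masking principle is the same: the constraint, not the model, guarantees the answer is one of the options.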
Thank you!