---
license: apache-2.0
language:
- en
- es
base_model:
- meta-llama/Llama-3.2-1B
tags:
- lexic
- smartloop
---
# Model Card for Lexic.AI (Llama-3.2-1B)

Lexic.AI model fine-tuned from meta-llama/Llama-3.2-1B, intended to serve as the base model for the Smartloop platform.
## Model Details

### Model Description

- **Developed by:** Smartloop Inc
- **Language(s) (NLP):** English, Spanish
- **Finetuned from model:** meta-llama/Llama-3.2-1B
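
If the fine-tune is published as a standard `transformers` checkpoint, it can also be loaded directly from the Hugging Face Hub. The snippet below is a minimal sketch under that assumption; the repository id `smartloop/lexic-llama-3.2-1b` is a placeholder and should be replaced with the actual model id.

```python
# Minimal sketch, assuming a standard transformers checkpoint.
# "smartloop/lexic-llama-3.2-1b" is a placeholder repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "smartloop/lexic-llama-3.2-1b"  # placeholder, replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("What is a connector?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```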
## Uses
template = """<|start_header_id|>system<|end_header_id|>
You are LexicAI assistant<|eot_id|>
<|start_header_id|>user<|end_header_id|>use only the text to respond:
{query}<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>"""
prompt_template = PromptTemplate(
input_variables=['query'],
template=template
)
query = "What is a connector?"
start_time = time.time()
print('Responding:', start_time)
output = llm(
prompt_template.format(query=query), # Prompt
max_tokens=1024, # Generate up to 32 tokens, set to None to generate up to the end of the context window
echo=False, # Echo the prompt back in the output
temperature=0.0,
)
print('Completed:', (time.time() - start_time) / 60)
print(output['choices'][0]['text'])
Output:

```
A connector is a tool that allows you to integrate external services,
APIs, and software programs with Lexic.AI's platform. It enables seamless data exchange,
automation of tasks, and enhances the overall capabilities of
Lexic.AI by leveraging the features of external systems.
```
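
The example above waits for the full completion. llama-cpp-python can also stream tokens as they are generated; the sketch below reuses the `llm` and `prompt_template` objects defined above.

```python
# Sketch: stream the completion chunk-by-chunk instead of waiting for the full response.
# Reuses `llm` and `prompt_template` from the example above.
for chunk in llm(
    prompt_template.format(query="What is a connector?"),
    max_tokens=1024,
    temperature=0.0,
    stream=True,  # yield partial completions as they are generated
):
    print(chunk['choices'][0]['text'], end='', flush=True)
print()
```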
### Direct Use
This fine-tuned model is based on meta-llama/Llama-3.2-1B and trained on Lexic.AI-related content. It is intended to be used by https://smartloop.ai as a base model for the platform.
### Downstream Use

[More Information Needed]
### Out-of-Scope Use

This model is not intended to be used outside the scope of the Smartloop/Lexic.AI product.