|
--- |
|
license: apache-2.0 |
|
language: |
|
- en |
|
- es |
|
base_model: |
|
- meta-llama/Llama-3.2-1B |
|
tags: |
|
- lexic |
|
- smartloop |
|
datasets: |
|
- smartloop-ai/lexic-ai-tutorial-dataset |
|
--- |
|
# Model Card for Lexic.AI
|
|
|
A Lexic.AI model fine-tuned from meta-llama/Llama-3.2-1B, used as the base model for the Smartloop platform.
|
|
|
## Model Details |
|
|
|
### Model Description |
|
|
|
|
|
|
|
|
|
|
- **Developed by:** Smartloop Inc.

- **Language(s) (NLP):** English, Spanish

- **Finetuned from model:** meta-llama/Llama-3.2-1B
|
|
|
## Uses |
|
```python
import time

from langchain.prompts import PromptTemplate

# `llm` is assumed to be a llama-cpp-python `Llama` instance loaded with a
# local copy of this model, e.g.:
#   from llama_cpp import Llama
#   llm = Llama(model_path="<path-to-model>", n_ctx=4096)

template = """<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a helpful LexicAI assistant<|eot_id|>

<|start_header_id|>user<|end_header_id|>

{query}<|eot_id|>

<|start_header_id|>assistant<|end_header_id|>"""

prompt_template = PromptTemplate(
    input_variables=['query'],
    template=template
)

query = "What are key features of Lexic AI?"

start_time = time.time()
print('Responding:', start_time)

output = llm(
    prompt_template.format(query=query),  # Prompt
    max_tokens=4096,  # Generate up to 4096 tokens; set to None to generate up to the end of the context window
    echo=False,       # Do not echo the prompt back in the output
    temperature=0.0,  # Deterministic output
)

print('Completed (minutes):', (time.time() - start_time) / 60)

print(output['choices'][0]['text'])
```
|
|
|
Output: |
|
|
|
``` |
|
Key features of Lexic.AI include effort-less scalability, resource optimization, |
|
and top-tier performance under varying workloads, ensuring that the platform |
|
can handle large data volumes and numerous concurrent users without compromising performance. |
|
Additionally, the platform's security features, including |
|
end-to-end encryption for data at rest and in transit, and a robust firewall, |
|
provide an additional layer of protection for sensitive business information. |
|
These features enable the platform to deliver reliable, scalable, |
|
and secure solutions for various applications. |
|
``` |
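For reference, the same Llama 3 chat prompt can be assembled without LangChain using plain string formatting. This is a minimal sketch; the special tokens are taken from the template in the snippet above, and `build_prompt` is a hypothetical helper, not part of the platform.

```python
def build_prompt(query: str, system: str = "You are a helpful LexicAI assistant") -> str:
    """Assemble a Llama 3 style chat prompt matching the template above."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{query}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt("What are key features of Lexic AI?")
```

The resulting string can be passed to the model exactly as `prompt_template.format(query=query)` is above.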
|
|
|
### Direct Use |
|
|
|
This fine-tuned model is based on meta-llama/Llama-3.2-1B, trained on Lexic.AI-related content, and is intended for use by https://smartloop.ai as a base model.
|
|
|
|
|
|
### Downstream Use
|
|
|
|
|
|
[More Information Needed] |
|
|
|
### Out-of-Scope Use |
|
|
|
This model is not intended for use outside the scope of the Smartloop product.