Icaro-LM is a language model based on Qwen2 1.5B, designed for mobile efficiency, empathetic chat, and function calling.
Optimized for fast inference and low resource consumption, it runs smoothly on mobile devices and delivers a responsive user experience.
Icaro-LM is fine-tuned for empathetic conversations and can understand and execute function calls within the conversation flow,
making it a versatile solution for a variety of applications.
Key Features:
- Mobile Efficiency: Optimized for fast inference and low resource consumption on mobile devices.
- Empathetic Chat: Fine-tuned on datasets curated for empathetic and emotionally intelligent conversations.
- Function Calling: Can understand and execute function calls within the conversation flow.
Use Cases:
- Mobile chatbots and virtual assistants
- Emotional support applications
- Task automation on mobile devices
Prompt format
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
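Below is a minimal sketch of running this ChatML-style prompt with Hugging Face transformers. The repository id "your-org/Icaro-LM", the example messages, and the generation settings are placeholders, not the model's official path or defaults; apply_chat_template renders the <|im_start|>/<|im_end|> format shown above.

```python
# Minimal sketch: chat with the model using transformers.
# "your-org/Icaro-LM" is a placeholder; substitute the actual Hub repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/Icaro-LM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

messages = [
    {"role": "system", "content": "You are a helpful, empathetic assistant."},
    {"role": "user", "content": "I had a rough day at work."},
]
# apply_chat_template renders the <|im_start|>/<|im_end|> prompt format shown above.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```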
Function calling example
<|im_start|>system
You are a helpful assistant with access to the following functions. Use them if required -
[
  {
    "name": "get_news",
    "description": "Get the latest news.",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {
          "type": "string",
          "description": "The location for which to fetch news"
        }
      },
      "required": ["location"]
    }
  },
  {
    "name": "get_current_weather",
    "description": "Get the current weather",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {
          "type": "string",
          "description": "The city and state, e.g. San Francisco, CA"
        }
      },
      "required": ["location"]
    }
  }
]<|im_end|>
<|im_start|>user
What's the latest news in Samara?<|im_end|>
<|im_start|>assistant
Result:
<|im_start|>assistant
<functioncall> {"name": "get_news", "arguments": '{"location": "Samara"}'} <|im_end|>
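The model only emits the call inside a <functioncall> tag; the host application is responsible for parsing that tag and invoking the matching function. The sketch below shows one way to do this; the regex, the FUNCTIONS registry, and the get_news stub are illustrative assumptions, not part of the model's tooling.

```python
# Minimal sketch: parse and dispatch a <functioncall> emitted by the model.
import json
import re

def get_news(location: str) -> str:
    # Placeholder: a real application would call a news API here.
    return f"Top headlines for {location}..."

FUNCTIONS = {"get_news": get_news}

def dispatch(model_output: str):
    match = re.search(r"<functioncall>\s*(\{.*\})", model_output, re.DOTALL)
    if match is None:
        return None  # plain chat turn, no function call to execute
    # The "arguments" value is emitted as a single-quoted JSON string (see the
    # example above); stripping the single quotes yields parseable JSON.
    payload = json.loads(match.group(1).replace("'", ""))
    args = payload["arguments"]
    if isinstance(args, str):
        args = json.loads(args)
    return FUNCTIONS[payload["name"]](**args)

reply = '<functioncall> {"name": "get_news", "arguments": \'{"location": "Samara"}\'} <|im_end|>'
print(dispatch(reply))  # -> "Top headlines for Samara..."
```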