reduced size #3
by Arman-Sadeghi - opened
Hi there, how did you reduce the size and make it multilingual?
I mean, how can I make it focus on being multilingual (maximum 4 languages) in order to make it smaller while still being a nice chatbot?
Hey Arman,
It is based on a 0.5B model (Qwen/Qwen2.5-0.5B-Instruct), so that is how the size was set.
I am not sure how much smaller you could make it by focusing on only a few languages - there is as strong an argument for cross-lingual transfer learning as there is for more focused training.
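For context, here is a minimal sketch of loading that base model and checking its parameter count with the Hugging Face transformers library; the model name is the one mentioned above, and the rest is just illustrative:

```python
# Minimal sketch: load the base model and report its parameter count.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-0.5B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# The overall size is fixed by the base model's architecture (~0.5B parameters).
# Fine-tuning on fewer languages changes behaviour, not the parameter count.
num_params = sum(p.numel() for p in model.parameters())
print(f"{model_name}: {num_params / 1e9:.2f}B parameters")
```

To actually shrink the model further you would need a smaller base architecture or techniques such as quantization, pruning, or distillation, rather than simply restricting the training languages.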
If you'd like to chat more about this, feel free to PM me on LinkedIn (https://www.linkedin.com/in/peter-devine-030b2012a/) and we can talk more about your problem.
Thanks,
Peter