This model pairs with the PRETRAINED variant of Amharic LLaMA, available here: https://huggingface.co/iocuydi/llama-2-amharic-3784m/tree/main/pretrained
It also requires the Llama 2 base weights and this CLIP model: https://huggingface.co/openai/clip-vit-large-patch14-336
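For reference, a minimal download sketch using huggingface_hub (the Llama 2 base weights are gated and must be obtained separately after accepting Meta's license; the variable names here are illustrative, not part of the repo's scripts):

```python
from huggingface_hub import snapshot_download

# Pretrained Amharic LLaMA checkpoint (only the pretrained/ subfolder referenced above).
llm_path = snapshot_download(
    repo_id="iocuydi/llama-2-amharic-3784m",
    allow_patterns=["pretrained/*"],
)

# CLIP vision tower used by the LLaVA pipeline.
clip_path = snapshot_download(repo_id="openai/clip-vit-large-patch14-336")

print(llm_path, clip_path)
```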
More information on running the model is available here: https://github.com/iocuydi/amharic-llama-llava
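The repository's scripts handle the actual multimodal wiring. As a rough orientation only, the sketch below uses standard transformers classes to show how a LLaVA-style pipeline typically extracts patch features from this CLIP vision tower; the hidden-layer choice and the image path are assumptions, not the repo's confirmed configuration:

```python
import torch
from PIL import Image
from transformers import CLIPImageProcessor, CLIPVisionModel

model_id = "openai/clip-vit-large-patch14-336"
vision_tower = CLIPVisionModel.from_pretrained(model_id)
image_processor = CLIPImageProcessor.from_pretrained(model_id)

image = Image.open("example.jpg")  # hypothetical local image
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = vision_tower(**inputs, output_hidden_states=True)

# LLaVA-style models commonly take patch embeddings from a late hidden layer
# (often the penultimate one) and drop the CLS token before projecting them
# into the language model's embedding space.
patch_features = outputs.hidden_states[-2][:, 1:]
print(patch_features.shape)  # (1, 576, 1024) for a 336px, patch-14 ViT-L
```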
Cite:

@misc{andersland2024amharic,
      title={Amharic LLaMA and LLaVA: Multimodal LLMs for Low Resource Languages},
      author={Michael Andersland},
      year={2024},
      eprint={2403.06354},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}