---
license: apache-2.0
title: 'BLLAMA: ALPACA with BLIP 2'
sdk: gradio
emoji: 🔥
colorFrom: red
colorTo: purple
pinned: true
app_file: generate.py
---

πŸ¦™πŸŒ²πŸ€ BLLAMA: A BLIP2 + ALPACA-LORA Pipeline

## Training

This Space is simply a pipeline that chains ALPACA and BLIP-2 together, without any additional finetuning. You can refer to the finetuning details in the ALPACA-LORA repo here and the BLIP-2 training details on its GitHub page here. For the pipeline, I have used the BLIP-2 model hosted on Hugging Face here. A rough sketch of how the two stages could be chained is shown below.
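The following is a minimal sketch of such a pipeline, assuming the standard `transformers` and `peft` APIs: BLIP-2 captions the image, and the caption is fed into an Alpaca-style instruction prompt answered by a LLaMA model with a LoRA adapter. The checkpoint names (`Salesforce/blip2-opt-2.7b`, `decapoda-research/llama-7b-hf`, `tloen/alpaca-lora-7b`), the prompt template, and the exact chaining are assumptions for illustration; the Space's actual `generate.py` may differ.

```python
# Sketch only: checkpoint names and prompt template are assumptions,
# not necessarily what this Space's generate.py uses.
import torch
from PIL import Image
from peft import PeftModel
from transformers import (
    Blip2Processor,
    Blip2ForConditionalGeneration,
    LlamaForCausalLM,
    LlamaTokenizer,
)

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stage 1: caption the image with stock BLIP-2 weights (no finetuning).
blip_processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
blip_model = Blip2ForConditionalGeneration.from_pretrained(
    "Salesforce/blip2-opt-2.7b", torch_dtype=torch.float16
).to(device)

image = Image.open("example.jpg")
blip_inputs = blip_processor(images=image, return_tensors="pt").to(device, torch.float16)
caption_ids = blip_model.generate(**blip_inputs, max_new_tokens=50)
caption = blip_processor.decode(caption_ids[0], skip_special_tokens=True).strip()

# Stage 2: pass the caption to ALPACA-LORA (LLaMA base + LoRA adapter).
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")
llama = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf", torch_dtype=torch.float16
).to(device)
alpaca = PeftModel.from_pretrained(llama, "tloen/alpaca-lora-7b")

# Standard Alpaca instruction template, with the BLIP-2 caption as input context.
prompt = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nDescribe what is happening in this image.\n\n"
    f"### Input:\n{caption}\n\n### Response:\n"
)
prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
output_ids = alpaca.generate(input_ids=prompt_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

In the Gradio app, the same two stages would simply be wrapped in a function that takes the uploaded image and the user's instruction and returns the decoded response.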

## Acknowledgements

Once again, I would like to credit the Salesforce team for creating BLIP-2, as well as tloen, the original creator of alpaca-lora. I would also like to credit Meta, the original creators of LLaMA, as well as the people behind the Hugging Face implementation of ALPACA.