
This repository provides a LoRA checkpoint, built on the Alpaca-LoRA implementation and fine-tuned to perform better on coding tasks. The checkpoint can be used with the generate.py script from the Alpaca-LoRA repository to generate text or run other NLP tasks.

Setup

  1. Clone the Alpaca-LoRA repository:

    git clone https://github.com/tloen/alpaca-lora.git

  2. Enter the repository and install the required packages:

    cd alpaca-lora
    pip install -r requirements.txt

Usage

Use the generate.py script from the Alpaca-LoRA repository to run the model with the provided LoRA checkpoint:

python generate.py --load_8bit --base_model 'decapoda-research/llama-7b-hf' --lora_weights 'vihangd/leet-coding-alpaca-lora-7b'
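Under the hood, generate.py wraps your instruction in an Alpaca-style prompt template and strips everything before the response marker from the decoded output. A minimal sketch of that prompt handling (the template text follows the Alpaca convention; the helper names build_prompt and extract_response are illustrative, not part of the script):

```python
# Alpaca-style prompt template (no-input variant), as used by
# the Alpaca-LoRA generate.py script.
TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Wrap a raw instruction in the Alpaca prompt template."""
    return TEMPLATE.format(instruction=instruction)

def extract_response(generated: str) -> str:
    """Return only the model's answer, dropping the echoed prompt."""
    return generated.split("### Response:")[1].strip()

prompt = build_prompt("Write a function that reverses a string.")
# Decoded model output echoes the prompt, so extract_response()
# keeps only the text after the response marker.
example_output = prompt + "def reverse(s):\n    return s[::-1]"
print(extract_response(example_output))
```

This mirrors how the script post-processes generations; if you call the model programmatically instead of through generate.py, apply the same template so the inputs match what the LoRA was trained on.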

Troubleshooting

If you encounter an error like:

AttributeError: 'NoneType' object has no attribute 'device'

modify the PeftModel.from_pretrained call in generate.py to pin the LoRA weights to a single GPU with an explicit device_map (see issue #21):

model = PeftModel.from_pretrained(
    model,
    lora_weights,
    torch_dtype=torch.float16,
    device_map={'': 0},  # place all LoRA weights on GPU 0
)