---
license: cc-by-nc-4.0
language:
  - en
tags:
  - text-generation
datasets:
  - stanford_alpaca
pipeline_tag: text-generation
---

This repo contains the full weights (8-bit) for Falcon-7B fine-tuned on the Code Alpaca dataset.
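
If you just want to run the model, a minimal loading and generation sketch with `transformers` might look like the following. The repo id below is a placeholder (replace it with this repo's actual id on the Hub), and 8-bit loading assumes `bitsandbytes` and `accelerate` are installed:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jinaai/falcon-7b"  # placeholder: use this repo's actual id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,       # the weights in this repo are 8-bit
    device_map="auto",
    trust_remote_code=True,  # Falcon ships custom modelling code
)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```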

## Reproduction

This version of the weights was trained with the following hyperparameters:

- Epochs: 6
- Batch size: 128
- Micro batch size: 8
- Learning rate: 3e-4
- LoRA r: 16
- LoRA target modules: query_key_value
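
For reference, these LoRA settings correspond roughly to the following `peft` configuration (a sketch, not the exact training code; `lora_alpha` and `lora_dropout` are not listed above and are assumed values):

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                                # LoRA r from the list above
    target_modules=["query_key_value"],  # LoRA target modules from the list above
    lora_alpha=16,                       # assumption: not specified above
    lora_dropout=0.05,                   # assumption: not specified above
    bias="none",
    task_type="CAUSAL_LM",
)
```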

You can reproduce the training using this repository:

https://github.com/jina-ai/jerboa

Make sure you install the requirements, then fine-tune with the following command:

```bash
python finetune.py \
    --base-model='tiiuae/falcon-7b' \
    --num-epochs=6 \
    --output-dir='./jinaai/falcon-7b' \
    --lora-target-modules=query_key_value \
    --lora-r=16 \
    --micro-batch-size=8
```
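
The command above writes a LoRA adapter to the output directory. One common way to obtain full weights from such an adapter is to merge it into the base model with `peft`; the sketch below illustrates that pattern and is not necessarily the exact procedure used for this repo (the adapter path is the output directory from the command above, and the save path is hypothetical):

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b",
    torch_dtype=torch.float16,
    trust_remote_code=True,
)
model = PeftModel.from_pretrained(base, "./jinaai/falcon-7b")  # adapter output dir from the command above
merged = model.merge_and_unload()                        # fold the LoRA weights into the base model
merged.save_pretrained("./falcon-7b-code-alpaca-full")   # hypothetical output path
```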