---
license: bigscience-bloom-rail-1.0
language:
- ar
library_name: transformers
pipeline_tag: text-generation
---
|
This Hugging Face page hosts a low-rank adapter (LoRA) for fine-tuning the bloom-7b model on Arabic instructions. Additional information about the datasets will be made available soon. The model was trained with the codebase at https://github.com/KhalidAlt/alpaca-lora/tree/hf_models, which is based on https://github.com/tloen/alpaca-lora with modifications to fit the requirements of bloom-7b.
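For readers unfamiliar with low-rank adapters: LoRA keeps the base weights frozen and learns only a low-rank update ΔW = BA. The numpy sketch below illustrates the idea with made-up dimensions; the real adapter applies this to bloom-7b's weight matrices at far larger scale, following the recipe in the linked alpaca-lora repositories.

```python
import numpy as np

d, r = 8, 2  # hypothetical dimensions for illustration; r << d is the low-rank bottleneck
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))         # frozen base weight (stays untouched during fine-tuning)
A = rng.standard_normal((r, d)) * 0.01  # trainable low-rank factor
B = np.zeros((d, r))                    # B starts at zero, so the update is zero at initialization

delta = B @ A                           # rank of delta is at most r
x = rng.standard_normal(d)
y = (W + delta) @ x                     # adapted forward pass

# With B = 0 the adapter is a no-op, so fine-tuning starts from the base model's behavior.
assert np.allclose(y, W @ x)
```

Because only A and B are trained, the adapter checkpoint is tiny compared to the 7B base model, which is why it can be distributed separately on this page.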