---
language: en
tags:
- llama-2
- lora
- ranking
license: apache-2.0
---
# bytestorm/SKIM-orcas-kdd25
This is a LoRA fine-tuned checkpoint of Llama-2-7b for ranking/retrieval tasks.
## Model Details
- **Base Model:** Llama-2-7b
- **Training Type:** LoRA fine-tuning
- **Task:** Ranking/Retrieval
- **Framework:** PyTorch
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned checkpoint; the tokenizer comes from the base model,
# since this repo does not ship its own tokenizer files.
model = AutoModelForCausalLM.from_pretrained("bytestorm/SKIM-orcas-kdd25")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
```
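If the repository hosts only the LoRA adapter weights rather than a merged checkpoint, `AutoModelForCausalLM` alone will not load it; the adapter must be attached to the base model instead. A minimal sketch using PEFT, assuming the repo contains a standard PEFT adapter:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the Llama-2-7b base weights, then attach the LoRA adapter on top.
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
model = PeftModel.from_pretrained(base, "bytestorm/SKIM-orcas-kdd25")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
```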
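The card does not document the prompt format used for ranking. As an illustration only, the sketch below scores query-passage pairs pointwise by reading the logit of a "yes" token; the actual template and scoring scheme used in training may differ.
```python
import torch

def relevance_score(query: str, passage: str) -> float:
    # Hypothetical pointwise-relevance prompt; not the documented template.
    prompt = (
        f"Query: {query}\nPassage: {passage}\n"
        "Is the passage relevant to the query? Answer yes or no.\nAnswer:"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]  # next-token logits
    yes_id = tokenizer.encode("yes", add_special_tokens=False)[0]
    return logits[yes_id].item()

# Rank candidate passages by descending relevance score.
passages = ["LoRA adds low-rank adapter matrices.", "Bananas are yellow."]
ranked = sorted(passages, key=lambda p: relevance_score("what is LoRA", p), reverse=True)
```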