|
--- |
|
library_name: peft |
|
license: mit |
|
datasets: |
|
- multi_nli |
|
- snli |
|
language: |
|
- en |
|
metrics: |
|
- spearmanr |
|
--- |
|
|
|
|
|
# AnglE📐: Angle-optimized Text Embeddings
|
|
|
> It is Angle 📐, not Angel 👼.
|
|
|
🔥 A New SOTA Model for Semantic Textual Similarity!
|
|
|
GitHub: https://github.com/SeanLee97/AnglE
|
|
|
<a href="https://arxiv.org/abs/2309.12871">
    <img src="https://img.shields.io/badge/Arxiv-2309.12871-yellow.svg?style=flat-square" alt="https://arxiv.org/abs/2309.12871" />
</a>
|
|
|
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sick-r-1)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sick-r-1?p=angle-optimized-text-embeddings)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sts16)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts16?p=angle-optimized-text-embeddings)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sts15)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts15?p=angle-optimized-text-embeddings)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sts14)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts14?p=angle-optimized-text-embeddings)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sts13)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts13?p=angle-optimized-text-embeddings)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sts12)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts12?p=angle-optimized-text-embeddings)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/angle-optimized-text-embeddings/semantic-textual-similarity-on-sts-benchmark)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts-benchmark?p=angle-optimized-text-embeddings)
|
|
|
|
|
**STS Results** |
|
|
|
|
|
| Model | STS12 | STS13 | STS14 | STS15 | STS16 | STSBenchmark | SICKRelatedness | Avg. | |
|
| ------- |-------|-------|-------|-------|-------|--------------|-----------------|-------| |
|
| [SeanLee97/angle-llama-7b-nli-20231027](https://huggingface.co/SeanLee97/angle-llama-7b-nli-20231027) | 78.68 | 90.58 | 85.49 | 89.56 | 86.91 | 88.92 | 81.18 | 85.90 | |
|
| [SeanLee97/angle-llama-7b-nli-v2](https://huggingface.co/SeanLee97/angle-llama-7b-nli-v2) | 79.00 | 90.56 | 85.79 | 89.43 | 87.00 | 88.97 | 80.94 | 85.96 | |
|
| [SeanLee97/angle-llama-13b-nli](https://huggingface.co/SeanLee97/angle-llama-13b-nli) | 79.33 | 90.65 | 86.89 | 90.45 | 87.32 | 89.69 | 81.32 | **86.52** | |
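
The scores above are Spearman rank correlations (×100) between the model's cosine similarities and human gold labels, matching the `spearmanr` metric declared in the metadata. A minimal sketch of how such a score is computed, assuming `scipy` is installed; the score lists are hypothetical placeholders, not real benchmark data:

```python
from scipy.stats import spearmanr

# Hypothetical model cosine similarities and human gold scores
# for the same sentence pairs (illustration only, not real data).
cosine_scores = [0.92, 0.31, 0.78, 0.05]
gold_scores = [5.0, 1.2, 4.1, 0.4]

# STS benchmarks report the Spearman rank correlation, typically x100.
corr, _ = spearmanr(cosine_scores, gold_scores)
print(f'Spearman: {corr * 100:.2f}')
```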
|
|
|
|
|
|
|
## Usage |
|
|
|
```bash |
|
python -m pip install -U angle-emb |
|
``` |
|
|
|
```python |
|
from angle_emb import AnglE, Prompts |
|
|
|
# Initialize the base LLM and attach the AnglE LoRA weights.
angle = AnglE.from_pretrained(
    'NousResearch/Llama-2-13b-hf',
    pretrained_lora_path='SeanLee97/angle-llama-13b-nli',
    load_kbit=16,
    apply_bfloat16=False,
)
|
|
|
# Choose one of the predefined prompts to wrap input texts.
|
print('All predefined prompts:', Prompts.list_prompts()) |
|
angle.set_prompt(prompt=Prompts.A) |
|
print('prompt:', angle.prompt) |
|
|
|
# Encode a single text or a batch; to_numpy=True returns NumPy arrays.
|
vec = angle.encode({'text': 'hello world'}, to_numpy=True) |
|
print(vec) |
|
vecs = angle.encode([{'text': 'hello world1'}, {'text': 'hello world2'}], to_numpy=True) |
|
print(vecs) |
|
``` |
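
The returned embeddings are plain NumPy arrays, so comparing sentences reduces to cosine similarity. A minimal sketch, assuming only `numpy` and the `angle` model initialized in the snippet above:

```python
import numpy as np

# Encode a pair of sentences with the model from the snippet above.
vecs = angle.encode([{'text': 'hello world'}, {'text': 'hi there, world'}], to_numpy=True)

# Cosine similarity between the two embeddings.
a, b = vecs[0], vecs[1]
print('cosine similarity:', np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```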
|
|
|
## Citation |
|
|
|
You are welcome to use our code and pre-trained models. If you do, please support us by citing our work:
|
|
|
```bibtex |
|
@article{li2023angle, |
|
title={AnglE-Optimized Text Embeddings}, |
|
author={Li, Xianming and Li, Jing}, |
|
journal={arXiv preprint arXiv:2309.12871}, |
|
year={2023} |
|
} |
|
``` |