---
license: apache-2.0
language:
- multilingual
- af
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fil
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- hi
- hmn
- ht
- hu
- hy
- ig
- is
- it
- iw
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tr
- uk
- und
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
library_name: transformers
---
## Links for Reference
- **Repository:** https://github.com/kaistAI/LangBridge
- **Paper:** [LangBridge: Multilingual Reasoning Without Multilingual Supervision](https://arxiv.org/pdf/2401.10695.pdf)
- **Point of Contact:** [email protected]
# TL;DR
🤔 LMs that are good at reasoning are mostly English-centric (MetaMath, Orca 2, etc.).
😃 Let’s adapt them to solve multilingual tasks. BUT without using multilingual data!
LangBridge “bridges” an mT5 encoder and the target LM together while using only English data. At test time, LangBridge models can solve multilingual reasoning tasks effectively.
![image/png](./figure2.png)
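For intuition only, here is a minimal PyTorch sketch of the bridging idea, not the official implementation: the multilingual encoder's hidden states are projected into the LM's embedding space and prepended to the LM's inputs. The single linear alignment layer, class name, and hidden sizes below are assumptions for illustration.

```python
# Conceptual sketch (assumptions, not the official LangBridge code):
# a multilingual encoder's outputs are mapped into the target LM's
# embedding space so the English-trained LM can read multilingual inputs.
import torch
import torch.nn as nn


class BridgeSketch(nn.Module):
    def __init__(self, enc_hidden: int, lm_hidden: int):
        super().__init__()
        # Linear mapping from encoder hidden size to LM embedding size.
        self.proj = nn.Linear(enc_hidden, lm_hidden)

    def forward(self, enc_states: torch.Tensor, lm_embeds: torch.Tensor) -> torch.Tensor:
        # Project encoder states and prepend them to the LM's token embeddings.
        soft_prompts = self.proj(enc_states)                  # (batch, src_len, lm_hidden)
        return torch.cat([soft_prompts, lm_embeds], dim=1)    # (batch, src_len + tgt_len, lm_hidden)


if __name__ == "__main__":
    enc_states = torch.randn(1, 16, 2048)  # e.g. an mT5-XL-sized encoder (assumed)
    lm_embeds = torch.randn(1, 8, 4096)    # e.g. a Llama-2-7B-sized LM (assumed)
    bridged = BridgeSketch(2048, 4096)(enc_states, lm_embeds)
    print(bridged.shape)                   # torch.Size([1, 24, 4096])
```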
# Usage
Please refer to the [Github repository](https://github.com/kaistAI/LangBridge) for detailed usage examples.
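As a rough starting point, the snippet below sketches how a LangBridge checkpoint might be loaded and queried. The `langbridge` package import, the `LangBridgeModel` class, the tokenizer/model IDs, and the generation call are assumptions and placeholders; the GitHub repository is the authoritative reference.

```python
# Hedged sketch only -- package name, class name, and IDs below are assumptions;
# verify against the repository's documented usage.
from transformers import AutoTokenizer
from langbridge import LangBridgeModel  # assumed to be provided by the linked repository

model_id = "kaist-ai/metamath-langbridge-9b"  # placeholder: replace with this model's ID

# LangBridge reads inputs through a multilingual encoder tokenizer, while the
# target LM's tokenizer is used to decode the generated answer.
enc_tokenizer = AutoTokenizer.from_pretrained("kaist-ai/langbridge_encoder_tokenizer")  # assumed ID
lm_tokenizer = AutoTokenizer.from_pretrained(model_id)

model = LangBridgeModel.from_pretrained(model_id).to("cuda")

prompt = "다음 문제를 풀어 주세요: 3 + 5 = ?"  # multilingual input, even though training used English only
enc_inputs = enc_tokenizer(prompt, return_tensors="pt").to("cuda")
output_ids = model.generate(**enc_inputs, max_new_tokens=128)  # generation interface assumed
print(lm_tokenizer.decode(output_ids[0], skip_special_tokens=True))
```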
# Related Models
[Check out other LangBridge models.](https://huggingface.co/collections/kaist-ai/langbridge-65afbbdae50627e40ca58f9a)
We provide LangBridge versions of:
- Llama 2
- Llemma
- MetaMath
- Code Llama
- Orca 2
# Citation
If you find this model helpful, please consider citing our paper!
**BibTeX:**
```bibtex
@misc{yoon2024langbridge,
  title={LangBridge: Multilingual Reasoning Without Multilingual Supervision},
  author={Dongkeun Yoon and Joel Jang and Sungdong Kim and Seungone Kim and Sheikh Shafayat and Minjoon Seo},
  year={2024},
  eprint={2401.10695},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```