Update README.md
README.md CHANGED
@@ -44,10 +44,6 @@ The LongForm dataset is created by leveraging English corpus examples with augme
 
 Github Repo: https://github.com/akoksal/LongForm
 
-LongForm-**OPT-2.7B**: https://huggingface.co/akoksal/LongForm-OPT-2.7B
-
-LongForm-**OPT-6.7B**: https://huggingface.co/akoksal/LongForm-OPT-6.7B
-
 ## How to Load
 ```python
 import torch
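The "How to Load" block is truncated in this hunk: only `import torch` is visible. As a minimal sketch of how one of the linked checkpoints might be loaded with Hugging Face `transformers`, assuming the standard `AutoTokenizer`/`AutoModelForCausalLM` API (the model name and generation settings below are illustrative, not quoted from the README):

```python
# Sketch only: the README's actual loading snippet is cut off in this diff,
# so everything past `import torch` here is an assumption.
MODEL_NAME = "akoksal/LongForm-OPT-125M"  # smallest linked checkpoint, for quick tests


def generate(instruction: str, max_new_tokens: int = 64) -> str:
    """Load a LongForm checkpoint and complete the given instruction."""
    # Lazy imports keep the sketch importable without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_NAME,
        torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
    )
    inputs = tokenizer(instruction, return_tensors="pt")
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Swapping `MODEL_NAME` for any of the other LongForm-OPT checkpoints should work the same way, since they share the OPT architecture.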
@@ -81,11 +77,19 @@ We provide in-depth evaluation of LongForm models and baselines in the paper. We
 | [**LongForm-OPT-6.7B**](https://huggingface.co/akoksal/LongForm-OPT-6.7B) | 17.7 | 16.9 | 17.2 | 19.0 |
 | [**LongForm-LLaMA-7B**](https://huggingface.co/akoksal/LongForm-LLaMA-7B-diff)‡ | **19.7** | **21.7** | **18.6** | 18.9 |
 
+Smaller versions of LongForm-OPT models are also available:
+- [**LongForm-OPT-1.3B**](https://huggingface.co/akoksal/LongForm-OPT-1.3B)
+- [**LongForm-OPT-350M**](https://huggingface.co/akoksal/LongForm-OPT-350M)
+- [**LongForm-OPT-125M**](https://huggingface.co/akoksal/LongForm-OPT-125M)
+
 ‡: We can only release the difference between LongForm-LLaMA-7B and pretrained LLaMA-7B publicly, due to the restrictions of the LLaMA models.
 
 ## Limitations
 The LongForm dataset and models mainly focus on long text generation and have limitations regarding structured prediction tasks in NLP. Additionally, we observe that LongForm models may present hallucination problems similar to those found in LLMs.
 
+## License
+The LongForm project is subject to an MIT License with custom limitations, owing to restrictions imposed by OpenAI (for the instruction generation part) as well as the licenses of the underlying language models (OPT, LLaMA, and T5).
+
 
 ## Citation
 ```
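The ‡ footnote means users of LongForm-LLaMA-7B must reconstruct the full checkpoint themselves by adding the released difference to the original LLaMA-7B weights. A hedged sketch of that idea, assuming a simple per-tensor `finetuned - base` convention (the diff format actually used by `LongForm-LLaMA-7B-diff` may differ, and any conversion script shipped in the repo should be preferred):

```python
# Hypothetical helper: reconstruct finetuned weights from a released weight diff.
# Assumes the diff stores (finetuned - base) for every parameter; this convention
# is an assumption for illustration, not taken from the LongForm repo.
def apply_weight_diff(base_state: dict, diff_state: dict) -> dict:
    missing = set(diff_state) - set(base_state)
    if missing:
        raise KeyError(f"diff has parameters absent from base: {sorted(missing)}")
    # Plain `+` works for the numbers used here; with torch tensors the same
    # expression performs element-wise addition.
    return {name: base_state[name] + diff_state[name] for name in diff_state}
```

With real checkpoints, `base_state` and `diff_state` would be the `state_dict()` of pretrained LLaMA-7B and the downloaded diff, respectively.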