jiminHuang committed
Commit 58d4754
Parent(s): 09b4141
Update README.md

README.md CHANGED
---
datasets:
- chancefocus/pixiu
- ChanceFocus/FLUPE
language:
- en
inference: false
license: mit
metrics:
- accuracy
- exact_match
- f1
library_name: transformers
tags:
- finance
- llama
- llms
---

# FinMA-7B-NLP

FinMA-7B-NLP is a financial large language model (LLM) developed as part of the [PIXIU project](https://github.com/chancefocus/PIXIU). It is designed to understand complex financial language and concepts, and is fine-tuned to follow natural language instructions, enhancing its performance in downstream financial tasks. Specifically, FinMA-7B-NLP is trained only on the NLP tasks of the PIXIU dataset, making it specialized for tasks such as sentiment analysis, news headline classification, named entity recognition, and question answering.

## Other Models in the PIXIU Project

In addition to FinMA-7B-NLP, the PIXIU project includes two other models: FinMA-7B-full and FinMA-30B.

- **FinMA-7B-full**: This model is trained with the full instruction data from the PIXIU dataset, covering both NLP and prediction tasks. This makes it a more comprehensive model capable of handling a wider range of financial tasks.

- **FinMA-30B**: This model is a larger version of FinMA, fine-tuned on the LLaMA-30B model. Like FinMA-7B-NLP, it is trained with the NLP instruction data.

## Usage

You can use the FinMA-7B-NLP model in your Python project with the Hugging Face Transformers library. Here is a simple example of how to load the model:

```python
from transformers import LlamaTokenizer, LlamaForCausalLM

tokenizer = LlamaTokenizer.from_pretrained('ChanceFocus/finma-7b-nlp')
model = LlamaForCausalLM.from_pretrained('ChanceFocus/finma-7b-nlp', device_map='auto')
```

In this example, `LlamaTokenizer` loads the tokenizer and `LlamaForCausalLM` loads the model. The `device_map='auto'` argument automatically places the model on a GPU when one is available.
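Once the tokenizer and model are loaded, you can run generation. The snippet below is a minimal sketch: the sentiment-style prompt is purely illustrative (the exact instruction template used for PIXIU fine-tuning may differ), and generation settings such as `max_new_tokens=20` are assumptions you may want to adjust for your task.

```python
# Minimal generation sketch; assumes `tokenizer` and `model` from the snippet above.
# The prompt is illustrative only; the exact instruction format used during
# PIXIU fine-tuning may differ.
prompt = (
    "Analyze the sentiment of this statement extracted from a financial news article. "
    "Provide your answer as either negative, positive, or neutral.\n"
    "Text: Operating profit rose compared to the same period a year earlier.\n"
    "Answer:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# Greedy decoding with a small generation budget; tune as needed.
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```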
## Hosted Inference API

You can also use the model through the Hugging Face Inference API. This allows you to generate text without having to set up your own inference environment. The model can be loaded on the Inference API on-demand.
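Note that the card metadata above sets `inference: false`, so the hosted widget may be disabled. If the Inference API is available for this checkpoint, a minimal sketch of such a call (assuming you have a Hugging Face access token, read here from a hypothetical `HF_TOKEN` environment variable) could look like this:

```python
import os
import requests

# Standard Hugging Face Inference API endpoint pattern for this model (sketch only).
API_URL = "https://api-inference.huggingface.co/models/ChanceFocus/finma-7b-nlp"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}  # your HF access token

payload = {
    "inputs": "What is the sentiment of this headline: 'Company beats quarterly earnings expectations'?",
    "parameters": {"max_new_tokens": 20},  # generation budget is an assumption; tune as needed
}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())
```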
## License

FinMA-7B-NLP is licensed under MIT. For more details, please see the MIT license.

## About

This model is part of the PIXIU project, an open-source resource featuring the first financial large language models (LLMs), instruction tuning data, and evaluation benchmarks to holistically assess financial LLMs. The goal is to continually push forward the open-source development of financial artificial intelligence (AI).

For more information, you can visit the [PIXIU](https://github.com/chancefocus/PIXIU) project on GitHub.

## Citation

If you use FinMA-7B-NLP in your work, please cite the PIXIU paper:

```bibtex
@misc{xie2023pixiu,
      title={PIXIU: A Large Language Model, Instruction Data and Evaluation Benchmark for Finance},
      author={Qianqian Xie and Weiguang Han and Xiao Zhang and Yanzhao Lai and Min Peng and Alejandro Lopez-Lira and Jimin Huang},
      year={2023},
      eprint={2306.05443},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```