vanilla1116 committed on
Commit 4f6ee6a · verified · 1 Parent(s): ed968bd

Update README.md

Files changed (1):
  1. README.md +3 -8
README.md CHANGED
@@ -9,18 +9,13 @@ license: apache-2.0
 
 This page holds the ANAH-v2 model which is trained base on the Internlm2-7B. It is fine-tuned to annotate the hallucination in LLM's responses.
 
-More information please refer to our [project page](https://open-compass.github.io/ANAH/).
+More information please refer to our [project page](https://github.com/open-compass/ANAH).
 
 ## 🤗 How to use the model
 
-You have to follow the prompt in [our paper](https://arxiv.org/abs/2407.04693) to annotate the hallucination and you can find it easily [here](https://github.com/open-compass/ANAH/blob/main/prompt_v2.py).
+You have to follow the prompt in [our paper](https://arxiv.org/abs/2407.04693) to annotate the hallucination and you can find it easily [here](https://github.com/open-compass/ANAH/blob/main/example/anahv2_prompt.py).
 
-The models follow the conversation format of InternLM2-chat, with the template protocol as:
-
-```python
-dict(role='user', begin='<|im_start|>user\n', end='<|im_end|>\n'),
-dict(role='assistant', begin='<|im_start|>assistant\n', end='<|im_end|>\n'),
-```
+We also provide some [examples](https://github.com/open-compass/ANAH/blob/main/example) of using the ANAH-v2 annotator, which you can refer to for annotating your content.
 
 ## 🖊️ Citation
 
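The template protocol removed by this commit describes how InternLM2-chat wraps each conversation turn in `begin`/`end` tokens. A minimal sketch of applying that template is below; `build_prompt` and the sample message are illustrative assumptions, not part of the ANAH codebase:

```python
# Begin/end tokens per role, as given in the removed README lines.
TEMPLATE = {
    'user': {'begin': '<|im_start|>user\n', 'end': '<|im_end|>\n'},
    'assistant': {'begin': '<|im_start|>assistant\n', 'end': '<|im_end|>\n'},
}

def build_prompt(messages):
    """Hypothetical helper: wrap each (role, content) pair in its tokens."""
    parts = []
    for role, content in messages:
        t = TEMPLATE[role]
        parts.append(t['begin'] + content + t['end'])
    # Leave the assistant turn open so generation continues from here.
    return ''.join(parts) + TEMPLATE['assistant']['begin']

prompt = build_prompt([('user', 'Annotate hallucinations in: ...')])
```

In practice, the same formatting is typically handled by the tokenizer's chat template rather than by hand.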