Hwanjun committed
Commit 669ad08 · verified · 1 Parent(s): 072b546

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -22,7 +22,7 @@ Surprisingly, it outperforms the nearly 10x larger Llama3-70B-Instruct while off
 
 Please refer to [our paper](link) to catch up how to exploit LLM-generated feedback in the context of text summarization.
 
- We also released a larger model, **SummLlama3-70B**. Please go to the [Huggingface link](https://huggingface.co/DISLab/SummLlama3-70B) for this model.
+ We are plan to release its larger model, **SummLlama3-70B**. Please go to the [Huggingface link](https://huggingface.co/DISLab/SummLlama3-70B) for this model.
 
 ### *Recommended Prompt for Text Summarization:*
 
@@ -95,6 +95,8 @@ See an example how the summary improved by SummLlama3-8B over Llama3-8/70B-Instr
 | Llama3-70B-Instruct | Person 1 invites Person 2 to their house for Thanksgiving dinner on Thursday. Person 2 is grateful for the invitation and explains they can't go to Finland with their parents due to work commitments. Person 1 is happy to have them celebrate with their family, which will consist of 10 people. When Person 2 offers to bring something, Person 1 declines, but eventually agrees to let them bring a bottle of wine. |
 | SummLlama3-8B | The conversation is about Thanksgiving dinner arrangements, with Person2 (Paul) confirming his attendance at Person1's (host's) house on Thursday, discussing what to bring and what's already been taken care of, and finalizing the offer to bring wine instead of pie since someone else is handling desserts. |
 
+ ## Example
+
 The summary of SummLlama3-8B can be considered a much human-preferred summary for the following reasons:
 
 **Core Focus:** The summary accurately captures the main theme of the conversation, which revolves around the Thanksgiving dinner arrangements. It highlights how the two people confirm plans, discuss what to bring, and finalize the decision for Person2 to bring wine instead of pie. This maintains the core context.
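
For readers who want to try the model discussed in this diff, below is a minimal sketch of running SummLlama3-8B with the `transformers` library. Two assumptions are hedged here: the 8B checkpoint is taken to live at `DISLab/SummLlama3-8B` (only the 70B repository is linked in the changed lines), and the summarization prompt is a generic stand-in for the README's "Recommended Prompt for Text Summarization", which this hunk does not show.

```python
# Minimal sketch: summarize a dialogue with SummLlama3-8B via transformers.
# Assumptions: the repo id "DISLab/SummLlama3-8B" is inferred from the 70B link in
# the diff, and the prompt below is a placeholder for the README's recommended
# prompt, which is not included in the changed lines. The dialogue is a toy example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DISLab/SummLlama3-8B"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Toy dialogue in the #Person1#/#Person2# style used by the example in the README.
dialogue = (
    "#Person1#: Hey Paul, come over for Thanksgiving dinner on Thursday.\n"
    "#Person2#: Thanks, I'd love to. Should I bring a pie or some wine?"
)
messages = [
    {"role": "user", "content": f"Please summarize the input dialogue.\n\n{dialogue}"}
]

# Build the chat-formatted input and generate a summary deterministically.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens (the summary), not the prompt.
summary = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(summary)
```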