---
language:
- en
license: apache-2.0
task_categories:
- text-generation
---

# ScholarCopilot-Data-v1

ScholarCopilot-Data-v1 contains the corpus data and embedded vectors of [Scholar Copilot](https://github.com/TIGER-AI-Lab/ScholarCopilot). Scholar Copilot improves the academic writing process by seamlessly integrating automatic text completion and intelligent citation suggestions into a cohesive, human-in-the-loop AI-driven pipeline. Designed to enhance productivity and creativity, it provides researchers with high-quality text generation and precise citation recommendations powered by iterative and context-aware Retrieval-Augmented Generation (RAG).
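
As a quick sanity check, the corpus can be loaded with the 🤗 `datasets` library. A minimal sketch, assuming the repo id shown on this card and a default `train` split (the column names depend on the actual schema, so inspect them first):

```python
# Minimal loading sketch -- the dataset id and split are assumptions
# based on this card; inspect the schema before relying on field names.
from datasets import load_dataset

ds = load_dataset("TIGER-Lab/ScholarCopilot-Data-v1", split="train")
print(ds.column_names)  # discover the actual fields (ids, text, vectors, ...)
print(ds[0])            # peek at the first corpus record
```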

The current version of Scholar Copilot leverages a state-of-the-art 7-billion-parameter large language model (LLM) trained on the complete arXiv full-paper corpus. This unified retrieval-and-generation model makes context-sensitive decisions about when to cite, what to cite, and how to generate coherent content based on the referenced papers.
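
The embedded vectors are what the retrieval side of such a pipeline consumes. Below is a generic dense-retrieval sketch of how citation candidates could be ranked against them; `query_vec` and `corpus_embeddings` are hypothetical stand-ins, not the project's actual API:

```python
import numpy as np

def top_k_citations(query_vec: np.ndarray, corpus_embeddings: np.ndarray, k: int = 5):
    """Rank corpus papers by cosine similarity to a query embedding.

    Generic illustration only -- ScholarCopilot's own retrieval is driven
    by its unified 7B model, not by this standalone helper.
    """
    q = query_vec / np.linalg.norm(query_vec)
    c = corpus_embeddings / np.linalg.norm(corpus_embeddings, axis=1, keepdims=True)
    scores = c @ q                  # cosine similarity against every paper
    return np.argsort(-scores)[:k]  # indices of the k best-matching papers
```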

Links: [paper](https://arxiv.org/abs/2504.00824) | [model](https://huggingface.co/TIGER-Lab/ScholarCopilot-v1) | [demo](https://huggingface.co/spaces/TIGER-Lab/ScholarCopilot)

## 🌟 Key Features

- **📝 Next-3-Sentence Suggestions**: Facilitates writing by predicting the next sentences with automatic retrieval and citation of relevant reference papers.
- **✨ Full Section Auto-Completion**: Assists in brainstorming and drafting comprehensive paper content and structure.

The current version of ScholarCopilot primarily focuses on the introduction and related work sections of academic papers. We will support full-paper writing in future releases.

## Citation

Please cite our paper with:

```bibtex
@article{wang2025scholarcopilot,
  title   = {ScholarCopilot: Training Large Language Models for Academic Writing with Accurate Citations},
  author  = {Wang, Yubo and Ma, Xueguang and Nie, Ping and Zeng, Huaye and Lyu, Zhiheng and Zhang, Yuxuan and Schneider, Benjamin and Lu, Yi and Yue, Xiang and Chen, Wenhu},
  journal = {arXiv preprint arXiv:2504.00824},
  year    = {2025}
}
```