---
language:
- en
metrics:
- accuracy
---
# GraphGPT
GraphGPT is a graph-oriented Large Language Model tuned with the Graph Instruction Tuning paradigm.
## Model Details
GraphGPT is a graph-oriented Large Language Model tuned with the Graph Instruction Tuning paradigm, based on the [Vicuna-7B-v1.5 model](https://huggingface.co/lmsys/vicuna-7b-v1.5).
* Developed by: [Data Intelligence Lab](https://sites.google.com/view/chaoh/group-join-us)@HKU
* Model type: An auto-regressive language model based on the transformer architecture.
* Finetuned from model: [Vicuna-7B-v1.5 model](https://huggingface.co/lmsys/vicuna-7b-v1.5).
## Model Sources
* Repository: [https://github.com/HKUDS/GraphGPT](https://github.com/HKUDS/GraphGPT)
* Paper: [GraphGPT: Graph Instruction Tuning for Large Language Models](https://arxiv.org/abs/2310.13023)
* Project: [https://graphgpt.github.io/](https://graphgpt.github.io/)
## Uses
This version of GraphGPT is tuned on mixed instruction data, enabling it to handle both node classification and link prediction across different graph datasets.
## How to Get Started with the Model
* Command line interface: Please refer to [https://github.com/HKUDS/GraphGPT](https://github.com/HKUDS/GraphGPT) to evaluate GraphGPT; a minimal loading sketch is shown below.
* A Gradio demo is under development.
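
Below is a minimal sketch of loading the checkpoint's language-model backbone with the standard `transformers` API, assuming the repo id `Jiabin99/GraphGPT-7B-mix-all` for this card. Note that the full graph-instruction pipeline (graph encoder, projector, and graph tokens) is implemented in the GitHub repository above and is not reproduced here; this sketch supports plain text generation only.

```python
# Minimal sketch: load the Vicuna-based language-model backbone.
# The repo id below is an assumption based on this model card's name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Jiabin99/GraphGPT-7B-mix-all"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a 7B model on one GPU
    device_map="auto",
)

# Plain text generation only; graph-structured inputs require the
# GraphGPT codebase linked in the Model Sources section.
prompt = "What is node classification on a citation graph?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```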