Update README.md
README.md
CHANGED
@@ -2,7 +2,6 @@
 datasets:
 - USERNAME/QueryBridge
 ---
-This model is a fine-tuned version of llama3 using LoRA. We used TorchTune to fine-tune the model. Below, you will find a section on how we fine-tuned it.
 
 # Model Overview
 
@@ -10,8 +9,6 @@ This model is a fine-tuned version of llama3 using the [QueryBridge dataset](htt
 
 The tagged questions in the QueryBridge dataset are designed to train language models to understand the components and structure of a question effectively. By annotating questions with specific tags such as `<qt>`, `<p>`, `<o>`, and `<s>`, we provide a detailed breakdown of each question's elements, which aids the model in grasping the roles of different components.
 
-For example, the video below demonstrates how a model can be trained to interpret these tagged questions. We convert these annotated questions into a graph representation, which visually maps out the relationships and roles within the question. This graph-based representation facilitates the construction of queries in various query languages such as SPARQL, SQL, Cypher, and others, by translating the structured understanding into executable query formats. This approach not only enhances the model’s ability to parse and generate queries across different languages but also ensures consistency and accuracy in query formulation.
-
 <a href="https://youtu.be/J_N-6m8fHz0">
 <img src="https://cdn-uploads.huggingface.co/production/uploads/664adb4a691370727c200af0/sDfp7DiYrGKvH58KdXOIY.png" alt="Training Model with Tagged Questions" width="400" height="300" />
 </a>
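As a rough illustration of the tagging scheme described above, the annotated components can be pulled out of a question with a few lines of Python. This is only a sketch, not part of the repository: the tag names come from the README's table, while the example question, regex, and function name are hypothetical.

```python
import re

# Matches the QueryBridge-style tags described in the README:
# <qt> question type, <p> predicate, <o> object, <s> subject, <ref> reference.
TAG_PATTERN = re.compile(r"<(qt|p|o|s|ref)>(.*?)</\1>")

def parse_tagged_question(question):
    """Return (tag, text) pairs in the order they appear in the question."""
    return TAG_PATTERN.findall(question)

tagged = ('<qt>Who</qt> is the <o>CEO</o> of the <s>company</s> '
          '<p>founded by</p> <ref>himself</ref>?')
print(parse_tagged_question(tagged))
# [('qt', 'Who'), ('o', 'CEO'), ('s', 'company'), ('p', 'founded by'), ('ref', 'himself')]
```

The ordered (tag, text) pairs are exactly the pieces the graph representation below is built from.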
@@ -33,9 +30,12 @@ The tagging system categorizes different components of the question as follows:
 | `<ref>`| **References**: Tags in questions that refer back to previously mentioned entities or concepts. These can indicate cycles or self-references in queries. Example: In "Who is the CEO of the company founded by himself?", the word 'himself' is tagged as `<ref>himself</ref>`. |
 
 
-
+<details>
+<summary>How to use the model?</summary>
 To use the model, you can run it with TorchTune commands. I have provided the necessary Python code to automate the process. Follow these steps to get started:
 
+- **Note:** Replace each `USERNAME` with your username.
+
 ### Step 1: Create a Configuration File
 First, save a file named `custom_generation_config_bigModel.yaml` in `/home/USERNAME/` with the following content:
 
@@ -165,7 +165,7 @@ To run the script and generate tagged questions, execute the following command i
 ```bash
 python command.py
 ```
-
+</details>
 
 
 ## How we fine-tuned the model
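The graph-to-query idea in the README can be sketched end to end: once a subject and predicate have been recovered from the tags, emitting a query string is mostly string assembly. This is a simplified illustration only; the `:` vocabulary prefix, the example names, and the naive word-to-IRI mapping are hypothetical, and the pipeline described above is more involved than this.

```python
# Sketch of the graph-to-SPARQL step described in the README: turn one
# (subject, predicate) pair recovered from a tagged question into a query.
# The ":" vocabulary and the direct word-to-IRI mapping are hypothetical.
def triple_to_sparql(subject, predicate):
    subj = subject.strip().replace(" ", "_")
    pred = predicate.strip().lower().replace(" ", "_")
    return f"SELECT ?answer WHERE {{ ?answer :{pred} :{subj} . }}"

print(triple_to_sparql("Niger river", "flows through"))
# SELECT ?answer WHERE { ?answer :flows_through :Niger_river . }
```

The same structured understanding could target SQL or Cypher instead; only this final serialization step changes.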