# Text2SQL Task T5-Base + E-commerce pre-training
This is our T5-base model, pre-trained on 18k e-commerce pages from popular blogs and fine-tuned on Spider using schema serialization.
## Running the model
Our approach is inspired by [Picard](https://github.com/ElementAI/picard/), with an added pre-training step for better performance on e-commerce data. The model input is serialized as:
```
[question] | [db_id] | [table] : [column] ( [content] , [content] ) , [column] ( ... ) , [...] | [table] : ... | ...
```
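As a minimal sketch of how such an input string can be assembled (the helper name `serialize` and the example question, database id, and schema are illustrative assumptions, not part of this repo):

```python
def serialize(question, db_id, schema):
    """Build a model input in the serialization format above.

    schema maps each table name to a list of (column, contents) pairs,
    where contents is an optional list of sample values for that column.
    """
    tables = []
    for table, columns in schema.items():
        cols = " , ".join(
            f"{col} ( {' , '.join(vals)} )" if vals else col
            for col, vals in columns
        )
        tables.append(f"{table} : {cols}")
    return f"{question} | {db_id} | " + " | ".join(tables)

print(serialize(
    "How many orders were placed?",
    "ecommerce",
    {"orders": [("id", []), ("status", ["shipped", "pending"])]},
))
# → How many orders were placed? | ecommerce | orders : id , status ( shipped , pending )
```

The resulting string is what you would pass to the tokenizer before calling the model's `generate` method.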