# nl2bash-custom

nl2bash-custom is a custom dataset used to fine-tune Large Language Models for Bash code generation. Fine-tune the Code Llama family of LLMs (7B, 13B, 70B) for best results.

The dataset was created by reformatting and reshuffling two original datasets:

- [nl2bash by TellinaTool](https://github.com/TellinaTool/nl2bash)
- [NLC2CMD by Magnum Research Group](https://github.com/magnumresearchgroup/Magnum-NLC2CMD)

## Dataset Structure
- `train.json`: Training split.
- `dev.json`: Development split.
- `test.json`: Test split.
## Usage

```python
from datasets import load_dataset

dataset = load_dataset("AnishJoshi/nl2bash-custom")
```
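The call above returns a `DatasetDict`. As a quick, non-authoritative sketch (the split names are assumed to mirror the JSON files listed under Dataset Structure), you can check what was loaded:

```python
# Assumption: splits are named after the JSON files (train / dev / test).
print(dataset)                    # lists the available splits and row counts
print(dataset["train"].num_rows)  # size of the training split
```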
## Features

- `srno`: Serial number of the input-output pair
- `nl_command`: The natural language input/command
- `bash_code`: The corresponding Bash code
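Below is a minimal sketch of reading these fields from a loaded split; the `train` split name follows the file layout above and is an assumption, not something verified against the hub.

```python
from datasets import load_dataset

dataset = load_dataset("AnishJoshi/nl2bash-custom")

# Each record exposes the three fields listed above.
example = dataset["train"][0]
print(example["srno"])        # serial number of the pair
print(example["nl_command"])  # natural language description of the task
print(example["bash_code"])   # the corresponding Bash command
```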
## References

@inproceedings{LinWZE2018:NL2Bash,
  author    = {Xi Victoria Lin and Chenglong Wang and Luke Zettlemoyer and Michael D. Ernst},
  title     = {NL2Bash: A Corpus and Semantic Parser for Natural Language Interface to the Linux Operating System},
  booktitle = {Proceedings of the Eleventh International Conference on Language Resources and Evaluation, {LREC} 2018, Miyazaki (Japan), 7-12 May, 2018},
  year      = {2018}
}

@article{Fu2021ATransform,
  title   = {A Transformer-based Approach for Translating Natural Language to Bash Commands},
  author  = {Quchen Fu and Zhongwei Teng and Jules White and Douglas C. Schmidt},
  journal = {2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)},
  year    = {2021},
  pages   = {1241--1244}
}

@article{fu2023nl2cmd,
  title   = {NL2CMD: An Updated Workflow for Natural Language to Bash Commands Translation},
  author  = {Fu, Quchen and Teng, Zhongwei and Georgaklis, Marco and White, Jules and Schmidt, Douglas C},
  journal = {Journal of Machine Learning Theory, Applications and Practice},
  pages   = {45--82},
  year    = {2023}
}