# DataBack: Dataset of SAT Formulas and Backbone Variable Phases

## What is DataBack

`DataBack` is a dataset of SAT formulas in CNF format, each labeled with the phases of its backbone variables. It contains two subsets: the pre-training set, named `DataBack-PT`, and the fine-tuning set, named `DataBack-FT`. The backbone variable phases were extracted with the state-of-the-art backbone extractor [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf). We allocated a timeout of 1,000 seconds per pre-training formula and 5,000 seconds per fine-tuning formula, reflecting the greater difficulty of the fine-tuning formulas.

The `DataBack` dataset has been employed to both pre-train and fine-tune our `NeuroBack` model, which has demonstrated significant improvements in SAT solving efficiency. For an in-depth exploration of `DataBack`, please refer to our [`NeuroBack`](https://arxiv.org/pdf/2110.14053.pdf) or [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf) papers.

## Contributors

Wenxi Wang ([email protected]), Yang Hu ([email protected])

## References

If you use `DataBack` in your research, please kindly cite the following papers.

[`NeuroBack`](https://arxiv.org/pdf/2110.14053.pdf) paper:

```bib
@article{wang2023neuroback,
  author  = {Wang, Wenxi and
             Hu, Yang and
             Tiwari, Mohit and
             Khurshid, Sarfraz and
             McMillan, Kenneth L. and
             Miikkulainen, Risto},
  title   = {NeuroBack: Improving CDCL SAT Solving using Graph Neural Networks},
  journal = {arXiv preprint arXiv:2110.14053},
  year    = {2021}
}
```

[`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf) paper:

```bib
@inproceedings{biere2023cadiback,
  title        = {CadiBack: Extracting Backbones with CaDiCaL},
  author       = {Biere, Armin and Froleyks, Nils and Wang, Wenxi},
  booktitle    = {26th International Conference on Theory and Applications of Satisfiability Testing (SAT 2023)},
  year         = {2023},
  organization = {Schloss Dagstuhl-Leibniz-Zentrum f{\"u}r Informatik}
}
```

## Directory Structure

```
|- original                  # Original SAT formulas and their backbone variable phases
|  |- cnf_pt.tar.gz          # SAT formulas (in CNF format) for model pre-training
|  |- bb_pt.tar.gz           # Backbone phases for pre-training formulas
|  |- cnf_ft.tar.gz          # SAT formulas (in CNF format) for model fine-tuning
|  |- bb_ft.tar.gz           # Backbone phases for fine-tuning formulas
|
|- dual                      # Dual SAT formulas and their backbone variable phases
|  |- dual_cnf_pt.tar.gz     # Dual SAT formulas (in CNF format) for model pre-training
|  |- dual_bb_pt.tar.gz      # Backbone phases for dual pre-training CNFs
|  |- dual_cnf_ft.tar.gz     # Dual SAT formulas (in CNF format) for model fine-tuning
|  |- dual_bb_ft.tar.gz      # Backbone phases for dual fine-tuning CNFs
```
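
Each archive is a standard gzip-compressed tarball. As a minimal sketch (the output directory names are illustrative choices, not prescribed by the dataset), the original pre-training archives can be unpacked with Python's `tarfile` module:

```python
# Minimal sketch: unpack the original pre-training archives with Python's tarfile.
# The output directory names are illustrative, not part of the dataset layout.
import tarfile

for archive in ("original/cnf_pt.tar.gz", "original/bb_pt.tar.gz"):
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(path=archive.removesuffix(".tar.gz"))
```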

## File Naming Convention

In the `original` directory, each CNF tar file (`cnf_*.tar.gz`) contains compressed CNF files named `[cnf_name].[compression_format]`, where `[compression_format]` may be bz2, lzma, xz, gz, etc. Correspondingly, each backbone tar file (`bb_*.tar.gz`) contains compressed backbone files named `[cnf_name].backbone.xz`. A compressed CNF file always shares its `[cnf_name]` with its associated compressed backbone file.
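
The sketch below pairs each compressed CNF file with its backbone file by stripping the trailing compression extension to recover `[cnf_name]`. It assumes the archives have already been extracted into `cnf_pt/` and `bb_pt/`, which are illustrative directory names rather than part of the dataset:

```python
# Minimal sketch: pair each compressed CNF file with its backbone file via [cnf_name].
# Assumes the archives were extracted into "cnf_pt/" and "bb_pt/" (illustrative names).
from pathlib import Path

def cnf_name(path: Path) -> str:
    # Strip the single trailing compression extension (.bz2, .lzma, .xz, .gz, ...).
    return path.name.rsplit(".", 1)[0]

pairs = []
for cnf_file in sorted(Path("cnf_pt").iterdir()):
    backbone_file = Path("bb_pt") / f"{cnf_name(cnf_file)}.backbone.xz"
    if backbone_file.exists():
        pairs.append((cnf_file, backbone_file))
```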

In the `dual` directory, the naming convention is the same, except that each CNF or backbone tar file carries a `dual_` prefix and each compressed CNF or backbone file carries a `d_` prefix, indicating that it pertains to a dual SAT formula. For example, the dual counterpart of `[cnf_name].xz` is stored as `d_[cnf_name].xz`, with its backbone phases in `d_[cnf_name].backbone.xz`.

## Format of the Extracted Backbone File

The extracted backbone file (`*.backbone`) adheres to the output format of [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf): each line contains the letter 'b' followed by an integer, whose absolute value is the ID of a backbone variable and whose sign gives that variable's phase.
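
A minimal parsing sketch under that reading, assuming (based on CadiBack's output conventions rather than anything stated here) that a positive integer means the backbone phase is true, a negative one means false, and that a terminating `b 0` line, if present, should be skipped:

```python
# Minimal sketch: read backbone phases from a (possibly xz-compressed) backbone file.
# Assumes "b <int>" lines where the sign encodes the phase; a terminating "b 0" line,
# if present, is skipped.
import lzma

def read_backbone(path: str) -> dict[int, bool]:
    phases = {}
    opener = lzma.open if path.endswith(".xz") else open
    with opener(path, "rt") as f:
        for line in f:
            tokens = line.split()
            if len(tokens) == 2 and tokens[0] == "b" and tokens[1] != "0":
                literal = int(tokens[1])
                phases[abs(literal)] = literal > 0  # True means positive phase
    return phases
```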