---
license: mit
---
|
|
|
For an explanation of this project and the models trained for it, please see the [Report](Report/REPORT.md). |
|
|
|
The root folder contains scripts for dataset preprocessing. |
|
|
|
[chess-mamba-vs-xformer](../../tree/main/chess-mamba-vs-xformer/) contains the training scripts. |
|
|
|
Config files, which set the model configuration and training hyperparameters, are in [chess-mamba-vs-xformer/config](../../tree/main/chess-mamba-vs-xformer/config).
|
|
|
Model checkpoints are in [chess-mamba-vs-xformer/out](../../tree/main/chess-mamba-vs-xformer/out). For completed models (e.g. the Mamba and Transformer 50M runs), the final checkpoint is .../anneal/anneal_complete.pt.
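As a minimal sketch of working with these checkpoint files: the helper below locates a run's final checkpoint on disk. The exact directory layout and the function name are assumptions for illustration, not part of this repository.

```python
from pathlib import Path

def final_checkpoint(out_dir: str, run_name: str) -> Path:
    """Return the final checkpoint of a completed run.

    Assumes a hypothetical layout of <out_dir>/<run_name>/anneal/anneal_complete.pt,
    loosely modeled on the checkpoint paths described above.
    """
    path = Path(out_dir) / run_name / "anneal" / "anneal_complete.pt"
    if not path.exists():
        raise FileNotFoundError(f"no completed checkpoint at {path}")
    return path
```

The returned path could then be passed to the training or evaluation scripts' checkpoint-loading code.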
|
|
|
[chess-gpt-eval](../../tree/main/chess-gpt-eval/) has the scripts for model evaluation: playing games against the Stockfish or lc0 chess engines. The logs folder contains raw evaluation metrics.
|
|
|
[chess-gpt-eval-contrastive](../../tree/main/chess-gpt-eval-contrastive/) likewise has the scripts for model evaluation, modified for training and evaluating contrastive activations and linear probes. Its logs folder again contains raw evaluation metrics.