---
license: mit
datasets:
- nsarrazin/lichess-games-2023-01
pipeline_tag: text-generation
tags:
- chess
---

A 231M-parameter base model trained on 4.4B tokens of lichess games from January 2023 that ended in checkmate (games won on time were filtered out).

## Inference

```py
from transformers import GPT2LMHeadModel, AutoTokenizer

model = GPT2LMHeadModel.from_pretrained("nsarrazin/chessformer").eval()
tokenizer = AutoTokenizer.from_pretrained("nsarrazin/chessformer")

moves = " ".join(["e2e4", "e7e5", "d2d4", "d7d5"])
model_inputs = tokenizer(moves, return_tensors="pt")

# Generate a single token: the model's predicted next move
gen_tokens = model.generate(**model_inputs, max_new_tokens=1)[0]
next_move = tokenizer.decode(gen_tokens[-1])
print(next_move)  # d4e5
```

### End of game detection

The model also has three special tokens for end-of-game detection: ``, `` and ``. This can be useful for implementing beam search strategies (see the sketch after the example below).

```py
moves = " ".join(["f2f3", "e7e5", "g2g4", "d8h4"])  # Fool's mate: black checkmates white
model_inputs = tokenizer(moves, return_tensors="pt")

gen_tokens = model.generate(**model_inputs, max_new_tokens=1)[0]
next_move = tokenizer.decode(gen_tokens[-1])
print(next_move)  # prints an end-of-game special token
```
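
The sketch below shows one way the end-of-game tokens could feed a beam-search-style move selection: take the top-k next-move candidates, play each one, and check whether the model's follow-up prediction is an end-of-game token. It assumes the end-of-game markers are registered as special tokens on the tokenizer (so they appear in `tokenizer.all_special_ids`); the helper `top_candidate_moves` and the choice of `k = 3` are illustrative, not part of the model's API.

```py
import torch
from transformers import GPT2LMHeadModel, AutoTokenizer

model = GPT2LMHeadModel.from_pretrained("nsarrazin/chessformer").eval()
tokenizer = AutoTokenizer.from_pretrained("nsarrazin/chessformer")

# Assumption: the end-of-game markers are special tokens, so their ids
# can be looked up on the tokenizer.
end_of_game_ids = set(tokenizer.all_special_ids)

def top_candidate_moves(moves: str, k: int = 3):
    """Return the k most likely next moves and, for each, whether the
    model's follow-up prediction is an end-of-game token."""
    input_ids = tokenizer(moves, return_tensors="pt").input_ids

    # Distribution over the next token given the moves played so far
    with torch.no_grad():
        logits = model(input_ids).logits[0, -1]
    top = torch.topk(logits, k)

    results = []
    for token_id in top.indices.tolist():
        # Append the candidate move and look at the model's next prediction
        candidate = torch.cat([input_ids, torch.tensor([[token_id]])], dim=-1)
        with torch.no_grad():
            follow_up = model(candidate).logits[0, -1].argmax().item()
        results.append((tokenizer.decode(token_id), follow_up in end_of_game_ids))
    return results

print(top_candidate_moves("f2f3 e7e5 g2g4"))
```

In a fuller beam search, the per-candidate log-probabilities from `torch.topk` would serve as beam scores, and branches whose follow-up prediction is an end-of-game token can be scored or pruned accordingly.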