---
language:
- en
---

# O->ConBART document simplification system

This is a pretrained version of the document simplification model presented in the Findings of ACL 2023 paper ["Context-Aware Document Simplification"](https://arxiv.org/abs/2305.06274).

The system is based on a modified BART architecture and operates on individual sentences. It is intended to be guided by a document-level simplification planner.

Target reading levels (1-4) should be indicated via a control token prepended to each input sequence. If using the terminal interface, this is handled automatically.

## How to use

It is recommended to use the [plan_simp](https://github.com/liamcripwell/plan_simp/tree/main) library to interface with the model.

Here is how to use this model in PyTorch:

```python
from plan_simp.models.bart import load_simplifier

simplifier, tokenizer, hparams = load_simplifier("liamcripwell/o-conbart")

# dynamic plan-guided generation
from plan_simp.scripts.generate import Launcher

launcher = Launcher()
launcher.dynamic(
    model_ckpt="liamcripwell/o-conbart",
    clf_model_ckpt="liamcripwell/pgdyn-plan",
    **params,
)
```

Generation and evaluation can also be run from the terminal.

```bash
python plan_simp/scripts/generate.py dynamic \
  --clf_model_ckpt=liamcripwell/pgdyn-plan \
  --model_ckpt=liamcripwell/o-conbart \
  --test_file= \
  --doc_id_col=pair_id \
  --context_dir= \
  --reading_lvl=s_level \
  --context_doc_id=c_id \
  --out_file=

python plan_simp/scripts/eval_simp.py \
  --input_data=newselaauto_docs_test.csv \
  --output_data=test_out_oconbart.csv \
  --x_col=complex_str \
  --r_col=simple_str \
  --y_col=pred \
  --doc_id_col=pair_id \
  --prepro=True \
  --sent_level=True
```
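
The keyword arguments collected in `**params` for `launcher.dynamic` correspond to the flags of the `generate.py dynamic` command above. Below is a minimal sketch of such a dict; it assumes the same Newsela-auto test file named in the evaluation example, and the file paths are placeholders to be replaced with your own data and context directory.

```python
# Hypothetical contents of the `params` dict passed to launcher.dynamic(),
# mirroring the terminal flags above. Paths are placeholders.
params = {
    "test_file": "newselaauto_docs_test.csv",  # CSV of input documents (placeholder)
    "doc_id_col": "pair_id",                   # column identifying each document
    "context_dir": "context/",                 # directory of document context representations (placeholder)
    "reading_lvl": "s_level",                  # column holding the target reading level
    "context_doc_id": "c_id",                  # column linking rows to their context document
    "out_file": "test_out_oconbart.csv",       # where predictions are written (placeholder)
}
```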