ppo-LunarLander / results.json
{"mean_reward": 274.785835309866, "std_reward": 22.433743285560215, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-05T02:02:18.996757"}