PPO-LunarLander-v2 / results.json
commit df1d05a
{
  "mean_reward": 267.6326683,
  "std_reward": 19.6925686251785,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2024-02-18T14:02:13.232073"
}
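Results files like this are typically produced by rolling out the policy for `n_eval_episodes` episodes and aggregating the per-episode returns into `mean_reward` and `std_reward`. A minimal sketch of that aggregation step, using hypothetical per-episode returns (the reward values below are illustrative, not the ones behind this file, and no RL library is assumed):

```python
import statistics

def summarize_eval(episode_rewards):
    """Aggregate per-episode returns into the fields seen in results.json."""
    return {
        "mean_reward": statistics.mean(episode_rewards),
        # population std; evaluation tooling commonly reports np.std-style
        # (ddof=0) deviation, though this is an assumption here
        "std_reward": statistics.pstdev(episode_rewards),
        "n_eval_episodes": len(episode_rewards),
    }

# hypothetical returns from 10 deterministic evaluation episodes
rewards = [250.0, 280.0, 260.0, 270.0, 255.0,
           290.0, 265.0, 275.0, 258.0, 272.0]
summary = summarize_eval(rewards)
print(summary["mean_reward"], summary["n_eval_episodes"])  # → 267.5 10
```

With `is_deterministic` evaluation the remaining spread comes only from environment stochasticity (here, LunarLander-v2's randomized initial conditions), which is why `std_reward` is still nonzero.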