PPO-LunarLander-v2 / results.json
oscarb92 · train PPO model for LunarLander-v2 · commit dc6bc88
{
  "mean_reward": 250.13381681323193,
  "std_reward": 42.089300911422235,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-12T10:16:41.246432"
}
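A file with this schema can be produced by evaluating a trained agent over a fixed number of episodes and aggregating the per-episode returns. The sketch below shows one way to assemble it with only the standard library; the `episode_rewards` values are illustrative placeholders, not the actual episodes behind this file, and the use of a population standard deviation is an assumption (it matches NumPy's default `ddof=0`, which evaluation helpers commonly use).

```python
import json
import statistics
from datetime import datetime

# Hypothetical per-episode returns from 10 deterministic evaluation
# episodes (illustrative numbers only).
episode_rewards = [255.0, 301.2, 198.7, 240.5, 263.9,
                   210.3, 289.1, 231.8, 275.6, 247.0]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # Population std (ddof=0), assumed to match how the original
    # evaluation script aggregated rewards.
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now().isoformat(),
}

with open("results.json", "w") as f:
    json.dump(results, f)
```

Loading the file back with `json.load` yields the same flat dictionary of evaluation metrics seen above.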