ppo-LunarLander-v2 / results.json
Upload PPO LunarLander testing agent.
659e1a7
{
  "mean_reward": 209.49682650855092,
  "std_reward": 25.823449770023924,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-04-06T22:31:27.510728"
}
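A results file with this schema can be produced by aggregating per-episode evaluation returns. The sketch below is a minimal, hedged illustration (the `write_results` helper and the sample reward values are hypothetical, not taken from this repository); it uses only the standard library and mirrors the fields shown above:

```python
import json
import statistics
from datetime import datetime

def write_results(episode_rewards, is_deterministic=True, path="results.json"):
    """Aggregate per-episode rewards into the results.json schema shown above.

    Uses the population standard deviation (ddof=0), matching numpy's
    default np.std, which evaluation tooling commonly uses.
    """
    results = {
        "mean_reward": statistics.mean(episode_rewards),
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": is_deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now().isoformat(),
    }
    with open(path, "w") as f:
        json.dump(results, f)
    return results

# Hypothetical returns from 10 deterministic evaluation episodes
rewards = [210.0, 190.0, 230.0, 205.0, 215.0,
           200.0, 220.0, 195.0, 225.0, 208.0]
res = write_results(rewards)
print(res["mean_reward"], res["n_eval_episodes"])  # → 209.8 10
```

In practice the rewards list would come from rolling out the trained policy (e.g. via an evaluation loop over the environment) rather than being hard-coded.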