ppo-LunarLander-v2 / results.json
{"mean_reward": 260.9779023452888, "std_reward": 15.01876639958356, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-07T03:15:21.112308"}