ppo-FrozenLake-v1 / results.json
Retrain PPO model for FrozenLake-v1 v0
d807be1
{
  "mean_reward": 0.8,
  "std_reward": 0.4,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-05-11T14:19:42.053572"
}
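The fields above summarize an evaluation run: `mean_reward` and `std_reward` are the mean and standard deviation of total episode reward over `n_eval_episodes` episodes. Since FrozenLake-v1 yields a reward of 0 or 1 per episode, a mean of 0.8 with std 0.4 is consistent with 8 successes in 10 episodes. A minimal stdlib-only sketch of how such a file could be assembled (the episode rewards below are placeholder values chosen to match these statistics, not the actual evaluation data):

```python
import json
import statistics
from datetime import datetime

def build_results(episode_rewards, deterministic=True):
    """Summarize per-episode rewards into the results.json schema."""
    return {
        "mean_reward": statistics.mean(episode_rewards),
        # Population std (divide by N), matching NumPy's np.std default.
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now().isoformat(),
    }

# Hypothetical outcome: 8 successes out of 10 FrozenLake episodes.
rewards = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
print(json.dumps(build_results(rewards), indent=2))
```

In practice these numbers would come from rolling out the trained PPO policy (e.g. with `stable_baselines3.common.evaluation.evaluate_policy`, which returns the per-episode rewards) rather than from a hand-written list.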