a2c-PandaReachDense-v2 / results.json
{"mean_reward": -0.7458698012400419, "std_reward": 0.18083845133138932, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-05-30T01:21:18.507762"}