ppo-MountainCar-v0 / config.yml
!!python/object/apply:collections.OrderedDict
- - - ent_coef
- 0.0
- - gae_lambda
- 0.98
- - gamma
- 0.99
- - n_envs
- 16
- - n_epochs
- 4
- - n_steps
- 16
- - n_timesteps
- 1000000.0
- - normalize
- true
- - policy
- MlpPolicy
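
These are tuned PPO hyperparameters for the MountainCar-v0 environment, serialized in the style of the RL Baselines3 Zoo; the !!python/object/apply:collections.OrderedDict tag means PyYAML needs its unsafe loader to parse the file. The sketch below is illustrative, not the zoo's own training code: it assumes stable-baselines3 and shows where each key would end up in a hand-rolled script. ent_coef, gae_lambda, gamma, n_epochs, and n_steps map directly onto PPO constructor arguments, while n_envs, n_timesteps, normalize, and policy are zoo-level settings (environment count, training budget, VecNormalize wrapping, and policy class).

    import yaml
    from stable_baselines3 import PPO
    from stable_baselines3.common.env_util import make_vec_env
    from stable_baselines3.common.vec_env import VecNormalize

    # The python/object/apply tag requires PyYAML's unsafe loader.
    with open("config.yml") as f:
        hyperparams = dict(yaml.load(f, Loader=yaml.UnsafeLoader))

    # n_envs: 16 parallel copies of the environment.
    env = make_vec_env("MountainCar-v0", n_envs=hyperparams["n_envs"])
    # normalize: true -> observation/reward normalization via VecNormalize.
    env = VecNormalize(env)

    # With n_steps=16 and 16 envs, each update uses 16 * 16 = 256 transitions.
    model = PPO(
        hyperparams["policy"],          # MlpPolicy
        env,
        n_steps=hyperparams["n_steps"],
        n_epochs=hyperparams["n_epochs"],
        gamma=hyperparams["gamma"],
        gae_lambda=hyperparams["gae_lambda"],
        ent_coef=hyperparams["ent_coef"],
        verbose=1,
    )

    # n_timesteps: total training budget (stored as a float in the file).
    model.learn(total_timesteps=int(hyperparams["n_timesteps"]))

In the zoo itself, a config file like this is written alongside the trained agent so the run can be reproduced with the zoo's own training script; the snippet above only shows how the keys correspond to stable-baselines3 arguments.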