ppo-CarRacing-v0 / benchmarks
PPO playing CarRacing-v0 from https://github.com/sgoodfriend/rl-algo-impls/tree/fbc943f151b95afc4905a67a3835fb6b18c6a5e4