ppo-MountainCar-v0 / environment.yml
PPO playing MountainCar-v0 from https://github.com/sgoodfriend/rl-algo-impls/tree/5598ebc4b03054f16eebe76792486ba7bcacfc5c
name: rl_algo_impls
channels:
- pytorch
- conda-forge
- nodefaults
dependencies:
- python=3.10.*
- mamba
- pip
- poetry
- pytorch
- torchvision
- torchaudio
- cmake
- swig
- ipywidgets
- black
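The environment file above can be materialized with conda (or mamba, which is itself listed as a dependency). A minimal sketch of typical usage, assuming the file is saved locally as `environment.yml`:

```shell
# Create the environment from the file; the env name (rl_algo_impls)
# is taken from the "name:" field in environment.yml.
conda env create -f environment.yml

# Activate it before running anything from the repo.
conda activate rl_algo_impls

# If the file changes later, update the existing env in place,
# removing packages no longer listed (--prune).
conda env update -f environment.yml --prune
```

Since `pip` and `poetry` are included as conda dependencies, any pure-Python packages the repo needs beyond this file would typically be installed afterwards inside the activated environment (e.g. via `poetry install`).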