Upload folder using huggingface_hub
- README.md +3 -0
- ppo_1_epoch.pth +3 -0
README.md ADDED
@@ -0,0 +1,3 @@
+# PPO Agent
+
+This is a PPO agent trained for 1 epoch.
ppo_1_epoch.pth ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e4732fd8567900ce569629f448403cd67bfd72c66d39ac357082068871b8ead3
+size 2343347
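The `ppo_1_epoch.pth` checkpoint is stored as a Git LFS pointer rather than raw weights: three `key value` lines giving the spec version, the object's sha256 digest, and its size in bytes. A minimal sketch of parsing such a pointer (the pointer text is copied from the diff above; the parser is an illustrative helper, not the official git-lfs implementation):

```python
# Parse a Git LFS pointer file into its fields.
# POINTER is the pointer text from the diff above; parse_lfs_pointer
# is a minimal illustrative helper, not part of git-lfs itself.

POINTER = """\
version https://git-lfs.github.com/spec/v1
oid sha256:e4732fd8567900ce569629f448403cd67bfd72c66d39ac357082068871b8ead3
size 2343347
"""

def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of an LFS pointer into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = parse_lfs_pointer(POINTER)
algo, _, digest = pointer["oid"].partition(":")

print(algo)                  # -> sha256
print(int(pointer["size"]))  # -> 2343347 (object size in bytes)
```

After downloading the actual weights, the file can be checked against the pointer by comparing `hashlib.sha256(data).hexdigest()` with `digest`.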