ppo-LunarLander-v2 / config.json

Commit History

Uploaded PPO trained with 1M steps
4135e8c

omiro committed on