Commit History

Increase the num_epochs of PPO to 8
9d105c0

Jialin Yi committed on
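
The `num_epochs` setting raised by this commit controls how many optimisation passes PPO makes over each collected rollout before gathering new experience. A minimal sketch of that inner loop, with illustrative names only (this is not the repo's actual training code):

```python
import random

def ppo_style_update(rollout, num_epochs=8, minibatch_size=2):
    """Sketch of PPO's inner loop: reuse one rollout for several epochs,
    shuffling it into minibatches on each pass. `num_epochs` is the knob
    the commit raises to 8."""
    gradient_steps = 0
    for _ in range(num_epochs):
        batch = rollout[:]          # fresh copy so each epoch reshuffles
        random.shuffle(batch)
        for i in range(0, len(batch), minibatch_size):
            minibatch = batch[i:i + minibatch_size]
            # ...compute the clipped surrogate loss on `minibatch` and
            # step the optimiser here; omitted in this sketch...
            gradient_steps += 1
    return gradient_steps
```

Raising `num_epochs` multiplies the gradient steps squeezed out of each rollout: with 8 transitions and minibatches of 2, 8 epochs yield 32 updates per rollout instead of 16 at 4 epochs.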

Upload PPO LunarLander-v2 trained agent
1ef56c8

Jialin Yi committed on

initial commit
ac6f5f7

anatta-jyi committed on