PPO-LunarLander-v2 / results.json
Commit b274cb5 ("Try No.1")
{"mean_reward": 228.4394443546874, "std_reward": 16.98335874099547, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-07T07:11:16.239015"}