PPO-LunarLander-v2 / results.json
Upload model: PPO-LunarLander-v2, version: 6.000000 (commit 0b0401a)
{"mean_reward": 292.63209789655605, "std_reward": 17.52389699787343, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-04T19:09:56.890891"}