ppo-LunarLander-v2 / results.json
Commit d201b00 ("2m steps") by meln1k
{"mean_reward": 285.9203131038186, "std_reward": 20.128335678563207, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-10T01:24:32.110316"}