ppo-LunarLander-v2 / results.json
Author: meln1k
Commit c6e43b5 ("10m steps")
{
  "mean_reward": 289.2563382642683,
  "std_reward": 18.329229668436163,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-05-10T01:33:50.244956"
}
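The fields above summarize an evaluation run: the agent's return is recorded over `n_eval_episodes` deterministic episodes, and the mean and standard deviation of those returns are written out (in the Hugging Face Deep RL ecosystem this is typically done with an evaluation helper such as stable-baselines3's `evaluate_policy`, though this file does not record which tool was used). A minimal sketch of how those fields are derived from per-episode returns, using made-up illustrative values rather than the actual run's data:

```python
import math

# Hypothetical per-episode returns from 10 deterministic evaluation
# episodes (illustrative values, NOT the actual run's data).
episode_returns = [
    291.4, 265.1, 302.7, 288.9, 310.2,
    276.5, 295.0, 283.3, 299.8, 279.6,
]

n_eval_episodes = len(episode_returns)
mean_reward = sum(episode_returns) / n_eval_episodes

# Population standard deviation (ddof=0), matching numpy's np.std default.
std_reward = math.sqrt(
    sum((r - mean_reward) ** 2 for r in episode_returns) / n_eval_episodes
)

results = {
    "mean_reward": mean_reward,
    "std_reward": std_reward,
    "is_deterministic": True,
    "n_eval_episodes": n_eval_episodes,
}
```

Note that a mean reward above 200 is the conventional "solved" threshold for LunarLander-v2, so the reported 289.26 ± 18.33 indicates the policy reliably solves the task.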