PPO-LunarLander-v2 / results.json
{"mean_reward": 131.3187926655868, "std_reward": 54.4171881820917, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-05T00:12:13.416749"}