ppo-LunarLander-v2 / results.json
{"mean_reward": 259.86118109342135, "std_reward": 18.116557483919696, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-04-05T20:06:33.761200"}