ppo-LunarLander-v2 / results.json
Commit 1b782e1: PPO LLv2 model
{"mean_reward": 265.50472492882557, "std_reward": 13.453559568600122, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-21T02:30:52.360847"}