LunarLander-PPO / results.json
Commit 529088d: uploading a PPO solution to lunar lander
{"mean_reward": 246.18017214131405, "std_reward": 24.245721541837344, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-01-15T17:50:59.204214"}