PPO-LunarLander-v2 / results.json
{"mean_reward": 251.6517435, "std_reward": 11.974906574251014, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-07-06T16:25:23.844631"}