ppo-LunarLander-v2 / results.json
set up ppo baseline (commit 4cfb6a5)
{"mean_reward": 164.1464231207277, "std_reward": 88.84532859406491, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-07T15:04:04.689683"}