MathOctopus-MAPO-DPO-7B-GGUF / MathOctopus-MAPO-DPO-7B.Q2_K.gguf

Commit History

uploaded from rich1
fd49eb6 · verified

mradermacher committed on