Thank you

#1
by davidsyoung - opened

I wanted to say thank you, as the group size of 128 on this quant allows me to run it on 16x3090 with 16k ctx, which is awesome.

Did you ever test 256?

Thanks for your feedback. I have not converted or tested a version with group size 256, but you can convert it step by step: DeepSeek-R1 (FP8) -> BF16 -> AWQ (AutoAWQ, group_size=256).
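For reference, the AWQ step with AutoAWQ would look roughly like the sketch below. The only change from the group-size-128 recipe is `q_group_size`; the checkpoint paths are placeholders, and this exact run is untested here:

```python
# Sketch of the BF16 -> AWQ conversion with group size 256 (assumed recipe, untested).
# Requires: pip install autoawq transformers

quant_config = {
    "zero_point": True,
    "q_group_size": 256,  # 128 was used for the published quant; 256 trades accuracy for memory
    "w_bit": 4,
    "version": "GEMM",
}

# The actual quantization run needs the BF16 checkpoint on disk:
# from awq import AutoAWQForCausalLM
# from transformers import AutoTokenizer
#
# model_path = "path/to/DeepSeek-R1-BF16"      # placeholder path
# quant_path = "path/to/DeepSeek-R1-AWQ-g256"  # placeholder path
#
# model = AutoAWQForCausalLM.from_pretrained(model_path)
# tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
# model.quantize(tokenizer, quant_config=quant_config)
# model.save_quantized(quant_path)
# tokenizer.save_pretrained(quant_path)
```

A larger group size means fewer scale/zero-point parameters per weight matrix, so the quantized checkpoint is slightly smaller, at some cost in accuracy.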

Let me know if you have any questions about it.
