(づ ̄3 ̄)づ╭❤️~
#5 opened by xxx777xxxASD
You and your models are awesome. Are you planning to fine-tune something between 8B and 70B, maybe a 4x8B? It seems like many people want one, but hardly anyone has even tried to build and finetune one. I have no idea what's in your dataset, but your cooking is something else.
Well, you need a good foundation model to do that with -- while it'd be cool to see Sao tackle a Mixtral or a Command-R, models in that size range just tend to be a bit more niche.
(that being said m8 if you did wanna do a Mixtral or a Command-R that'd be pretty sick just sayin')
(EDIT: I'm dumb. Sao's done a couple Mixtrals.)
maybe Gemma 27B