For those trying to shoehorn this large model onto their machine, every GB of saved memory counts when offloading to system RAM! Here is the 22.2-billion-parameter model pruned down by 4 junk layers to make a 19B model that doesn't appear to lose any noticeable quality. https://huggingface.co/mistralai/Codestral-22B-v0.1
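For anyone curious what this kind of layer pruning looks like in practice, here is a minimal sketch using the Hugging Face `transformers` API. The layer indices and output path below are placeholders I picked for illustration, not the actual layers removed from the 19B checkpoint, and this is not necessarily how that checkpoint was produced.

```python
# Sketch: drop a handful of decoder layers from Codestral-22B and save the
# smaller checkpoint. LAYERS_TO_DROP is hypothetical -- the post doesn't say
# which 4 layers were pruned.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Codestral-22B-v0.1"
LAYERS_TO_DROP = {20, 21, 22, 23}  # placeholder indices, not the real ones

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="cpu"
)

# Mistral-style models expose their decoder blocks as model.model.layers
# (a torch.nn.ModuleList); keep every block we aren't dropping.
kept = [
    layer for i, layer in enumerate(model.model.layers)
    if i not in LAYERS_TO_DROP
]
model.model.layers = torch.nn.ModuleList(kept)
model.config.num_hidden_layers = len(kept)

# Depending on the transformers version, attention modules remember their
# layer index for KV caching; re-number them so there are no gaps.
for new_idx, layer in enumerate(kept):
    if hasattr(layer.self_attn, "layer_idx"):
        layer.self_attn.layer_idx = new_idx

# Save the pruned model so it can be loaded (or quantized) like any other.
model.save_pretrained("Codestral-pruned-19B")
AutoTokenizer.from_pretrained(MODEL_ID).save_pretrained("Codestral-pruned-19B")
```

After this, the pruned folder loads with the usual `from_pretrained` call and should offload to system RAM with a few GB less pressure than the full 22B weights.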