
Just a version of https://huggingface.co/mistralai/Mistral-7B-v0.1 that has been re-saved with 2GB shards to reduce the RAM required when loading. It was created as follows:

```python
import torch
from transformers import AutoModelForCausalLM, MistralForCausalLM

# Load the original checkpoint in float16, spreading weights across available
# devices and offloading anything that does not fit.
original_base_model: MistralForCausalLM = AutoModelForCausalLM.from_pretrained(
    pretrained_model_name_or_path="mistralai/Mistral-7B-v0.1",
    device_map="auto",
    torch_dtype=torch.float16,
    offload_folder="offload",
    trust_remote_code=True,
    low_cpu_mem_usage=True,
)

# Re-save the weights split into shards of at most 2GB and push them to the Hub.
original_base_model.save_pretrained(
    save_directory="kkboy1/Mistral-7B-v0.1-sharded",
    max_shard_size="2GB",
    push_to_hub=True,
)
```
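The sharded checkpoint can then be loaded like any other `transformers` model. A minimal usage sketch (the `device_map`/dtype settings are illustrative, and it assumes the tokenizer files were pushed alongside the weights; otherwise load the tokenizer from the original `mistralai/Mistral-7B-v0.1` repo):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Loading from 2GB shards keeps peak RAM lower than loading one large file.
model = AutoModelForCausalLM.from_pretrained(
    "kkboy1/Mistral-7B-v0.1-sharded",
    device_map="auto",
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
)
tokenizer = AutoTokenizer.from_pretrained("kkboy1/Mistral-7B-v0.1-sharded")

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```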
