Inferentia only: this Neuron-compiled model can only be loaded and run on AWS Inferentia (Neuron) devices. The snippet below shows how the checkpoint was exported to the Neuron format with optimum-neuron.
from optimum.neuron import NeuronCLIPModel

# Static input shapes used at compilation time (Neuron requires fixed shapes)
input_shapes = {
    "text_batch_size": 2,
    "sequence_length": 77,
    "image_batch_size": 1,
    "num_channels": 3,
    "width": 224,
    "height": 224,
}
# Auto-cast matrix multiplications to bfloat16 for faster inference
compiler_args = {"auto_cast": "matmul", "auto_cast_type": "bf16"}

neuron_model = NeuronCLIPModel.from_pretrained(
    "openai/clip-vit-base-patch32",
    export=True,
    **input_shapes,
    **compiler_args,
)

# Save locally
neuron_model.save_pretrained("clip_feature_extraction_neuronx/")

# Upload to the Hugging Face Hub
neuron_model.push_to_hub(
    "clip_feature_extraction_neuronx/",
    repository_id="optimum/clip-vit-base-patch32-neuronx",  # Replace with your HF Hub repo id
)