Anyone able to deploy an inference endpoint on SageMaker?

#71
by TeoGX - opened

Unable to deploy an inference endpoint on SageMaker with the suggested script:

```python
import json
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()

# Hub Model configuration. https://huggingface.co/models
hub = {
    'HF_MODEL_ID': 'Qwen/Qwen2-VL-7B-Instruct',
    'SM_NUM_GPUS': json.dumps(1)
}

# create Hugging Face Model Class
huggingface_model = HuggingFaceModel(
    image_uri=get_huggingface_llm_image_uri("huggingface", version="2.3.1"),
    env=hub,
    role=role,
)

# deploy model to SageMaker Inference
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.8xlarge",
    container_startup_health_check_timeout=300,
)
```
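Once the endpoint is up, a request payload for a vision-language model like Qwen2-VL can be built along these lines. This is a sketch only: the OpenAI-style `messages` schema is an assumption about what the TGI container accepts, so verify the field names against the container version you deploy.

```python
# Sketch of a chat-style request payload for the deployed endpoint.
# The "messages" schema (OpenAI-compatible) is an assumption about the
# TGI container's API; check the container docs for your version.
def build_payload(prompt: str, image_url: str) -> dict:
    return {
        "messages": [
            {
                "role": "user",
                "content": [
                    # image first, then the text question about it
                    {"type": "image_url", "image_url": {"url": image_url}},
                    {"type": "text", "text": prompt},
                ],
            }
        ],
        "max_tokens": 256,
    }

payload = build_payload("Describe this image.", "https://example.com/cat.png")
# The payload would then be sent with predictor.predict(payload)
```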
