# Salesforce/codegen2-1B

This is the Salesforce/codegen2-1B model converted to OpenVINO IR format for accelerated inference.

An example of how to run inference with this model:

```python
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("helenai/Salesforce-codegen2-1B-ov")
model = OVModelForCausalLM.from_pretrained("helenai/Salesforce-codegen2-1B-ov")

# Try the version with quantized model weights by changing the line above to:
# model = OVModelForCausalLM.from_pretrained("helenai/Salesforce-codegen2-1B-ov", revision="compressed_weights")

text = "def hello_world():"
input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_ids = model.generate(input_ids, max_length=128)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```