LeTI: Learning to Generate from Textual Interactions

Official model checkpoints for the paper LeTI: Learning to Generate from Textual Interactions.

The code associated with these checkpoints can be found here.

Decompressing the model checkpoint archive data.tar.gz (e.g., with tar -xzvf data.tar.gz) creates a folder data. After decompressing the checkpoints, set export GS_BUCKET_PREFIX=$(pwd) and run the evaluation of your choice following this.
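A minimal shell sketch of the setup steps above, assuming data.tar.gz has already been downloaded into the current working directory:

```bash
# Extract the checkpoints; this creates a ./data folder.
tar -xzvf data.tar.gz

# Point the evaluation code (linked above) at the extracted checkpoints.
export GS_BUCKET_PREFIX=$(pwd)

# Then run the evaluation of your choice following the linked instructions.
```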

We currently release the following checkpoints from the paper (a sketch of where they land on disk follows the list):

  • LeTI (350M): mbpp-ft/actor-rw-conditioned/codegen-350M-mono/350M+rw_conditioned+mixpretrain+50x3+lr1e-5/checkpoint_31416
  • LeTI (2B): mbpp-ft/actor-rw-conditioned/codegen-2B-mono/2B+rw_conditioned+mixpretrain+10x3+lr5e-6/checkpoint_13465
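As a hypothetical check, assuming the listed paths are relative to the extracted data folder, the checkpoints should be visible at:

```bash
# Verify the extracted checkpoint directories (paths quoted because they contain '+').
ls "data/mbpp-ft/actor-rw-conditioned/codegen-350M-mono/350M+rw_conditioned+mixpretrain+50x3+lr1e-5/checkpoint_31416"
ls "data/mbpp-ft/actor-rw-conditioned/codegen-2B-mono/2B+rw_conditioned+mixpretrain+10x3+lr5e-6/checkpoint_13465"
```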