rail-berkeley committed
Commit a440bb5
1 Parent(s): 02bd988

Update README.md

Files changed (1)
  1. README.md +5 -1
README.md CHANGED
@@ -1,3 +1,7 @@
+ ---
+ license: mit
+ pipeline_tag: robotics
+ ---
  # Octo Small
  This model is trained with a window size of 2, predicting 7-dimensional actions 4 steps into the future using a diffusion policy. The model is a Transformer with 27M parameters (equivalent to a ViT-S). Images are tokenized by preprocessing with a lightweight convolutional encoder, then grouped into 16x16 patches. Language is tokenized by applying the T5 tokenizer, and then applying the T5-Base language encoder.

@@ -27,7 +31,7 @@ Tasks:
  At inference, you may pass in any subset of these observation and task keys, with a history window up to 2 timesteps.


- This model was trained on a mix of datasets from the Open X-Embodiment dataset
+ This model was trained on a mix of datasets from the Open X-Embodiment dataset.

  | Dataset | Proportion of batch |
  |------------------------------------------------------------|---------------------|
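For context on the inference interface described in the README above, here is a minimal sketch of loading this checkpoint and sampling one action chunk. The import path, the `OctoModel.load_pretrained`, `create_tasks`, and `sample_actions` calls, the `hf://rail-berkeley/octo-small` identifier, and the observation key names and image resolution are assumptions drawn from the public Octo codebase, not from anything in this commit.

```python
# Hedged sketch: load Octo Small and sample one action chunk.
# All API names, observation keys, and shapes below are assumptions based on
# the public Octo codebase, not on this commit.
import jax
import numpy as np
from octo.model.octo_model import OctoModel  # assumed import path

# Load the pretrained checkpoint from the Hugging Face Hub (assumed identifier).
model = OctoModel.load_pretrained("hf://rail-berkeley/octo-small")

# Any subset of observation/task keys may be passed, with a history window of up to 2.
observation = {
    # (batch, window, height, width, channels); the 256x256 resolution is an assumption.
    "image_primary": np.zeros((1, 2, 256, 256, 3), dtype=np.uint8),
    # Marks both history timesteps as valid (not padding).
    "timestep_pad_mask": np.array([[True, True]]),
}

# Language-conditioned task built from a free-form instruction.
task = model.create_tasks(texts=["pick up the spoon"])

# Samples a chunk of 4 future 7-dimensional actions via the diffusion policy head.
actions = model.sample_actions(observation, task, rng=jax.random.PRNGKey(0))
print(actions.shape)  # expected to be roughly (batch, 4, 7)
```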