HaoyiZhu committed
Commit 7026c7f
1 Parent(s): 4e0039c

Initial commit

Files changed (1)
README.md +49 -3
README.md CHANGED
@@ -1,3 +1,49 @@
- ---
- license: mit
- ---
+ ---
+ license: mit
+ tags:
+ - embodied-ai
+ - representation-learning
+ - spatial awareness
+ - spatial intelligence
+ ---
+
+ # Model Card for SPA: 3D Spatial-Awareness Enables Effective Embodied Representation
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+ Pre-trained checkpoints of [SPA](https://haoyizhu.github.io/spa/).
+
+ SPA is a novel representation learning framework that emphasizes the importance of 3D spatial awareness in embodied AI.
+ It leverages differentiable neural rendering on multi-view images to endow a vanilla Vision Transformer (ViT) with
+ intrinsic spatial understanding. We also present the most comprehensive evaluation of embodied representation learning to date,
+ covering 268 tasks across 8 simulators with diverse policies in both single-task and language-conditioned multi-task scenarios.
+
+ ## Model Details
+
+ ### Model Description
+
+ <!-- Provide a longer summary of what this model is. -->
+
+
+
+ - **Developed by:** [Haoyi Zhu](https://www.haoyizhu.site/)
+ - **Model type:** Embodied AI Representation Learning
+ - **Encoder (Backbone) type:** Vision Transformer (ViT)
+
+ ### Model Sources
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Repository:** [https://github.com/HaoyiZhu/SPA](https://github.com/HaoyiZhu/SPA)
+ - **Paper:** [arXiv (Coming Soon)]()
+ - **Project Page:** [https://haoyizhu.github.io/spa/](https://haoyizhu.github.io/spa/)
+
+ ## Citation
+ ```bib
+ @article{zhu2024spa,
+ title = {SPA: 3D Spatial-Awareness Enables Effective Embodied Representation},
+ author = {Zhu, Haoyi and Yang, Honghui and Wang, Yating and Yang, Jiange and Wang, Limin and He, Tong},
+ journal = {arXiv preprint},
+ year = {2024},
+ }
+ ```
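
The card links the checkpoints and names the backbone (a vanilla ViT) but does not yet include a loading snippet. Below is a minimal sketch, assuming the weights are released as a plain PyTorch state dict for a ViT-Base/16 encoder hosted on this Hub repo; the repo id, filename, and checkpoint key layout are placeholder assumptions, not confirmed by the card or this commit.

```python
# Minimal loading sketch. Assumptions (not confirmed by the model card):
# - the checkpoint is a plain PyTorch state dict for a ViT-Base/16 backbone,
# - it is hosted on this Hub repo under the placeholder names used below.
import timm
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="HaoyiZhu/SPA",       # placeholder repo id
    filename="spa_vit_base.pth",  # placeholder filename
)

# Vanilla ViT encoder; num_classes=0 removes the classification head so the
# model outputs features, matching the representation-learning use case.
backbone = timm.create_model("vit_base_patch16_224", pretrained=False, num_classes=0)

state_dict = torch.load(ckpt_path, map_location="cpu")
state_dict = state_dict.get("state_dict", state_dict)  # unwrap if nested

# strict=False because the released key names may not match timm's ViT naming.
missing, unexpected = backbone.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```

Printing the missing/unexpected key counts shows whether the released checkpoint actually lines up with timm's ViT naming before the features are used downstream; the official loading path in the linked GitHub repository should be preferred once documented.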