narugo committed
Commit 6ba364c · verified · 1 Parent(s): f871e65

Export model 'openai/clip-vit-large-patch14', on 2025-01-29 03:25:55 JST
README.md CHANGED
@@ -3,6 +3,7 @@ pipeline_tag: zero-shot-classification
  base_model:
  - openai/clip-vit-base-patch16
  - openai/clip-vit-base-patch32
+ - openai/clip-vit-large-patch14
  language:
  - en
  tags:
@@ -17,10 +18,11 @@ ONNX exported version of CLIP models.

  # Models

- 2 models exported in total.
+ 3 models exported in total.

- | Name | Image (Params/FLOPS) | Image Size | Image Width (Enc/Emb) | Text (Params/FLOPS) | Text Width (Enc/Emb) | Created At |
- |:------------------------------------------------------------------------------------|:-----------------------|-------------:|:------------------------|:----------------------|:-----------------------|:-------------|
- | [openai/clip-vit-base-patch16](https://huggingface.co/openai/clip-vit-base-patch16) | 85.6M / 16.9G | 224 | 768 / 512 | 37.8M / 529.2M | 512 / 512 | 2022-03-03 |
- | [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32) | 87.4M / 4.4G | 224 | 768 / 512 | 37.8M / 529.2M | 512 / 512 | 2022-03-03 |
+ | Name | Image (Params/FLOPS) | Image Size | Image Width (Enc/Emb) | Text (Params/FLOPS) | Text Width (Enc/Emb) | Created At |
+ |:--------------------------------------------------------------------------------------|:-----------------------|-------------:|:------------------------|:----------------------|:-----------------------|:-------------|
+ | [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14) | 302.9M / 77.8G | 224 | 1024 / 768 | 85.1M / 1.2G | 768 / 768 | 2022-03-03 |
+ | [openai/clip-vit-base-patch16](https://huggingface.co/openai/clip-vit-base-patch16) | 85.6M / 16.9G | 224 | 768 / 512 | 37.8M / 529.2M | 512 / 512 | 2022-03-03 |
+ | [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32) | 87.4M / 4.4G | 224 | 768 / 512 | 37.8M / 529.2M | 512 / 512 | 2022-03-03 |
models.parquet CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:654807b800475bff877b4b1ab225d20be8c7ff24f9996b85ec2bfaa338823528
- size 8468
+ oid sha256:ba5e4513b5196ffe8bd2c6269b626e30e2ee63169fced9b14ec5900831252b3b
+ size 8577
openai/clip-vit-large-patch14/image_encode.onnx ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b15a7dbd19afefb67ee5cc299e10140193c959507a381d17fcbed31f345fdbef
+ size 1216363788
openai/clip-vit-large-patch14/meta.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2d176a8eb5ccd85c5cab73af4d28b83ec3442048774b3289c4399aa28dd7a18c
+ size 449
openai/clip-vit-large-patch14/preprocessor.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7205c8fb09a2889e5abac115d76f9a949626ef911417c18a848d7aa81f34820d
+ size 826
openai/clip-vit-large-patch14/text_encode.onnx ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:16a656e1f17390c75112babfeb3963077865d0c66d2916d7be87040db428ab1c
+ size 494879632
openai/clip-vit-large-patch14/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:77a1af6b322d9feab24f4b2365db74b999b7a3117c855da8a4626fb20b87221f
+ size 3642202
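Every weight and config file in this commit is stored as a Git LFS pointer: the actual bytes live in LFS storage, and the repository tracks only the `version`/`oid`/`size` triple shown in each hunk above. As an illustration (not part of the repository itself), here is a minimal sketch of parsing such a pointer and checking downloaded bytes against it; the two helper functions are hypothetical names, not an existing API:

```python
import hashlib


def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


def matches_pointer(data: bytes, pointer: dict) -> bool:
    """Verify raw file bytes against the oid and size recorded in a pointer."""
    expected = pointer["oid"].removeprefix("sha256:")
    return (len(data) == int(pointer["size"])
            and hashlib.sha256(data).hexdigest() == expected)


# The pointer for models.parquet, copied from the diff above.
pointer_text = """\
version https://git-lfs.github.com/spec/v1
oid sha256:ba5e4513b5196ffe8bd2c6269b626e30e2ee63169fced9b14ec5900831252b3b
size 8577
"""

pointer = parse_lfs_pointer(pointer_text)
print(pointer["size"])  # prints 8577
```

The `oid` is a plain SHA-256 of the file contents, so a fetched file can be verified offline with nothing beyond the standard library.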