laion/CLIP-ViT-g-14-laion2B-s12B-b42K
Tags: OpenCLIP · PyTorch · Safetensors · clip
arXiv: 1910.04867
License: mit
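
Since the repository is tagged OpenCLIP, the checkpoint is typically loaded through the open_clip library. A minimal loading sketch, assuming an open_clip version with Hub support is installed and using a hypothetical local image "cat.jpg" purely for illustration:

```python
import torch
from PIL import Image
import open_clip

# The "hf-hub:" prefix tells open_clip to pull the weights from this repository.
model, _, preprocess = open_clip.create_model_and_transforms(
    "hf-hub:laion/CLIP-ViT-g-14-laion2B-s12B-b42K"
)
tokenizer = open_clip.get_tokenizer("hf-hub:laion/CLIP-ViT-g-14-laion2B-s12B-b42K")

image = preprocess(Image.open("cat.jpg")).unsqueeze(0)      # hypothetical local image
text = tokenizer(["a photo of a cat", "a photo of a dog"])  # tokenized prompts

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # L2-normalize before computing cosine similarities
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)  # similarity of the image to each prompt
```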
Files and versions
4 contributors · History: 4 commits
Contributors: rwightman (HF staff), mishig (HF staff)
Latest commit: b36bdd3 · Add widget example input (#1) · about 2 years ago
File                          Safety          Size       Last commit                                           Updated
.gitattributes                Safe            1.38 kB    initial commit                                        over 2 years ago
README.md                     Safe            7.42 kB    Add widget example input (#1)                         about 2 years ago
config.json                   Safe            4.61 kB    Add bin files                                         over 2 years ago
open_clip_pytorch_model.bin   Safe (pickle)   5.47 GB    Add bin files (LFS)                                   over 2 years ago
preprocessor_config.json      Safe            316 Bytes  Update README add tokenizer/vocab/preprocessor cfg    over 2 years ago
pytorch_model.bin             Safe (pickle)   5.47 GB    Add bin files (LFS)                                   over 2 years ago
special_tokens_map.json       Safe            389 Bytes  Update README add tokenizer/vocab/preprocessor cfg    over 2 years ago
tokenizer.json                Safe            2.22 MB    Update README add tokenizer/vocab/preprocessor cfg    over 2 years ago
tokenizer_config.json         Safe            568 Bytes  Update README add tokenizer/vocab/preprocessor cfg    over 2 years ago
vocab.json                    Safe            862 kB     Update README add tokenizer/vocab/preprocessor cfg    over 2 years ago

Detected pickle imports:
  open_clip_pytorch_model.bin (3): torch.FloatStorage, torch._utils._rebuild_tensor_v2, collections.OrderedDict
  pytorch_model.bin (4): torch.FloatStorage, torch._utils._rebuild_tensor_v2, torch.LongStorage, collections.OrderedDict
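
The two checkpoints are large LFS objects, so a single file can be fetched without cloning the whole repository. A minimal sketch using huggingface_hub's hf_hub_download, assuming only the OpenCLIP weights are needed:

```python
from huggingface_hub import hf_hub_download

# Downloads one file from the repo and returns its path in the local cache.
ckpt_path = hf_hub_download(
    repo_id="laion/CLIP-ViT-g-14-laion2B-s12B-b42K",
    filename="open_clip_pytorch_model.bin",
)
print(ckpt_path)
```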