# sangmin6600/t5-v1_1-xl-ko-chat

Text2Text Generation · Transformers · PyTorch · Safetensors · Korean · t5 · text-generation-inference · Inference Endpoints

License: apache-2.0
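
The tags describe a Korean chat model built on T5 v1.1 XL, usable through the standard transformers text2text API. Below is a minimal loading sketch; the model class matches the repo's "Upload T5ForConditionalGeneration" commits, while the dtype, device placement, prompt, and generation settings are illustrative assumptions not documented by the repository.

```python
# Minimal sketch for loading sangmin6600/t5-v1_1-xl-ko-chat.
# The model class matches the repo's "Upload T5ForConditionalGeneration" commits;
# dtype, device placement, prompt, and generation settings are assumptions.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "sangmin6600/t5-v1_1-xl-ko-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~11 GB of fp32 shards; fp16 roughly halves memory
    device_map="auto",          # needs `accelerate`; spreads shards across devices
)

prompt = "안녕하세요, 오늘 기분이 어때요?"  # "Hello, how are you feeling today?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`device_map="auto"` is optional; on a single GPU with enough memory, `model.to("cuda")` works as well.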
## Files and versions

Branch: main · 1 contributor · History: 9 commits

Latest commit ba0d3cd by sangmin6600: "Update config.json" (12 months ago)
| File | Size | Last commit | When |
| --- | --- | --- | --- |
| .gitattributes | 1.52 kB | initial commit | 12 months ago |
| README.md | 77 Bytes | Create README.md | 12 months ago |
| config.json | 804 Bytes | Update config.json | 12 months ago |
| generation_config.json | 233 Bytes | Upload T5ForConditionalGeneration | 12 months ago |
| model-00001-of-00003.safetensors | 4.99 GB (LFS) | Upload T5ForConditionalGeneration | 12 months ago |
| model-00002-of-00003.safetensors | 4.97 GB (LFS) | Upload T5ForConditionalGeneration | 12 months ago |
| model-00003-of-00003.safetensors | 1.43 GB (LFS) | Upload T5ForConditionalGeneration | 12 months ago |
| model.safetensors.index.json | 50.6 kB | Upload T5ForConditionalGeneration | 12 months ago |
| pytorch_model-00001-of-00003.bin | 4.99 GB (LFS, pickle) | Upload T5ForConditionalGeneration | 12 months ago |
| pytorch_model-00002-of-00003.bin | 4.98 GB (LFS, pickle) | Upload T5ForConditionalGeneration | 12 months ago |
| pytorch_model-00003-of-00003.bin | 1.43 GB (LFS, pickle) | Upload T5ForConditionalGeneration | 12 months ago |
| pytorch_model.bin.index.json | 50.8 kB | Upload T5ForConditionalGeneration | 12 months ago |
| special_tokens_map.json | 2.54 kB | Upload tokenizer | 12 months ago |
| tokenizer.json | 1.94 MB | Upload tokenizer | 12 months ago |
| tokenizer_config.json | 20.8 kB | Upload tokenizer | 12 months ago |

All files are flagged "Safe" by Hugging Face's scanner, and the large shards are stored via Git LFS. The three pytorch_model-*.bin shards are pickle-based checkpoints; the scanner reports three detected pickle imports in each: collections.OrderedDict, torch.FloatStorage, and torch._utils._rebuild_tensor_v2.
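
Because the repository ships both safetensors and pickle-based .bin shards, recent transformers versions prefer the safetensors files when both are present; pinning that preference explicitly guards against a silent fallback to pickle. A sketch, assuming the standard `use_safetensors` flag of `from_pretrained`:

```python
# Sketch: force safetensors and refuse the pickle-based .bin shards.
# use_safetensors=True makes from_pretrained raise if the safetensors
# weights were missing, rather than falling back to the pickle files,
# which in general can execute arbitrary code when unpickled.
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "sangmin6600/t5-v1_1-xl-ko-chat",
    use_safetensors=True,
)
```

The three pickle imports listed above are exactly what an ordinary torch.save state dict produces, which is consistent with the scanner marking the .bin shards "Safe".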