openphonemizer/autoreg-ckpt
Branch: main
autoreg-ckpt · 1 contributor · History: 2 commits
Latest commit: mrfakename · Upload folder using huggingface_hub · 4a940e9 · verified · 9 months ago
.gitattributes · Safe · 1.52 kB · initial commit · 9 months ago
best_model.pt · pickle · 286 MB · LFS · Upload folder using huggingface_hub · 9 months ago
  Detected Pickle imports (7): "torch._utils._rebuild_tensor_v2", "torch.FloatStorage", "collections.OrderedDict", "dp.preprocessing.text.Preprocessor", "__builtin__.set", "dp.preprocessing.text.LanguageTokenizer", "dp.preprocessing.text.SequenceTokenizer"
best_model_no_optim.pt · pickle · 117 MB · LFS · Upload folder using huggingface_hub · 9 months ago
  Detected Pickle imports (7): "torch._utils._rebuild_tensor_v2", "dp.preprocessing.text.LanguageTokenizer", "dp.preprocessing.text.SequenceTokenizer", "__builtin__.set", "collections.OrderedDict", "dp.preprocessing.text.Preprocessor", "torch.FloatStorage"
latest_model.pt · pickle · 286 MB · LFS · Upload folder using huggingface_hub · 9 months ago
  Detected Pickle imports (7): "torch.FloatStorage", "__builtin__.set", "dp.preprocessing.text.LanguageTokenizer", "torch._utils._rebuild_tensor_v2", "dp.preprocessing.text.SequenceTokenizer", "dp.preprocessing.text.Preprocessor", "collections.OrderedDict"
model_step_10k.pt · pickle · 286 MB · LFS · Upload folder using huggingface_hub · 9 months ago
  Detected Pickle imports (7): "dp.preprocessing.text.Preprocessor", "dp.preprocessing.text.LanguageTokenizer", "collections.OrderedDict", "torch.FloatStorage", "__builtin__.set", "torch._utils._rebuild_tensor_v2", "dp.preprocessing.text.SequenceTokenizer"
model_step_20k.pt · pickle · 286 MB · LFS · Upload folder using huggingface_hub · 9 months ago
  Detected Pickle imports (7): "torch._utils._rebuild_tensor_v2", "torch.FloatStorage", "collections.OrderedDict", "dp.preprocessing.text.Preprocessor", "__builtin__.set", "dp.preprocessing.text.LanguageTokenizer", "dp.preprocessing.text.SequenceTokenizer"
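
The dp.preprocessing.text classes in the pickle scan indicate these checkpoints were produced with the DeepPhonemizer (dp) library. Below is a minimal sketch of loading one of them for inference, assuming the .pt files follow DeepPhonemizer's standard checkpoint format; the chosen filename, device, language tag, and example text are illustrative assumptions, not documented in this repository.

```python
# Minimal sketch: loading a checkpoint from this repo with DeepPhonemizer
# (pip install deep-phonemizer). Assumes the .pt files follow DeepPhonemizer's
# standard checkpoint format; filename, device, language tag, and example text
# are illustrative assumptions.
from dp.phonemizer import Phonemizer

# These checkpoints are pickle-based, so loading them unpickles the
# dp.preprocessing.text classes listed above. Only load files you trust.
phonemizer = Phonemizer.from_checkpoint("best_model_no_optim.pt", device="cpu")

# Convert text to phonemes (the "en_us" language tag is an assumption).
result = phonemizer("autoregressive phonemizer checkpoint", lang="en_us")
print(result)
```

Note that best_model_no_optim.pt is less than half the size of the other model files (117 MB vs. 286 MB), consistent with the optimizer state having been stripped, so it is presumably the checkpoint intended for inference rather than for resuming training.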