mjwong/whisper-large-v3-singlish

Tags: Automatic Speech Recognition · Transformers · Safetensors · English · whisper · Eval Results
License: apache-2.0
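Since the repo is tagged for automatic speech recognition with the Transformers library, a minimal sketch of transcribing an audio file with this checkpoint is shown below. It assumes the standard `transformers` ASR pipeline works for this Whisper fine-tune (as it does for the base `whisper-large-v3`); `sample.wav` is a placeholder path, not a file shipped with the repo.

```python
# Minimal sketch: transcribe one audio file with this checkpoint via the
# transformers ASR pipeline. "sample.wav" is a placeholder path.
from transformers import pipeline


def transcribe(audio_path: str, device: str = "cpu") -> str:
    """Load the checkpoint and return the transcription of one audio file."""
    asr = pipeline(
        "automatic-speech-recognition",
        model="mjwong/whisper-large-v3-singlish",
        device=device,  # e.g. "cuda:0" for a GPU
    )
    return asr(audio_path)["text"]


# Example usage (downloads ~6 GB of weights on first call):
# print(transcribe("sample.wav"))
```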
1 contributor · History: 3 commits
Latest commit: 67ff69c (verified) by mjwong: "Update README.md", 12 days ago
File                              Size                   Last commit message                  Age
.gitattributes                    1.52 kB                initial commit                       14 days ago
README.md                         6.88 kB                Update README.md                     12 days ago
added_tokens.json                 34.6 kB                Upload model weights and tokenizer   13 days ago
config.json                       1.28 kB                Upload model weights and tokenizer   13 days ago
generation_config.json            3.9 kB                 Upload model weights and tokenizer   13 days ago
merges.txt                        494 kB                 Upload model weights and tokenizer   13 days ago
model-00001-of-00002.safetensors  4.99 GB (LFS)          Upload model weights and tokenizer   13 days ago
model-00002-of-00002.safetensors  1.18 GB (LFS)          Upload model weights and tokenizer   13 days ago
model.safetensors.index.json      112 kB                 Upload model weights and tokenizer   13 days ago
normalizer.json                   52.7 kB                Upload model weights and tokenizer   13 days ago
preprocessor_config.json          340 Bytes              Upload model weights and tokenizer   13 days ago
special_tokens_map.json           2.19 kB                Upload model weights and tokenizer   13 days ago
tokenizer_config.json             283 kB                 Upload model weights and tokenizer   13 days ago
training_args.bin                 5.43 kB (LFS, pickle)  Upload model weights and tokenizer   13 days ago
vocab.json                        1.04 MB                Upload model weights and tokenizer   13 days ago

All files above are marked Safe by the Hub's scanner except training_args.bin, which is a pickle. Detected Pickle imports (10): torch.device, transformers.training_args_seq2seq.Seq2SeqTrainingArguments, transformers.trainer_utils.SchedulerType, transformers.trainer_utils.HubStrategy, accelerate.state.PartialState, accelerate.utils.dataclasses.DistributedType, transformers.training_args.OptimizerNames, transformers.trainer_utils.SaveStrategy, transformers.trainer_pt_utils.AcceleratorConfig, transformers.trainer_utils.IntervalStrategy
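The pickle-import scan on training_args.bin lists the classes the file references; unpickling such a file executes arbitrary code, so it is worth being able to reproduce that list without loading it. A minimal sketch using only stdlib `pickletools` opcode scanning is below; it is a heuristic for GLOBAL/STACK_GLOBAL opcodes, not a full security scanner, and the zip-extraction snippet at the bottom assumes the usual `torch.save` archive layout (pickle payload in a member ending in `data.pkl`).

```python
# Sketch: list the dotted names of globals a pickle stream references,
# without ever unpickling it. Pure stdlib; heuristic, not a full scanner.
import pickletools


def list_pickle_globals(data: bytes) -> list[str]:
    """Return dotted names referenced via GLOBAL/STACK_GLOBAL opcodes."""
    found = []
    strings = []  # recent string args; STACK_GLOBAL consumes module + name
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # GLOBAL's argument is "module name"; join with a dot.
            found.append(arg.replace(" ", "."))
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # STACK_GLOBAL pops module and name pushed as string opcodes.
            found.append(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return found


# training_args.bin is written by torch.save, which (since PyTorch 1.6)
# produces a zip archive; the pickle payload is the "data.pkl" member:
#
#   import zipfile
#   with zipfile.ZipFile("training_args.bin") as zf:
#       name = next(n for n in zf.namelist() if n.endswith("data.pkl"))
#       print(list_pickle_globals(zf.read(name)))
```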