jaggernaut007/roberta-large-finetuned-abbr-finetuned-ner
Token Classification · Transformers · Safetensors · roberta · Generated from Trainer · Inference Endpoints · License: mit
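The tags above mark this as a RoBERTa token-classification checkpoint, so it can be loaded directly through the transformers pipeline API. Below is a minimal usage sketch, assuming transformers and torch are installed and the checkpoint is publicly downloadable; the example sentence is made up, and the exact label set depends on the fine-tuning data (the repo name suggests abbreviation tagging).

    # Load the checkpoint as a token-classification pipeline.
    from transformers import pipeline

    ner = pipeline(
        "token-classification",
        model="jaggernaut007/roberta-large-finetuned-abbr-finetuned-ner",
        aggregation_strategy="simple",  # merge word-piece tokens into whole spans
    )
    print(ner("Magnetic resonance imaging (MRI) confirmed the diagnosis."))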
roberta-large-finetuned-abbr-finetuned-ner (branch: main)
1 contributor · History: 4 commits
Latest commit: "End of training" by jaggernaut007, 183aeb1 (verified), 11 months ago
File                      Scan    Size           Commit message   Age
.gitattributes            Safe    1.52 kB        initial commit   about 1 year ago
README.md                 Safe    1.36 kB        Model save       11 months ago
config.json               Safe    905 Bytes      Model save       11 months ago
model.safetensors         Safe    1.42 GB (LFS)  Model save       11 months ago
special_tokens_map.json   Safe    125 Bytes      Model save       11 months ago
tokenizer.json            Safe    576 kB         Model save       11 months ago
tokenizer_config.json     Safe    1.14 kB        Model save       11 months ago
training_args.bin         pickle  4.73 kB (LFS)  Model save       11 months ago

Detected pickle imports in training_args.bin (8):
- transformers.trainer_utils.HubStrategy
- transformers.training_args.OptimizerNames
- transformers.training_args.TrainingArguments
- accelerate.state.PartialState
- transformers.trainer_utils.IntervalStrategy
- transformers.trainer_utils.SchedulerType
- accelerate.utils.dataclasses.DistributedType
- torch.device
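training_args.bin is not a weights file; it is the pickled transformers.training_args.TrainingArguments object that the Trainer saves alongside a checkpoint, which is why the Hub scanner reports the pickle imports listed above. A rough sketch of inspecting it locally, assuming you trust the repo (unpickling executes code) and have torch, transformers, and accelerate installed so the listed classes can be reconstructed:

    # Download training_args.bin and unpickle the saved TrainingArguments.
    import torch
    from huggingface_hub import hf_hub_download

    path = hf_hub_download(
        repo_id="jaggernaut007/roberta-large-finetuned-abbr-finetuned-ner",
        filename="training_args.bin",
    )
    # weights_only=False is required: the file stores a pickled object, not plain tensors.
    args = torch.load(path, weights_only=False)
    print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)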