HamidRezaAttar/gpt2-product-description-generator
Tags: Text Generation · Transformers · PyTorch · English · gpt2 · text-generation-inference · Inference Endpoints
arXiv: 1706.03762
License: apache-2.0
Files and versions at commit 9c8bf9e · 1 contributor · History: 13 commits
Latest commit: SFconvertbot, "Adding `safetensors` variant of this model" (9c8bf9e, verified, about 1 month ago)
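Since the repo ships a GPT-2 config, tokenizer, and weights, the model can presumably be loaded with the standard transformers text-generation pipeline; a minimal sketch (the prompt text is an invented example, and recent transformers releases will prefer the model.safetensors weights automatically):

```python
from transformers import pipeline

# Downloads ~510 MB of weights on first use.
generator = pipeline(
    "text-generation",
    model="HamidRezaAttar/gpt2-product-description-generator",
)

out = generator(
    "Maximize your living space with this",  # invented example prompt
    max_new_tokens=40,
)
print(out[0]["generated_text"])
```

The pipeline wraps tokenization, generation, and decoding in one call; generation parameters such as `max_new_tokens` are passed straight through to `model.generate`.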
.gitattributes · 1.31 kB · "Adding `safetensors` variant of this model" · about 1 month ago
.gitignore · 13 Bytes · "UPDATE .gitignore" · almost 3 years ago
README.md · 2.53 kB · "Update README.md" · 7 months ago
config.json · 907 Bytes · "First model version" · almost 3 years ago
dls_eng_Batch4.pkl · 138 MB · LFS · "First model version" · almost 3 years ago
  Flagged as an unsafe pickle. Detected pickle imports (28): fastcore.transform.Pipeline, tokenizers.Tokenizer, torch.device, __builtin__.getattr, pathlib.PosixPath, numpy.dtype, fastai.text.data.LMTensorText, __builtin__.unicode, fastai.data.core.DataLoaders, fastai.text.data._maybe_first, fastai.text.data.LMDataLoader, random.Random, fastcore.imports.noop, fastai.data.core.TfmdLists, _codecs.encode, numpy.core.multiarray._reconstruct, __main__.TransformersTokenizer, __builtin__.tuple, fastai.torch_core.Chunks, tokenizers.models.Model, transformers.models.gpt2.tokenization_gpt2_fast.GPT2TokenizerFast, torch.Tensor, fastcore.foundation.L, fastcore.xtras.ReindexCollection, fastai.data.load._wif, numpy.core.multiarray.scalar, fastai.data.load._FakeLoader, numpy.ndarray
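The "unsafe pickle" flag comes from scanning the file's import opcodes rather than executing it: unpickling runs whatever callables the stream names, so a listing like the one above is the only safe way to see what a pickle would pull in. A similar check can be done with the standard library's pickletools, which walks opcodes without unpickling. A rough sketch (the STACK_GLOBAL heuristic assumes the module and name strings are pushed immediately beforehand, which holds for typical pickles but is not guaranteed in general):

```python
import pickle
import pickletools
from collections import OrderedDict

def pickle_imports(data: bytes) -> set:
    """Collect (module, name) pairs a pickle would import, without loading it."""
    imports = set()
    recent_strings = []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocols 0-3 encode the pair in one opcode; genops renders
            # the argument as "module name".
            module, name = arg.split(" ", 1)
            imports.add((module, name))
        elif opcode.name == "STACK_GLOBAL" and len(recent_strings) >= 2:
            # Protocol 4+ pushes module and qualname as strings first.
            imports.add((recent_strings[-2], recent_strings[-1]))
        if isinstance(arg, str):
            recent_strings.append(arg)
    return imports

payload = pickle.dumps(OrderedDict(a=1))
print(pickle_imports(payload))  # includes ('collections', 'OrderedDict')
```

Running this over dls_eng_Batch4.pkl would reproduce a list like the 28 names above; anything outside an allowlist of known-benign classes is grounds to refuse loading.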
history_epoch1.csv · 382 Bytes · LFS · "First model version" · almost 3 years ago
model.safetensors · 510 MB · LFS · "Adding `safetensors` variant of this model" · about 1 month ago
pytorch_model.bin · 510 MB · LFS · "First model version" · almost 3 years ago
  Pickle file. Detected pickle imports (4): collections.OrderedDict, torch.ByteStorage, torch._utils._rebuild_tensor_v2, torch.FloatStorage
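pytorch_model.bin is a pickle-based checkpoint, which is why the scanner lists its imports; the four names flagged here are the benign ones torch itself needs to rebuild tensors. Beyond preferring the model.safetensors file, recent torch versions accept `weights_only=True` in `torch.load`, which restricts the unpickler to tensor and container types so an untrusted checkpoint cannot request arbitrary imports. A small local sketch (tiny_state.pt is a throwaway file created here, not part of this repo):

```python
import torch

# Write a tiny checkpoint, then reload it with the restricted unpickler.
# weights_only=True only permits tensors and plain containers, so the
# arbitrary imports a malicious pickle might name cannot execute code.
state = {"w": torch.zeros(2, 2)}
torch.save(state, "tiny_state.pt")
loaded = torch.load("tiny_state.pt", weights_only=True)
print(loaded["w"].shape)  # torch.Size([2, 2])
```

safetensors sidesteps the problem entirely: it is a flat tensor format with no embedded code, which is what the SFconvertbot commit adds to this repo.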
tokenizer.json · 1.36 MB · "add tokenizer.json" · almost 3 years ago