PrunaAI/dicta-il-dictalm-7b-QUANTO-int8bit-smashed
Library: Transformers · Tags: pruna-ai, Inference Endpoints
Branch 2410892 · 1 contributor · History: 3 commits

Latest commit by sharpenb: d22543029ac9a914debe12dd360e76f76e188543d045ea990745234810191367 (2410892, verified, 6 months ago)
Files (all last modified 6 months ago):

.gitattributes           1.52 kB
README.md                5.3 kB
merges.txt               1.27 MB
model.pt                 11.1 GB  (LFS; SHA256 9d2efffbae8848f5f9ee1175638cacbebb164642891e075812e837a12291681c)
smash_config.json        1.02 kB
special_tokens_map.json  567 Bytes
tokenizer.json           3.87 MB
tokenizer_config.json    883 Bytes
vocab.json               1.65 MB

.gitattributes was added in the initial commit; every other file was last touched in commit d22543029ac9a914debe12dd360e76f76e188543d045ea990745234810191367.

model.pt is a pickled checkpoint. The Hub's scanner detected 26 pickle imports:

torch._C._nn.gelu
torch.nn.modules.sparse.Embedding
__builtin__.set
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTMLP
torch.device
torch.int8
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.configuration_megatron_gpt.MegatronGPTConfig
torch.float16
torch._utils._rebuild_tensor_v2
torch.nn.modules.dropout.Dropout
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTLayer
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTAttention
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTLayerNorm
collections.OrderedDict
torch.HalfStorage
torch.nn.modules.container.ModuleList
torch.BoolStorage
torch.FloatStorage
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTForCausalLM
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTModel
torch._utils._rebuild_parameter
transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTRotaryEmbedding
transformers.activations.GELUActivation
quanto.nn.qlinear.QLinear
transformers.generation.configuration_utils.GenerationConfig
quanto.tensor.qtype.qtype
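The "Detected Pickle imports" list above comes from statically scanning the pickle stream of model.pt, without unpickling it. A minimal sketch of that kind of scan, using only the standard library's pickletools (this illustrates the idea; it is not Hugging Face's actual scanner, which handles more cases such as memoized strings):

```python
import pickle
import pickletools
from collections import OrderedDict

def detect_pickle_imports(data: bytes) -> set:
    """Statically list the module.attr names a pickle would import,
    without ever calling pickle.loads (so nothing gets executed)."""
    imports = set()
    recent_strings = []  # string constants pushed onto the stack so far
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocols <= 3: arg is "module attr" in one space-joined string.
            module, _, attr = arg.partition(" ")
            imports.add(f"{module}.{attr}")
        elif opcode.name == "STACK_GLOBAL":
            # Protocols >= 4: module and attr were pushed as the two most
            # recently seen string constants.
            if len(recent_strings) >= 2:
                imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
        if isinstance(arg, str):
            recent_strings.append(arg)
    return imports

# A harmless pickle that references collections.OrderedDict:
payload = pickle.dumps(OrderedDict(a=1))
print(detect_pickle_imports(payload))  # → {'collections.OrderedDict'}
```

Because it only reads opcodes, this kind of scan is safe to run on untrusted files; actually loading model.pt still executes the referenced constructors, so it should only be done for sources you trust.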