Latest commit: Delete spam_model.pkl (`7b04cc2`)

| File | Size | Last commit | Detected Pickle imports |
|---|---|---|---|
| - | 1.52 kB | initial commit | - |
| - | 1.71 kB | Upload 2 files | - |
| - | 1.38 kB | Upload 2 files | - |
| - | 236 Bytes | initial commit | - |
| - | 6.16 kB | Update app.py | - |
| bp_model.pkl | 4.3 kB | Upload 13 files | `BackPropogation.BackPropogation`, `numpy.core.multiarray._reconstruct`, `numpy.dtype`, `numpy.ndarray`, `numpy.core.multiarray.scalar` |
| bp_tokeniser.pkl | 4.99 MB | Upload 13 files | `collections.OrderedDict`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `builtins.int` |
| - | 457 kB | Upload dnn_model.h5 | - |
| dnn_tokeniser.pkl | 4.53 MB | Upload 13 files | `collections.OrderedDict`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `builtins.int` |
| - | 41.2 MB | Upload 2 files | - |
| lstm_tokeniser.pkl | 4.53 MB | Upload 13 files | `collections.OrderedDict`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `builtins.int` |
| ppn_model.pkl | 2.27 kB | Upload 13 files | `numpy.core.multiarray._reconstruct`, `numpy.ndarray`, `Perceptron.Perceptron`, `numpy.dtype` |
| ppn_tokeniser.pkl | 4.85 MB | Upload 13 files | `collections.OrderedDict`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `builtins.int` |
| - | 68 Bytes | Create requirements.txt | - |
| - | 2.27 MB | Upload 2 files | - |
| spam_tokeniser.pkl | 290 kB | Upload 13 files | `collections.OrderedDict`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `builtins.int` |
| - | 392 MB | Upload 13 files | - |
| - | 392 MB | Upload 13 files | - |
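The "Detected Pickle imports" warnings above come from Hugging Face's pickle scanner: unpickling a `.pkl` file will import and call the listed classes, so loading an untrusted pickle can execute arbitrary code. One standard mitigation (this is a hedged sketch, not the repo's actual loading code) is to override `pickle.Unpickler.find_class` with an explicit allow-list, so only the imports you expect, such as `collections.OrderedDict`, are permitted:

```python
import io
import pickle

class RestrictedUnpickler(pickle.Unpickler):
    """Unpickler that only resolves an explicit allow-list of globals."""

    # (module, name) pairs this loader will permit; everything else is rejected.
    # Extend this set with the classes your own pickles legitimately need.
    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        if (module, name) in self.ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked import: {module}.{name}")

def restricted_loads(data: bytes):
    """Like pickle.loads, but refuses non-allow-listed imports."""
    return RestrictedUnpickler(io.BytesIO(data)).load()
```

A pickle that only needs `collections.OrderedDict` loads normally, while one that pulls in anything else (for example `collections.defaultdict` with a `builtins.int` factory, as in the tokeniser files above) raises `UnpicklingError` instead of importing the class.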