prasadsachin committed
Commit 901d938
Parent(s): 808ebc9
Update README.md
README.md CHANGED
@@ -5,6 +5,7 @@ language:
 - en
 tags:
 - text-classification
+pipeline_tag: text-classification
 ---
 ## Model Overview
 DistilBert is a set of language models published by HuggingFace. They are efficient, distilled versions of BERT, and are intended for classification and embedding of text, not for text-generation. See the model card below for benchmarks, data sources, and intended use cases.
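The added `pipeline_tag` tells the Hub which inference widget and default pipeline to associate with this model. A minimal sketch of what that looks like from the `transformers` side; the checkpoint id below is a publicly available DistilBERT classifier used only as a stand-in, not necessarily the model this commit belongs to:

```python
from transformers import pipeline

# Load a text-classification pipeline. The model id is a stand-in example,
# not the specific checkpoint behind this model card.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Classify a short example sentence; the pipeline returns a label and score.
print(classifier("This update makes the model card much clearer."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```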