Divyasreepat committed
Commit be6d526 · verified · 1 Parent(s): 213ee98

Update README.md with new model card content

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -6,7 +6,7 @@ tags:
   - text-embedding
 pipeline_tag: text-classification
 ---
-## Model Overview
+### Model Overview
 DistilBert is a set of language models published by HuggingFace. They are efficient, distilled versions of BERT, intended for classification and embedding of text, not for text generation. See the model card below for benchmarks, data sources, and intended use cases.
 
 Weights and Keras model code are released under the [Apache 2 License](https://github.com/keras-team/keras-hub/blob/master/LICENSE).
@@ -147,4 +147,4 @@ classifier = keras_hub.models.DistilBertClassifier.from_preset(
     preprocessor=None,
 )
 classifier.fit(x=features, y=labels, batch_size=2)
-```
+```