---
library_name: keras-hub
---
This is a [`DistilBert` model](https://keras.io/api/keras_hub/models/distil_bert) uploaded using the KerasHub library. It can be used with the JAX, TensorFlow, and PyTorch backends.
Model config:

* **name:** distil_bert_backbone
* **trainable:** True
* **vocabulary_size:** 30522
* **num_layers:** 6
* **num_heads:** 12
* **hidden_dim:** 768
* **intermediate_dim:** 3072
* **dropout:** 0.1
* **max_sequence_length:** 512
This model card has been generated automatically and should be completed by the model author. See [Model Cards documentation](https://huggingface.co/docs/hub/model-cards) for more information.