---
library_name: keras-hub
---
This is a DistilBert model uploaded using the KerasHub library. It can be used with the JAX, TensorFlow, and PyTorch backends.
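Below is a minimal usage sketch, not a canonical recipe for this repository: the preset id is a placeholder that should be replaced with this repo's actual id, and the backend is selected via the `KERAS_BACKEND` environment variable before Keras is imported (any of `"jax"`, `"tensorflow"`, or `"torch"` works).

```python
import os

# Choose the Keras backend before importing Keras / KerasHub.
os.environ["KERAS_BACKEND"] = "jax"

import numpy as np
import keras_hub

# Load the backbone from a preset. The id below is a placeholder;
# replace it with this repository's id, e.g. "hf://<user>/<repo>".
backbone = keras_hub.models.DistilBertBackbone.from_preset(
    "hf://your-username/your-distilbert-repo"
)

# Example inputs: random token ids and an all-ones padding mask
# of shape (batch_size, sequence_length).
token_ids = np.random.randint(0, 30522, size=(1, 128))
padding_mask = np.ones((1, 128), dtype="int32")

# The backbone returns the final hidden states for each token.
outputs = backbone({"token_ids": token_ids, "padding_mask": padding_mask})
print(outputs.shape)  # (1, 128, 768)
```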
Model config (instantiated in the sketch after the list):
- name: distil_bert_backbone
- trainable: True
- vocabulary_size: 30522
- num_layers: 6
- num_heads: 12
- hidden_dim: 768
- intermediate_dim: 3072
- dropout: 0.1
- max_sequence_length: 512
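For illustration only, the same configuration can be built directly with `keras_hub.models.DistilBertBackbone`. This constructs a randomly initialized architecture that mirrors the values above; the trained weights themselves come from the uploaded preset.

```python
import keras_hub

# Build an untrained DistilBertBackbone matching the config listed above.
backbone = keras_hub.models.DistilBertBackbone(
    vocabulary_size=30522,
    num_layers=6,
    num_heads=12,
    hidden_dim=768,
    intermediate_dim=3072,
    dropout=0.1,
    max_sequence_length=512,
)
backbone.summary()
```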
This model card has been generated automatically and should be completed by the model author. See the Model Cards documentation for more information.