---
library_name: keras-hub
---
### Model Overview

DistilBert is a set of language models published by HuggingFace. They are efficient, distilled versions of BERT, intended for classification and embedding of text, not for text generation. See the model card below for benchmarks, data sources, and intended use cases.

Weights and Keras model code are released under the [Apache 2 License](https://github.com/keras-team/keras-hub/blob/master/LICENSE).

## Links

* [DistilBert Quickstart Notebook](https://www.kaggle.com/code/matthewdwatson/distilbert-quickstart)
* [DistilBert API Documentation](https://keras.io/api/keras_hub/models/distil_bert/)
* [DistilBert Model Card](https://huggingface.co/distilbert/distilbert-base-uncased)
* [KerasHub Beginner Guide](https://keras.io/guides/keras_hub/getting_started/)
* [KerasHub Model Publishing Guide](https://keras.io/guides/keras_hub/upload/)

## Installation

Keras and KerasHub can be installed with:

```
pip install -U -q keras-hub
pip install -U -q "keras>=3"
```

JAX, TensorFlow, and Torch come preinstalled in Kaggle Notebooks. For instructions on installing them in another environment, see the [Keras Getting Started](https://keras.io/getting_started/) page.

## Presets

The following model checkpoints are provided by the Keras team. Full code examples for each are available below.

| Preset name                 | Parameters | Description                                          |
|-----------------------------|------------|------------------------------------------------------|
| distil_bert_base_en_uncased | 66.36M     | 6-layer model where all input is lowercased.         |
| distil_bert_base_en         | 65.19M     | 6-layer model where case is maintained.              |
| distil_bert_base_multi      | 134.73M    | 6-layer multilingual model where case is maintained. |

### Example Usage

```python
import keras
import keras_hub
import numpy as np
```

Raw string data.

```python
features = ["The quick brown fox jumped.", "I forgot my homework."]
labels = [0, 3]

# Use a shorter sequence length.
preprocessor = keras_hub.models.DistilBertPreprocessor.from_preset(
    "distil_bert_base_en_uncased",
    sequence_length=128,
)
# Pretrained classifier.
classifier = keras_hub.models.DistilBertClassifier.from_preset(
    "distil_bert_base_en_uncased",
    num_classes=4,
    preprocessor=preprocessor,
)
classifier.fit(x=features, y=labels, batch_size=2)

# Re-compile (e.g., with a new learning rate).
classifier.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=keras.optimizers.Adam(5e-5),
    jit_compile=True,
)
# Access backbone programmatically (e.g., to change `trainable`).
classifier.backbone.trainable = False
# Fit again.
classifier.fit(x=features, y=labels, batch_size=2)
```

Preprocessed integer data.

```python
features = {
    "token_ids": np.ones(shape=(2, 12), dtype="int32"),
    "padding_mask": np.array([[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0]] * 2),
}
labels = [0, 3]

# Pretrained classifier without preprocessing.
classifier = keras_hub.models.DistilBertClassifier.from_preset(
    "distil_bert_base_en_uncased",
    num_classes=4,
    preprocessor=None,
)
classifier.fit(x=features, y=labels, batch_size=2)
```
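The overview notes that DistilBert is also intended for text embedding. As a minimal sketch (not from the model card), the backbone can be used directly for this; it is documented to return the sequence of hidden states, and the masked mean-pooling below is an illustrative choice, not part of the preset.

```python
# Embedding sketch. Assumes `DistilBertBackbone` returns the raw sequence
# output of shape (batch, sequence_length, hidden_dim); the masked
# mean-pooling is an illustrative choice, not part of the preset.
preprocessor = keras_hub.models.DistilBertPreprocessor.from_preset(
    "distil_bert_base_en_uncased",
    sequence_length=128,
)
backbone = keras_hub.models.DistilBertBackbone.from_preset(
    "distil_bert_base_en_uncased",
)
inputs = preprocessor(["The quick brown fox jumped."])
sequence_output = backbone(inputs)
# Average over non-padding tokens to get one vector per input string.
mask = keras.ops.cast(
    keras.ops.expand_dims(inputs["padding_mask"], -1), "float32"
)
embeddings = keras.ops.sum(sequence_output * mask, axis=1) / keras.ops.sum(
    mask, axis=1
)
```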
## Example Usage with Hugging Face URI

```python
import keras
import keras_hub
import numpy as np
```

Raw string data.

```python
features = ["The quick brown fox jumped.", "I forgot my homework."]
labels = [0, 3]

# Use a shorter sequence length.
preprocessor = keras_hub.models.DistilBertPreprocessor.from_preset(
    "hf://keras/distil_bert_base_en_uncased",
    sequence_length=128,
)
# Pretrained classifier.
classifier = keras_hub.models.DistilBertClassifier.from_preset(
    "hf://keras/distil_bert_base_en_uncased",
    num_classes=4,
    preprocessor=preprocessor,
)
classifier.fit(x=features, y=labels, batch_size=2)

# Re-compile (e.g., with a new learning rate).
classifier.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=keras.optimizers.Adam(5e-5),
    jit_compile=True,
)
# Access backbone programmatically (e.g., to change `trainable`).
classifier.backbone.trainable = False
# Fit again.
classifier.fit(x=features, y=labels, batch_size=2)
```

Preprocessed integer data.

```python
features = {
    "token_ids": np.ones(shape=(2, 12), dtype="int32"),
    "padding_mask": np.array([[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0]] * 2),
}
labels = [0, 3]

# Pretrained classifier without preprocessing.
classifier = keras_hub.models.DistilBertClassifier.from_preset(
    "hf://keras/distil_bert_base_en_uncased",
    num_classes=4,
    preprocessor=None,
)
classifier.fit(x=features, y=labels, batch_size=2)
```
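In either case, inference with the fine-tuned classifier uses the standard Keras `predict` API. A short sketch using the preprocessed `features` dict from the block above; the classifier's default compilation uses `from_logits=True`, so its outputs are logits and `np.argmax` recovers the predicted class indices.

```python
# Inference sketch using the preprocessed `features` from above.
# The classifier outputs logits; argmax gives the predicted class index.
logits = classifier.predict(features, batch_size=2)
predicted_classes = np.argmax(logits, axis=-1)
```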