---
library_name: keras-hub
---

Model Overview

BLOOM, as described in BLOOM: A 176B-Parameter Open-Access Multilingual Language Model, is a large language model published by BigScience. BLOOM can output coherent text in 46 natural languages and 13 programming languages. The BLOOM models provided here range in size from 0.5 billion to 3 billion parameters. See the model card below for benchmarks, data sources, and intended use cases.

Weights are released under the RAIL License. Keras model code is released under the Apache 2 License.

Links

Installation

Keras and KerasHub can be installed with:

pip install -U -q keras-hub
pip install -U -q "keras>=3"

JAX, TensorFlow, and PyTorch come preinstalled in Kaggle Notebooks. For instructions on installing them in another environment, see the Keras Getting Started page.
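Keras 3 reads the backend from the KERAS_BACKEND environment variable, so outside Kaggle you can select one before Keras is imported. A minimal sketch (JAX is used here as an example; "tensorflow" or "torch" work the same way):

```python
# Choose the Keras backend before keras or keras_hub is imported.
import os
os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow", "torch"

import keras
import keras_hub

print(keras.config.backend())  # confirms the active backend
```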

Presets

The following model checkpoints are provided by the Keras team. Full code examples for each are available below.

| Preset name | Parameters | Description |
|---|---|---|
| bloom_560m_multi | 559M | 560M base model |
| bloom_1.1b_multi | 1.06B | 1B base model |
| bloom_1.7b_multi | 1.72B | 1.7B base model |
| bloom_3b_multi | 3B | 3B base model |
| bloomz_560m_multi | 559M | 560M instruction-tuned model |
| bloomz_1.1b_multi | 1.06B | 1B instruction-tuned model |
| bloomz_1.7b_multi | 1.72B | 1.7B instruction-tuned model |
| bloomz_3b_multi | 3B | 3B instruction-tuned model |
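Any preset in the table can be loaded by passing its name to from_preset. A short sketch, assuming the KerasHub BloomCausalLM task class:

```python
import keras_hub

# Swap in any preset name from the table above, e.g. "bloomz_1.7b_multi".
# Weights are downloaded on first use.
bloom_lm = keras_hub.models.BloomCausalLM.from_preset("bloom_560m_multi")
```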

Prompts

Performance may vary depending on the prompt. For BLOOMZ models, we recommend making it very clear when the input stops, to avoid the model trying to continue it. For example, the prompt "Translate to English: Je t'aime" without the full stop (.) at the end may result in the model trying to continue the French sentence. Better prompts are, for example, "Translate to English: Je t'aime.", "Translate to English: Je t'aime. Translation:", or "What is "Je t'aime." in English?", where it is clear to the model when it should answer. We also recommend giving the model as much context as possible. For example, if you want it to answer in Telugu, tell the model explicitly: "Explain in a sentence in Telugu what is backpropagation in neural networks."
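As an illustration of the prompt advice above, a small sketch using the instruction-tuned bloomz_560m_multi preset (class and method names follow the usual KerasHub causal LM API):

```python
import keras_hub

bloomz_lm = keras_hub.models.BloomCausalLM.from_preset("bloomz_560m_multi")

# Ending the input with a full stop and an explicit cue such as "Translation:"
# makes it clear where the prompt stops and where the answer should begin.
prompt = "Translate to English: Je t'aime. Translation:"
print(bloomz_lm.generate(prompt, max_length=32))
```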

Example Usage
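A representative usage sketch, assuming the standard KerasHub causal LM workflow (from_preset, an optional compile to change the sampler, and generate):

```python
import keras_hub

# Load the 560M multilingual base model.
bloom_lm = keras_hub.models.BloomCausalLM.from_preset("bloom_560m_multi")

# Optionally switch the sampling strategy (e.g. "greedy", "top_k").
bloom_lm.compile(sampler="top_k")

# Generate for a single prompt or a batch of prompts.
print(bloom_lm.generate("Keras is a", max_length=64))
print(bloom_lm.generate(["Keras is a", "I love"], max_length=64))
```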

Example Usage with Hugging Face URI
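The same checkpoint can also be referenced through a Hugging Face URI. A sketch, assuming the hf://keras/bloom_560m_multi handle for this repository:

```python
import keras_hub

# Load the preset directly from its Hugging Face repository.
bloom_lm = keras_hub.models.BloomCausalLM.from_preset("hf://keras/bloom_560m_multi")
print(bloom_lm.generate("What is Keras?", max_length=64))
```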