Divyasreepat committed on
Commit b5b58e4
1 Parent(s): 395d266

Update README.md with new model card content

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -10,7 +10,7 @@ tags:
 - text-to-text-generation
 pipeline_tag: text-generation
 ---
-## Model Overview
+### Model Overview
 Gemma is Google's family of lightweight, state-of-the-art open models built from the same research and technology used to create the Gemini models. Gemma models are available with and without instruction tuning and come in two sizes: 2 billion and 7 billion parameters. Gemma 1.1 is the latest weights refresh. See the model card below for benchmarks, data sources, and intended use cases.
 
 Weights are released under the [Gemma License](https://www.kaggle.com/models/google/gemma/license/consent). Keras model code is released under the [Apache 2 License](https://github.com/keras-team/keras-hub/blob/master/LICENSE).
@@ -202,4 +202,4 @@ gemma_lm = keras_hub.models.GemmaCausalLM.from_preset(
     preprocessor=None,
 )
 gemma_lm.fit(x=x, y=y, sample_weight=sw, batch_size=2)
-```
+```
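
For context, the second hunk is the tail of the README's fine-tuning example, where preprocessing is run outside the model and `preprocessor=None` is passed to `from_preset` so `fit` receives pre-tokenized inputs. Below is a minimal sketch of how the `x`, `y`, and `sw` values referenced in that `fit` call might be produced; the preset name `gemma_1.1_instruct_2b_en` is an assumption and is not visible in this diff.

```python
# Sketch only: reconstructs the fine-tuning pattern the hunk above belongs to.
import keras_hub

features = ["The quick brown fox jumped.", "I forgot my homework."]

# Run the preprocessor separately to produce token ids (x), labels (y),
# and sample weights (sw), then build the model with preprocessor=None
# so it accepts these pre-tokenized inputs directly.
preprocessor = keras_hub.models.GemmaCausalLMPreprocessor.from_preset(
    "gemma_1.1_instruct_2b_en"  # assumed preset name
)
x, y, sw = preprocessor(features)

gemma_lm = keras_hub.models.GemmaCausalLM.from_preset(
    "gemma_1.1_instruct_2b_en",  # assumed preset name
    preprocessor=None,
)
gemma_lm.fit(x=x, y=y, sample_weight=sw, batch_size=2)
```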