
Gemma Model Card

This repository corresponds to the research Gemma repository in JAX. If you're looking for the Transformers JAX implementation, visit this page.

Model Page: Gemma

This model card corresponds to the 2B instruct version of the Gemma model for use with Flax. For more information about the model, visit https://huggingface.co/google/gemma-2b-it.

Resources and Technical Documentation:

Terms of Use: Terms

Authors: Google

Loading the model

To download the weights and tokenizer, run:

```python
from huggingface_hub import snapshot_download

local_dir = <PATH>
snapshot_download(repo_id="google/gemma-2b-it-flax", local_dir=local_dir)
```

Then download the sampling.py script from the Gemma GitHub repository and run python sampling.py with the --path_checkpoint and --path_tokenizer arguments pointing to your local download path.
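As a sketch of that invocation, the snippet below assembles the command line from the snapshot directory. The subdirectory names (`2b-it` for the checkpoint and `tokenizer.model` for the tokenizer) are assumptions about the snapshot layout; check the contents of your actual download before running.

```python
from pathlib import Path

# Hypothetical local directory that snapshot_download was pointed at.
local_dir = Path("/tmp/gemma-2b-it-flax")

# Assumed snapshot layout -- verify these names against your download.
path_checkpoint = local_dir / "2b-it"
path_tokenizer = local_dir / "tokenizer.model"

# The sampling script would then be invoked as:
cmd = (
    f"python sampling.py"
    f" --path_checkpoint={path_checkpoint}"
    f" --path_tokenizer={path_tokenizer}"
)
print(cmd)
```

Running the printed command requires the downloaded weights and the sampling.py script to be present in your working directory.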
