---
library_name: jax
tags:
  - gemma_jax
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: >-
  To access Gemma on Hugging Face, you’re required to review and agree to
  Google’s usage license. To do this, please ensure you’re logged-in to Hugging
  Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
license: other
license_name: gemma-terms-of-use
license_link: https://ai.google.dev/gemma/terms
pipeline_tag: text-generation
---

# Gemma Model Card

This repository corresponds to the research Gemma repository in JAX. If you're looking for the transformers JAX implementation, visit [google/gemma-7b-it](https://huggingface.co/google/gemma-7b-it).

**Model Page**: Gemma

This model card corresponds to the 7B instruct version of the Gemma model for usage with Flax. For more information about the model, visit https://huggingface.co/google/gemma-7b-it.

**Resources and Technical Documentation**:

**Terms of Use**: [Terms](https://ai.google.dev/gemma/terms)

**Authors**: Google

## Loading the model

To download the weights and tokenizer, run:

```python
from huggingface_hub import snapshot_download

# Destination directory for the checkpoint and tokenizer.
local_dir = "<PATH>"
snapshot_download(repo_id="google/gemma-7b-it-flax", local_dir=local_dir)
```

Then download the `sampling.py` example script from the gemma GitHub repository (https://github.com/google-deepmind/gemma) and call `python sampling.py` with the `--path_checkpoint` and `--path_tokenizer` arguments pointing to your local download path.
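
For example, the invocation might look like the following sketch. The checkpoint subdirectory and tokenizer file names below are assumptions, so check the contents of your local download and adjust the paths accordingly:

```sh
# Hypothetical layout: replace <PATH> with the local_dir from the download step,
# and adjust the checkpoint folder and tokenizer file names to match your snapshot.
python sampling.py \
  --path_checkpoint=<PATH>/7b-it/ \
  --path_tokenizer=<PATH>/tokenizer.model
```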