---
library_name: keras-hub
license: gemma
pipeline_tag: text-generation
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: To access Gemma on Hugging Face, you’re required to review and
  agree to Google’s usage license. To do this, please ensure you’re logged-in to Hugging
  Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
---

# Gemma 1

**Google Model Page**: [Gemma](https://ai.google.dev/gemma/docs)

This model card corresponds to the latest 2B base version of the Gemma model in Keras.

Keras models can be used with JAX, PyTorch, or TensorFlow as numerical backends. JAX, with its support for SPMD model parallelism, is recommended for large models. For more information, see [distributed training with Keras and JAX](https://keras.io/guides/distribution/).
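
As a minimal sketch of that path, the snippet below selects the JAX backend and shards the model across the available accelerators with the Keras 3 distribution API. The mesh shape and the `token_embedding/embeddings` layout rule are illustrative placeholders, and the keyword-only `ModelParallel` signature used here matches recent Keras releases; consult the guide linked above for complete, version-specific layouts.

```python
import os

# The backend must be selected before Keras is imported.
os.environ["KERAS_BACKEND"] = "jax"

import keras
import keras_nlp

# Build a 1 x N device mesh: a "batch" axis for data and a "model" axis
# across which large weight matrices are sharded.
devices = keras.distribution.list_devices()
mesh = keras.distribution.DeviceMesh(
    shape=(1, len(devices)), axis_names=("batch", "model"), devices=devices
)

# Illustrative sharding rule; see the distribution guide for complete layouts.
layout_map = keras.distribution.LayoutMap(mesh)
layout_map["token_embedding/embeddings"] = (None, "model")

keras.distribution.set_distribution(
    keras.distribution.ModelParallel(layout_map=layout_map, batch_dim_name="batch")
)

# Variables created from here on follow the layout rules above.
gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("hf://google/gemma-2b-keras")
```
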
You can find other models in the Gemma family here:

|    | Base | Instruct |
|----|------|----------|
| 2B | [**gemma-2b-keras**](https://huggingface.co/google/gemma-2b-keras) | [gemma-1.1-2b-it-keras](https://huggingface.co/google/gemma-1.1-2b-it-keras) |
| 7B | [gemma-7b-keras](https://huggingface.co/google/gemma-7b-keras) | [gemma-1.1-7b-it-keras](https://huggingface.co/google/gemma-1.1-7b-it-keras) |

For more information about the model, visit https://huggingface.co/google/gemma-2b.

**Resources and Technical Documentation**:

* [Responsible Generative AI Toolkit](https://ai.google.dev/responsible)
* [Gemma on Kaggle](https://www.kaggle.com/models/google/gemma)
* [Gemma on Vertex Model Garden](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/335?version=gemma-2b-gg-hf)

**Terms of Use**: [Terms](https://www.kaggle.com/models/google/gemma/license/consent/verify/huggingface?returnModelRepoId=google/gemma-2b-keras)

**Authors**: Google

## Loading the model

```python
import keras_nlp

# Download the preset weights and tokenizer from the Hugging Face Hub
# and instantiate the causal language model.
gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("hf://google/gemma-2b-keras")
```
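
Once loaded, the model can generate text directly; the prompt, sampler choice, and `max_length` below are arbitrary illustrations rather than recommended settings.

```python
# Pick a sampler and generate a continuation for a prompt.
gemma_lm.compile(sampler="top_k")
print(gemma_lm.generate("What is Keras?", max_length=64))
```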