Update README.md with new model card content
README.md
@@ -13,7 +13,7 @@ Weights are released under the [RAIL License](https://www.licenses.ai/ai-license

## Links

* [BLOOM Quickstart Notebook](https://www.kaggle.com/code/laxmareddypatlolla/bloom-quickstart)
* [BLOOM API Documentation](https://keras.io/api/keras_hub/models/bloom/)
* [BLOOM Model Card](https://huggingface.co/bigscience/bloom)
* [KerasHub Beginner Guide](https://keras.io/guides/keras_hub/getting_started/)
@@ -49,3 +49,55 @@ The following model checkpoints are provided by the Keras team. Full code exampl

Performance may vary depending on the prompt. For BLOOMZ models, we recommend making it very clear where the input ends, so that the model does not try to continue it. For example, the prompt "Translate to English: Je t'aime" without the full stop (.) at the end may cause the model to continue the French sentence instead of translating it. Better prompts are, e.g., "Translate to English: Je t'aime.", "Translate to English: Je t'aime. Translation:", or "What is "Je t'aime." in English?", where it is clear to the model when it should answer. We also recommend giving the model as much context as possible. For example, if you want the answer in Telugu, say so explicitly: "Explain in a sentence in Telugu what backpropagation is in neural networks."
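As a concrete illustration of these prompting tips, here is a minimal sketch comparing an under-specified prompt with a clearer one. It assumes a BLOOMZ preset name such as `"bloomz_1.1b_multi"` (substitute whichever BLOOMZ checkpoint you actually use) and reuses the same `BloomCausalLM` API shown in the examples below.

```python
import os

os.environ["KERAS_BACKEND"] = "jax"

import keras
import keras_hub

keras.config.set_floatx("bfloat16")

# "bloomz_1.1b_multi" is an assumed preset name; swap in the BLOOMZ
# checkpoint you actually use.
bloomz_lm = keras_hub.models.BloomCausalLM.from_preset("bloomz_1.1b_multi")

prompts = [
    # Under-specified: without the trailing full stop, the model may simply
    # keep writing the French sentence instead of translating it.
    "Translate to English: Je t'aime",
    # Clearer: the full stop and the "Translation:" cue mark where the input
    # ends and where the answer should begin.
    "Translate to English: Je t'aime. Translation:",
]

for prompt in prompts:
    print(bloomz_lm.generate(prompt, max_length=64))
```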
## Example Usage

```python
import os

# The backend must be selected before Keras is imported.
os.environ["KERAS_BACKEND"] = "jax"

import keras
import keras_hub

# When running only inference, bfloat16 saves memory usage significantly.
keras.config.set_floatx("bfloat16")

# Load the preset weights and tokenizer.
bloom_lm = keras_hub.models.BloomCausalLM.from_preset(
    "bloom_1.1b_multi"
)
bloom_lm.summary()

outputs = bloom_lm.generate([
    "What is Keras?",
], max_length=512)

for output in outputs:
    print(output)
```
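`generate()` uses whatever sampler the task was compiled with. If you want to experiment with the decoding strategy, KerasHub causal LM tasks can generally be recompiled with a different sampler. The snippet below is a sketch that continues from the example above (it reuses `bloom_lm`); `keras_hub.samplers.TopKSampler` and the `sampler` argument to `compile()` reflect current KerasHub releases, but verify against your installed version.

```python
# Continues from the example above: `bloom_lm` is already loaded.
# Swap the decoding strategy used by generate().
bloom_lm.compile(sampler=keras_hub.samplers.TopKSampler(k=10))
print(bloom_lm.generate("What is Keras?", max_length=256))
```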
## Example Usage with Hugging Face URI

```python
import os

# The backend must be selected before Keras is imported.
os.environ["KERAS_BACKEND"] = "jax"

import keras
import keras_hub

# When running only inference, bfloat16 saves memory usage significantly.
keras.config.set_floatx("bfloat16")

# Load the same preset directly from its Hugging Face Hub URI.
bloom_lm = keras_hub.models.BloomCausalLM.from_preset(
    "hf://keras/bloom_1.1b_multi"
)
bloom_lm.summary()

outputs = bloom_lm.generate([
    "What is Keras?",
], max_length=512)

for output in outputs:
    print(output)
```