Added link to optimum neuron
README.md
CHANGED
```diff
@@ -32,9 +32,9 @@ prompt_template: '<|system|>
 ---
 # Neuronx model for Zephyr-7b-beta
 
-This repository contains [AWS Inferentia2](https://aws.amazon.com/ec2/instance-types/inf2/) and [
+This repository contains [AWS Inferentia2](https://aws.amazon.com/ec2/instance-types/inf2/) and [neuronx](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/) compatible checkpoints for [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta).
 
-However, this file includes an example of how to compile various versions of Zephyr. Support isn’t available yet (as of 1/9/2024) in the optimum-neuron framework, so we use the base transformers library.
+However, this file includes an example of how to compile various versions of Zephyr. Support isn’t available yet (as of 1/9/2024) in the [optimum neuron](https://huggingface.co/docs/optimum-neuron/index) framework, so we use the base transformers library.
 
 These instructions closely follow the [Developer Guide](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/libraries/transformers-neuronx/transformers-neuronx-developer-guide.html#grouped-query-attention-gqa-support-beta). Look there for more detailed explanations, especially for the GQA settings.
 
```
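The README this commit edits points at transformers-neuronx rather than optimum-neuron for compilation. As a rough sketch of what that flow looks like for Zephyr-7b-beta (a Mistral fine-tune), assuming the `MistralForSampling`/`NeuronConfig`/`GQA` API shown in the linked Developer Guide — the `tp_degree`, batch size, GQA strategy, and prompt below are illustrative choices, not the exact settings used for these checkpoints, and running it requires an inf2 instance with the Neuron SDK installed:

```python
import torch
from transformers import AutoTokenizer
from transformers_neuronx import MistralForSampling, GQA, NeuronConfig

# GQA sharding strategy; the Developer Guide's GQA section lists the
# alternatives and when each applies. SHARD_OVER_HEADS is an assumption here.
neuron_config = NeuronConfig(group_query_attention=GQA.SHARD_OVER_HEADS)

# Load the checkpoint and compile it for NeuronCores.
model = MistralForSampling.from_pretrained(
    "HuggingFaceH4/zephyr-7b-beta",
    batch_size=1,
    tp_degree=2,  # tensor-parallel degree; match your available NeuronCores
    amp="bf16",
    neuron_config=neuron_config,
)
model.to_neuron()  # triggers compilation for Inferentia2

# Generate with the compiled model, using Zephyr's chat template markers.
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta")
prompt = "<|system|>\nYou are a friendly chatbot.</s>\n<|user|>\nHello!</s>\n<|assistant|>\n"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
with torch.inference_mode():
    generated = model.sample(input_ids, sequence_length=256, top_k=50)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Compilation happens at `to_neuron()` time, which is why the repository ships pre-compiled checkpoints: it lets users skip that step.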