ritikk and jburtoft committed
Commit d6d7528
Parent: 02118ac

Update README.md (#1)


- Update README.md (221667b2db085e48c3bbe5923802fe9ee7081358)


Co-authored-by: Jim Burtoft <[email protected]>

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -43,7 +43,7 @@ This model has been compiled to run on an inf2.xlarge (the smallest Inferentia2
 
 ## Set up the environment
 
-First, use the [DLAMI image from Hugging Face](https://aws.amazon.com/marketplace/pp/prodview-gr3e6yiscria2). It has most of the utilities and drivers preinstalled. However, you will need to update transformers-neuronx from source to get Mistral support.
+First, use the [DLAMI image from Hugging Face](https://aws.amazon.com/marketplace/pp/prodview-gr3e6yiscria2). It has most of the utilities and drivers preinstalled. However, you will need to update transformers-neuronx from source to get Mistral/Zephyr support.
 
 
 ```
@@ -52,9 +52,9 @@ python -m pip install git+https://github.com/aws-neuron/transformers-neuronx.git
 
 ## Running inference from this repository
 
-If you want to run a quick test or if the exact model you want to use is [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), you can run it directly using the steps below. Otherwise, jump to the Compilation of other Mistral versions section.
+If you want to run a quick test or if the exact model you want to use is [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), you can run it directly using the steps below. Otherwise, jump to the Compilation of other Mistral/Zephyr versions section.
 
-First, you will need a local copy of the library. This is because one of the nice things that the Hugging Face optimum library does is abstract local loads from repository loads. However, Mistral inference isn't supported yet.
+First, you will need a local copy of the library. This is because one of the nice things that the Hugging Face optimum library does is abstract local loads from repository loads. However, Mistral/Zephyr inference isn't supported yet.
 
 
 ```
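For context on what this change enables: once transformers-neuronx is installed from source (the `python -m pip install git+https://github.com/aws-neuron/transformers-neuronx.git` step visible in the second hunk's context line), a Mistral-family checkpoint such as zephyr-7b-beta can be compiled and sampled on the two NeuronCores of an inf2.xlarge. The sketch below is illustrative rather than taken from this README: the `transformers_neuronx.mistral.model.MistralForSampling` import path, the `tp_degree`/`amp` arguments, and the Zephyr prompt format are assumptions based on the library's documented sampling pattern for Llama-style models.

```python
import torch
from transformers import AutoTokenizer
# Assumed available once transformers-neuronx is installed from source
# with Mistral support, as the updated README describes.
from transformers_neuronx.mistral.model import MistralForSampling

model_id = "HuggingFaceH4/zephyr-7b-beta"

# Shard across the two NeuronCores of an inf2.xlarge and compile.
model = MistralForSampling.from_pretrained(model_id, tp_degree=2, amp="bf16")
model.to_neuron()

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Zephyr-beta's chat format: system / user / assistant turns.
prompt = (
    "<|system|>\nYou are a friendly chatbot.</s>\n"
    "<|user|>\nWhat is Inferentia2?</s>\n"
    "<|assistant|>\n"
)
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.inference_mode():
    generated = model.sample(input_ids, sequence_length=256)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Expect the first `to_neuron()` call to take a while, since the compute graphs are compiled for the Neuron hardware on the fly.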