Add warning because of llama.cpp issue #9127
README.md CHANGED

@@ -27,7 +27,8 @@ tags:
 
 ---
 
-
+> [!Warning]
+> I cannot recommend this model at the present time. Users have reported problems with the model not stopping generating. See [Bug: phi 3.5 mini produces garbage past 4096 context #9127](https://github.com/ggerganov/llama.cpp/issues/9127)
 
 > [!NOTE]
 > This is a model that is assumed to perform well, but may require more testing and user feedback. Be aware, only models featured within the GUI of GPT4All, are curated and officially supported by Nomic. Use at your own risk.
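For readers running into the behaviour the warning describes, here is a minimal sketch of one possible mitigation, assuming the GGUF is loaded through llama-cpp-python rather than the GPT4All GUI. The model filename, the Phi-3.5 `<|user|>`/`<|assistant|>`/`<|end|>` prompt markers, and the choice of stop string are illustrative assumptions, not part of this commit, and keeping the context at 4096 tokens only sidesteps the linked llama.cpp issue rather than fixing it.

```python
# Sketch only: assumes llama-cpp-python is installed, the GGUF filename below is
# hypothetical, and <|end|> is assumed to be the Phi-3.5 end-of-turn token.
from llama_cpp import Llama

# Keep the context window at 4096 tokens; the linked issue reports garbage
# output once the context grows past that size.
llm = Llama(model_path="Phi-3.5-mini-instruct-Q4_0.gguf", n_ctx=4096)

# Cap max_tokens and pass an explicit stop string so a runaway generation
# (the "model not stopping" reports above) is cut off client-side.
output = llm(
    "<|user|>\nSummarize llama.cpp issue #9127 in one sentence.<|end|>\n<|assistant|>\n",
    max_tokens=256,
    stop=["<|end|>"],
)
print(output["choices"][0]["text"])
```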