cmp-nct committed on
Commit
8c988bb
1 Parent(s): bebb5f6

Update README.md

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -1,11 +1,12 @@
-Base PR: https://github.com/ggerganov/llama.cpp/pull/5267
-Note: integration in llama.cpp is partial at this point - it uses llava-1.5 preprocessing without applying that PR
-
+---
+license: apache-2.0
+---
+Required PR: https://github.com/ggerganov/llama.cpp/pull/5267
 Important: Verify that processing a simple question with any image uses at least 1200 tokens of prompt processing; that shows that the new PR is in use.
 If your prompt is just 576 + a few tokens, you are using llava-1.5 code (or projector), and this is incompatible with llava-1.6.
 
 
-
 The mmproj files are the embedded ViTs that came with llava-1.6. I've not compared them, but given the previous releases from the team, I'd be surprised if the ViT has not been fine-tuned this time.
-If that's the case, using another ViT can cause issues.
+If that's the case, using another ViT can cause issues.
 
+Original models: https://github.com/haotian-liu/LLaVA
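
As context for the 576-vs-1200 token check in the README text above, here is a rough sketch of where those counts come from. It assumes the CLIP ViT-L/14 vision tower at 336x336 resolution that llava models use, and llava-1.6's multi-crop preprocessing (a base view plus a grid of additional crops, here a 2x2 grid); the exact crop count varies with image aspect ratio.

```python
# Sketch (not from the README): why a llava-1.6 prompt should contain far
# more than 576 image tokens. Assumptions: ViT-L/14 tower, 336x336 input
# per crop, and a base view + 2x2 grid of crops for llava-1.6.

PATCH_SIZE = 14   # ViT patch size (assumed)
RESOLUTION = 336  # input resolution per crop (assumed)

# Each 336x336 crop yields a 24x24 grid of patches -> 576 image tokens.
tokens_per_crop = (RESOLUTION // PATCH_SIZE) ** 2

# llava-1.5 encodes a single crop: exactly 576 image tokens,
# matching the "just 576 + a few tokens" failure mode in the README.
llava15_tokens = tokens_per_crop

# llava-1.6 encodes the base view plus (here) 4 grid crops,
# so the image alone contributes several times more tokens.
llava16_tokens = tokens_per_crop * (1 + 4)

print(llava15_tokens)  # 576
print(llava16_tokens)  # 2880
```

So a prompt that stays near 576 tokens indicates the single-crop llava-1.5 path, while the llava-1.6 path comfortably exceeds the 1200-token threshold the README asks you to verify.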