---
license: apache-2.0
datasets:
  - augmxnt/ultra-orca-boros-en-ja-v1
language:
  - ja
  - en
---

# shisa-gamma-7b-v1

For more information, see our main Shisa 7B model.

We applied a version of our fine-tuning dataset to Japanese Stable LM Base Gamma 7B, and it performed well; we're sharing it in case it's of interest.

Check out our JA MT-Bench results:

- Comparison vs shisa-7b-v1
- Comparison vs other recently released JA models