---
license: other
license_name: creative-commons-by-nc
task_categories:
  - question-answering
language:
  - zh
tags:
  - traditional chinese
  - finance
  - medical
  - taiwan
  - benchmark
  - zh-tw
  - zh-hant
pretty_name: tmmlu++
size_categories:
  - 100K<n<1M
---

# TMMLU+: Large-scale Traditional Chinese massive multitask language understanding

*(Logo: a paper note with "TMMLU+" written across it in bold black text.)*

Join us to work on multimodal LLMs: https://ikala.ai/recruit/

We present TMMLU+, a Traditional Chinese massive multitask language understanding dataset. TMMLU+ is a multiple-choice question-answering benchmark covering 66 subjects, spanning elementary to professional level.
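As a rough illustration of how such a multiple-choice benchmark is scored, the sketch below computes accuracy over records in a four-option format. The field names (`question`, `A`–`D`, `answer`) and the sample record are assumptions for illustration, not the dataset's documented schema.

```python
# Hypothetical sketch of scoring a model on TMMLU+-style records.
# Field names ("question", "A".."D", "answer") are assumptions, not
# the dataset's documented schema.

def accuracy(records, predict):
    """Fraction of multiple-choice records the model answers correctly."""
    correct = sum(1 for r in records if predict(r) == r["answer"])
    return correct / len(records)

# A toy record in the assumed four-option multiple-choice format.
sample = [
    {
        "question": "台灣最高的山是哪一座？",  # "Which is Taiwan's highest mountain?"
        "A": "玉山", "B": "雪山", "C": "合歡山", "D": "阿里山",
        "answer": "A",
    }
]

# A dummy "model" that always picks option A scores 1.0 on this sample.
print(accuracy(sample, lambda r: "A"))
```

In practice, `predict` would wrap a language model prompted with the question and the four options, returning the chosen option letter.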

The TMMLU+ dataset is six times larger than its predecessor, TMMLU, and has more balanced subject coverage. We include benchmark results on TMMLU+ for closed-source models and 20 open-weight Chinese large language models ranging from 1.8B to 72B parameters. The results show that models trained primarily on Traditional Chinese still lag behind major models trained on Simplified Chinese.