PLaMo-100B: A Ground-Up Language Model Designed for Japanese Proficiency Paper • 2410.07563 • Published Oct 10 • 2
Gemma 2 JPN Release Collection A Gemma 2 2B model fine-tuned on Japanese text. It supports Japanese at the same level of performance as English-only queries on Gemma 2. • 3 items • Updated 12 days ago • 26
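A minimal usage sketch for trying the instruction-tuned release in this collection via Transformers; the checkpoint name google/gemma-2-2b-jpn-it is assumed to be the chat model in the collection.

```python
# Minimal sketch: querying the Gemma 2 JPN instruction-tuned checkpoint.
# Assumes the collection's chat model is published as "google/gemma-2-2b-jpn-it".
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="google/gemma-2-2b-jpn-it",  # assumed checkpoint name
    device_map="auto",
)

messages = [{"role": "user", "content": "日本の首都はどこですか？"}]  # "What is the capital of Japan?"
out = chat(messages, max_new_tokens=64)
print(out[0]["generated_text"][-1]["content"])  # assistant reply
```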
MS MARCO Mined Triplets Collection These datasets contain MS MARCO (query, positive, negative) triplets gathered by mining hard negatives with various models; each dataset has several subsets. • 14 items • Updated May 21 • 11
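To make the mining procedure concrete, here is a minimal sketch of hard-negative mining with a bi-encoder: for each query, the top-ranked corpus passages that are not the labeled positive become hard negatives. The embedding model and toy data below are illustrative placeholders, not the actual models or corpus used to build these datasets.

```python
# Minimal sketch of hard-negative mining for (query, positive, negative) triplets.
# The model and data are illustrative stand-ins, not what built these datasets.
from sentence_transformers import SentenceTransformer
import torch

model = SentenceTransformer("all-MiniLM-L6-v2")  # any bi-encoder works

queries = ["what is the capital of japan"]
positives = ["Tokyo is the capital of Japan."]
corpus = [
    "Tokyo is the capital of Japan.",
    "Kyoto was the capital of Japan until 1868.",  # plausible but wrong: a good hard negative
    "Mount Fuji is the highest mountain in Japan.",
]

q_emb = model.encode(queries, convert_to_tensor=True, normalize_embeddings=True)
c_emb = model.encode(corpus, convert_to_tensor=True, normalize_embeddings=True)
scores = q_emb @ c_emb.T  # cosine similarity, since embeddings are normalized

for qi, query in enumerate(queries):
    ranked = torch.argsort(scores[qi], descending=True).tolist()
    # Hard negatives: highest-scoring passages that are not the labeled positive.
    hard_negatives = [corpus[i] for i in ranked if corpus[i] != positives[qi]]
    print(query, "->", hard_negatives[0])
```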
Qwen2.5 Collection Qwen2.5 language models, including pretrained and instruction-tuned variants in 7 sizes: 0.5B, 1.5B, 3B, 7B, 14B, 32B, and 72B. • 45 items • Updated 28 days ago • 444
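A minimal sketch of running one of the instruction-tuned sizes with a chat template; "Qwen/Qwen2.5-0.5B-Instruct" is used here only because it is the smallest listed size, and any of the seven would work the same way.

```python
# Minimal sketch: prompting an instruction-tuned Qwen2.5 model via its chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-0.5B-Instruct"  # smallest listed size, chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize what Qwen2.5 is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```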
LLM-jp: A Cross-organizational Project for the Research and Development of Fully Open Japanese LLMs Paper • 2407.03963 • Published Jul 4 • 15
Continual Pre-Training for Cross-Lingual LLM Adaptation: Enhancing Japanese Language Capabilities Paper • 2404.17790 • Published Apr 27 • 5
Building a Large Japanese Web Corpus for Large Language Models Paper • 2404.17733 • Published Apr 27 • 4
JaColBERT and Hard Negatives, Towards Better Japanese-First Embeddings for Retrieval: Early Technical Report Paper • 2312.16144 • Published Dec 26, 2023 • 3
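JaColBERT follows the ColBERT late-interaction design; as a rough illustration of that scoring rule, each query token embedding is matched to its most similar document token embedding and the maxima are summed (MaxSim). The random tensors below are stand-ins for per-token embeddings, not JaColBERT's actual outputs.

```python
# Minimal sketch of ColBERT-style late-interaction (MaxSim) scoring,
# the retrieval mechanism JaColBERT builds on. Random vectors stand in
# for per-token query/document embeddings.
import torch

torch.manual_seed(0)
dim = 128
q_tokens = torch.nn.functional.normalize(torch.randn(8, dim), dim=-1)   # 8 query tokens
d_tokens = torch.nn.functional.normalize(torch.randn(50, dim), dim=-1)  # 50 document tokens

# MaxSim: for every query token, take its best-matching document token,
# then sum those maxima to get the query-document relevance score.
sim = q_tokens @ d_tokens.T           # (8, 50) token-level cosine similarities
score = sim.max(dim=-1).values.sum()  # scalar relevance score
print(float(score))
```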