---
license: apache-2.0
datasets:
- SkelterLabsInc/JaQuAD
language:
- ja
---
# MambaSan-370m 🐍
MambaSan-370m is the first Japanese chat language model based on the Mamba state-space architecture.
The model builds on Albert Gu and Tri Dao's work *Mamba: Linear-Time Sequence Modeling with Selective State Spaces* ([paper](https://arxiv.org/abs/2312.00752)) as well as their model implementation.
The code used for pretraining will soon be published on my GitHub: https://github.com/lcabannes
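
The sketch below shows one way the model might be loaded and prompted. It is not taken from this card: it assumes the checkpoint is compatible with the `mamba_ssm` reference implementation (`MambaLMHeadModel`), that a tokenizer is published in the same Hugging Face repo, and that a CUDA GPU is available (the reference kernels require one). Adjust the names if your setup differs.

```python
# Minimal usage sketch, assuming mamba_ssm compatibility and a bundled tokenizer.
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

repo = "loiccabannes/MambaSan-370m"  # assumed to contain both weights and tokenizer

# Load the tokenizer and the Mamba language-model head from the Hub.
tokenizer = AutoTokenizer.from_pretrained(repo)
model = MambaLMHeadModel.from_pretrained(repo, device="cuda", dtype=torch.float16)

# Encode a Japanese prompt and sample a continuation.
prompt = "日本の首都は"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to("cuda")
out = model.generate(input_ids, max_length=64, temperature=0.8, top_p=0.9)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

If the repo does not ship a tokenizer, substitute the Japanese tokenizer used during pretraining.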
## Citation
```bibtex
@misc{lcabannes2024MambaSan-370m,
  title = {MambaSan-370m},
  author = {Loïc Cabannes},
  year = {2024},
  howpublished = {HuggingFace},
  url = {https://huggingface.co/loiccabannes/MambaSan-370m/}
}
```