---
language:
- uk
- en
tags:
- t5
---

The aim is to compress the mT5-large model so that it covers only Ukrainian and some basic English. This reproduces the result (for a different language) from [this](https://towardsdatascience.com/how-to-adapt-a-multilingual-t5-model-for-a-single-language-b9f94f3d9c90) Medium article.

Results:
- 1.2B params -> 779M params (a 37% reduction)
- 250K tokens -> 8900 tokens
- 4.6GB model size -> 2.9GB model size
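
The core of the compression is vocabulary pruning: count which sentencepiece tokens actually occur in a Ukrainian/English corpus, keep only those, and slice the corresponding rows out of the embedding matrix (and the tied output matrix in T5-style models). A minimal sketch of the row-slicing step, using a tiny hypothetical embedding instead of the real 250K-token mT5 one (the token ids in `kept_ids` are an illustrative stand-in for the frequency-selected set):

```python
import torch
import torch.nn as nn

# Miniature stand-in: a "vocabulary" of 100 tokens, embedding dim 8.
# In the real procedure the vocabulary is mT5's 250K sentencepiece tokens
# and kept_ids comes from counting token frequencies on a uk/en corpus.
old_vocab_size, dim = 100, 8
old_embedding = nn.Embedding(old_vocab_size, dim)

# Hypothetical kept set: special tokens 0-2 plus every third id.
kept_ids = sorted({0, 1, 2} | set(range(3, old_vocab_size, 3)))

# Build the smaller embedding by copying only the kept rows; the same
# slicing is applied to the tied lm_head matrix in T5-style models.
new_embedding = nn.Embedding(len(kept_ids), dim)
with torch.no_grad():
    new_embedding.weight.copy_(old_embedding.weight[kept_ids])

# Old id -> new id mapping, needed to remap the tokenizer's vocabulary.
old2new = {old: new for new, old in enumerate(kept_ids)}
```

The tokenizer's sentencepiece model has to be rebuilt with the same kept pieces, so that the new token ids line up with the rows of the pruned embedding.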