Br-T-3.preview is a model published in late January / early February 2025.
The model contains 50 LSTM and 50 GRU layers.
It has 512,000 LSTM units and 256,000 GRU units, for a total of 778,000 neurons.
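To make the architecture concrete, here is a minimal sketch of such a recurrent stack, assuming PyTorch; the hidden width, vocabulary size, and the LSTM-then-GRU ordering are placeholders, since the card only states the layer and unit counts.

```python
# Minimal sketch of a stacked LSTM + GRU network (no Transformer layers).
# Layer counts follow the card; hidden_size, vocab_size, and the wiring
# order are assumptions, not the model's actual configuration.
import torch
import torch.nn as nn

class RecurrentStack(nn.Module):
    def __init__(self, vocab_size=32000, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        # 50 stacked LSTM layers followed by 50 stacked GRU layers,
        # mirroring the layer counts stated on the card.
        self.lstm = nn.LSTM(hidden_size, hidden_size, num_layers=50, batch_first=True)
        self.gru = nn.GRU(hidden_size, hidden_size, num_layers=50, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, hidden)
        x, _ = self.lstm(x)         # recurrent state carries sequential dependencies
        x, _ = self.gru(x)
        return self.head(x)         # next-token logits

model = RecurrentStack()
logits = model(torch.randint(0, 32000, (1, 16)))
print(logits.shape)  # torch.Size([1, 16, 32000])
```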
The model's MMLU result has not been published yet.
The model is trained in three languages: Turkish (TR), German (DE), and English (EN).
Because it is trained mostly on Turkish data, its conversational ability is stronger than its knowledge base.
The model is completely free.
The model is a preview of the full version that will be released in the summer of 2025!
It is available on Hugging Face.
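For illustration, fetching the files with the huggingface_hub client could look like the sketch below; the repository id is a placeholder, not the model's actual path.

```python
# A minimal sketch, assuming the huggingface_hub package is installed;
# the repo_id below is hypothetical, not the model's real repository.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="BrtAI/Br-T-3.preview")  # hypothetical repo id
print(local_dir)  # local folder containing the downloaded model files
```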
The model does not use a Transformer architecture.
It is very good at learning sequential dependencies.
Do not use for: violence, sexual content, self-harm, terrorism, etc. #AI
-BrtAI- Bertuğ Günel Türkiye/Eskişehir 02/15/2025