Br-T-3-pre is a preview model published in late January to early February 2025.

The model contains 50 LSTM and 50 GRU layers.

The model contains 512,000 LSTM neurons and 256,000 GRU neurons, for a total of 768,000 neurons.
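To illustrate what one of the GRU units in this architecture computes, here is a minimal scalar GRU cell in plain Python. This is a generic sketch of the standard GRU equations, not the model's actual implementation; the weight values are arbitrary placeholders.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_cell(x, h, p):
    """One scalar GRU step; p holds illustrative weights and biases."""
    z = sigmoid(p["wz"] * x + p["uz"] * h + p["bz"])                 # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h + p["br"])                 # reset gate
    h_tilde = math.tanh(p["wh"] * x + p["uh"] * (r * h) + p["bh"])   # candidate state
    return (1.0 - z) * h + z * h_tilde                               # blended new state

# Placeholder parameters, chosen only for demonstration.
params = {"wz": 0.5, "uz": 0.1, "bz": 0.0,
          "wr": 0.4, "ur": 0.2, "br": 0.0,
          "wh": 0.9, "uh": 0.3, "bh": 0.0}

# Run a short input sequence through the cell, carrying the hidden state forward.
h = 0.0
for x in [1.0, -0.5, 0.25]:
    h = gru_cell(x, h, params)
```

The hidden state `h` is passed from step to step, which is how recurrent layers like these carry context through a sequence.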

The model's MMLU result has not been published yet.

The model is trained on three languages: Turkish (TR), German (DE), and English (EN).

It is trained mostly on Turkish data; its conversational ability is stronger than its factual knowledge base.

The model is completely free.

The model is a preview of the full version that will be released in the summer of 2025!

It is available on Hugging Face.

The model does not use a Transformer architecture.

It is very good at learning sequential dependencies.
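Since the model relies on LSTM units rather than attention, a minimal scalar LSTM cell shows how sequential dependencies are carried: the cell state `c` acts as a memory that gates decide to keep, overwrite, or expose. This is a generic sketch of the standard LSTM equations with placeholder weights, not the model's actual code.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell(x, h, c, p):
    """One scalar LSTM step; the cell state c carries long-range information."""
    f = sigmoid(p["wf"] * x + p["uf"] * h + p["bf"])          # forget gate
    i = sigmoid(p["wi"] * x + p["ui"] * h + p["bi"])          # input gate
    o = sigmoid(p["wo"] * x + p["uo"] * h + p["bo"])          # output gate
    c_tilde = math.tanh(p["wc"] * x + p["uc"] * h + p["bc"])  # candidate memory
    c_new = f * c + i * c_tilde                               # gated memory update
    h_new = o * math.tanh(c_new)                              # exposed hidden state
    return h_new, c_new

# Placeholder parameters for demonstration only.
params = {k: 0.5 for k in ["wf", "uf", "wi", "ui", "wo", "uo", "wc", "uc"]}
params.update({"bf": 0.0, "bi": 0.0, "bo": 0.0, "bc": 0.0})

# Carry (h, c) across a short sequence; earlier inputs influence later states.
h, c = 0.0, 0.0
for x in [1.0, 0.0, -1.0, 0.5]:
    h, c = lstm_cell(x, h, c, params)
```

Because the gates are learned, a stack of such cells can decide which parts of earlier input to remember, which is what makes this family of models effective at sequential dependencies.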

Do not use for: violence, sexual content, self-harm, terrorism, etc. #AI

-BrtAI- Bertuğ Günel Türkiye/Eskişehir 02/15/2025


Dataset used to train Bertug1911/Br-T-3-pre