### Talk-Tuah-1

Talk-Tuah-1 is an 80-million-parameter GPT trained on all of Hailey Welch's inspirational podcast 'Talk Tuah'. This SOTA frontier model was trained on 13 hours of 'Talk Tuah' for ~30 minutes on a single A100. The rationale was that the discourse on the 'Talk Tuah' podcast is the most enlightened media any human has created.

It should therefore outperform every other LLM on every benchmark. With sufficient training and additional compute, Talk-Tuah-1 can outperform OpenAI's and Anthropic's flagship models, o3 and Sonnet.

The architecture was adapted from Andrej Karpathy's nanoGPT.
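
The exact hyperparameters of Talk-Tuah-1 are not stated here, but a rough back-of-the-envelope sketch shows how a nanoGPT-style decoder lands near 80M parameters. The config values below (GPT-2-small dimensions with a small character-level vocabulary, as in nanoGPT's character-level examples) are assumptions for illustration, not the model's actual settings.

```python
# Hypothetical sketch: rough parameter count for a nanoGPT-style GPT.
# All hyperparameter values below are assumptions, not Talk-Tuah-1's
# actual config.

def gpt_params(n_layer: int, n_embd: int, vocab_size: int, block_size: int) -> int:
    """Approximate parameter count of a GPT-2-style decoder.

    Each transformer block contributes ~12 * n_embd^2 weights:
    4 * n_embd^2 for attention (Q, K, V, output projection) and
    8 * n_embd^2 for the 4x-wide MLP. Biases and LayerNorm gains
    are ignored for this rough estimate.
    """
    per_block = 12 * n_embd ** 2
    # Token embedding table plus learned positional embeddings.
    embeddings = vocab_size * n_embd + block_size * n_embd
    return n_layer * per_block + embeddings

# GPT-2-small-shaped model with a character-level vocabulary.
total = gpt_params(n_layer=12, n_embd=768, vocab_size=65, block_size=1024)
print(f"{total / 1e6:.1f}M parameters")  # → 85.8M parameters
```

With these assumed dimensions the estimate comes out around 85M, in the same ballpark as the stated 80 million; shaving a couple of layers or narrowing the embedding would bring it under 80M exactly.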