elijah0528 committed
Update README.md
### Talk-Tuah-1
Talk-Tuah-1 is an 80 million parameter GPT trained on all of Hailey Welch's inspirational podcast 'Talk Tuah'. This SOTA frontier model was trained on 13 hours of 'Talk Tuah' on an A100 for ~30 minutes. The rationale: the discourse in the 'Talk Tuah' podcast is the most enlightened media any human has ever created.
Therefore, it should outperform any other LLM on any benchmark. With sufficient training and additional compute, Talk-Tuah-1 can outperform OpenAI's and Anthropic's flagship models, o3 and Sonnet.
The architecture was adapted from Andrej Karpathy's nanoGPT.
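For a sense of scale, a nanoGPT-style configuration can land near the stated 80 million parameters. The README does not list the actual hyperparameters, so the values below (layers, heads, embedding width) are illustrative assumptions, not the ones Talk-Tuah-1 was trained with:

```python
# Hypothetical nanoGPT-style config. These hyperparameters are assumptions
# chosen only so the rough total lands near ~80M parameters.
from dataclasses import dataclass

@dataclass
class GPTConfig:
    block_size: int = 1024    # context length
    vocab_size: int = 50257   # GPT-2 BPE vocabulary (nanoGPT's default)
    n_layer: int = 10
    n_head: int = 10
    n_embd: int = 640

def approx_params(cfg: GPTConfig) -> int:
    """Rough parameter count for a decoder-only transformer of this shape."""
    tok_emb = cfg.vocab_size * cfg.n_embd
    pos_emb = cfg.block_size * cfg.n_embd
    # Per block: attention (qkv + output proj) ~ 4*d^2, MLP (4d up + 4d down) ~ 8*d^2,
    # ignoring biases and layer norms, which contribute comparatively little.
    per_block = 12 * cfg.n_embd ** 2
    return tok_emb + pos_emb + cfg.n_layer * per_block

print(f"~{approx_params(GPTConfig()) / 1e6:.1f}M parameters")  # ~82.0M
```

Most of a model this small is the token embedding table (~32M of the ~82M here), which is why the non-embedding transformer stack is considerably leaner than the headline number suggests.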