---
license: apache-2.0
datasets:
- open-r1/OpenR1-Math-220k
- FreedomIntelligence/medical-o1-reasoning-SFT
language:
- en
- ar
metrics:
- accuracy
library_name: transformers
new_version: moelanoby/kok-baseV2
---
This model uses a custom architecture and is being built entirely from scratch.
Because the architecture is custom, you need to use the following Python code to instantiate it:
```python
from transformers import AutoConfig, AutoModel

# Load the custom architecture's configuration from the Hub.
# Custom architectures hosted on the Hub may also need trust_remote_code=True.
config = AutoConfig.from_pretrained("moelanoby/kok-base")

# Instantiate the model from the config (weights are freshly initialized).
model = AutoModel.from_config(config)
```
The script above instantiates the model from its configuration.
You can fine-tune this model into an LLM.
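The fine-tuning step can be sketched as a plain PyTorch training loop. The `TinyBackbone` below is a hypothetical stand-in for the model instantiated above (the real architecture's inputs and output shapes may differ); the idea is to attach a language-modelling head and train on next-token prediction:

```python
import torch
from torch import nn

# Hypothetical stand-in for the model created with AutoModel.from_config above.
class TinyBackbone(nn.Module):
    def __init__(self, vocab_size=100, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, ids):
        # Returns hidden states of shape (batch, seq_len, dim).
        return self.proj(self.embed(ids))

backbone = TinyBackbone()
lm_head = nn.Linear(16, 100)  # maps hidden states to vocabulary logits
opt = torch.optim.AdamW(
    list(backbone.parameters()) + list(lm_head.parameters()), lr=1e-3
)

ids = torch.randint(0, 100, (2, 8))   # toy batch of token ids
labels = ids.roll(-1, dims=1)         # next-token targets

for _ in range(3):
    logits = lm_head(backbone(ids))   # (batch, seq_len, vocab)
    loss = nn.functional.cross_entropy(
        logits.view(-1, 100), labels.view(-1)
    )
    opt.zero_grad()
    loss.backward()
    opt.step()
```

For the real model you would replace `TinyBackbone` with the `model` loaded from the config and size the head to match its hidden dimension and your tokenizer's vocabulary.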
You can use it for any use case, no restrictions from me.
And while you're at it, you can support me on Buy Me a Coffee :D
http://www.buymeacoffee.com/Moelanoby
:>