A newer version of this model is available: moelanoby/kok-baseV2
This model is built on a custom architecture and is being trained entirely from scratch. Because the architecture is custom, you need to load it with the following Python code:
from transformers import AutoConfig, AutoModel

# Pull the custom architecture's config from the Hub, then build the model from it.
# Note: from_config initializes the weights from scratch; it does not download pretrained weights.
config = AutoConfig.from_pretrained("moelanoby/kok-base")
model = AutoModel.from_config(config)
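If loading fails because the architecture is not part of the transformers library itself, the custom modeling code shipped in the repository usually has to be trusted explicitly. This is a hedged variant, assuming the repo registers its classes for the Auto* API; drop trust_remote_code if the plain snippet above already works:

from transformers import AutoConfig, AutoModel

# Assumption: the custom architecture's code lives in the model repository and is
# registered for the Auto* classes; trust_remote_code lets transformers import and run it.
config = AutoConfig.from_pretrained("moelanoby/kok-base", trust_remote_code=True)
model = AutoModel.from_config(config, trust_remote_code=True)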
The script above is the correct way to instantiate the model. You can fine-tune it into an LLM (a minimal sketch follows below) and use it for any use case you like, I won't mind. And while you're at it, you can support me on Buy Me a Coffee :D
http://www.buymeacoffee.com/Moelanoby
:>
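Since the card says the model can be fine-tuned into an LLM, here is a minimal, hedged training-loop sketch. It assumes the custom model follows the usual transformers convention of accepting input_ids and labels and returning an output with a .loss attribute, and that the config exposes vocab_size; check the repository's modeling code for the real forward signature and replace the placeholder batch with your own tokenized data.

import torch
from torch.optim import AdamW
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("moelanoby/kok-base")
model = AutoModel.from_config(config)
model.train()

optimizer = AdamW(model.parameters(), lr=5e-5)

# Placeholder batch of random token IDs; assumes config.vocab_size exists.
# Replace with real batches from your own tokenizer and dataset.
batch = {
    "input_ids": torch.randint(0, config.vocab_size, (2, 16)),
    "labels": torch.randint(0, config.vocab_size, (2, 16)),
}

for step in range(3):  # a few dummy steps, just to show the loop shape
    outputs = model(**batch)  # assumption: forward returns an object with a .loss attribute
    loss = outputs.loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"step {step}: loss = {loss.item():.4f}")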