Model Info:

Original model: BlinkDL/rwkv-6-world

You can run this model with ai00_server.

Although ai00_rwkv_server is mainly for lowend PC, you can run it on servers which are support VULKAN.

To try it in Colab:

You should install libnvidia-gl-* and vulkan driver:

!apt -y install libnvidia-gl-535 libvulkan1

The 7B model use about 14.7G VRAM. T4 is enough to load it.

One more thing:

These models are censored. You can start your prompts with [SDA] to jailbreak.

It's something like developer mode.

Downloads last month

-

Downloads are not tracked for this model. How to track
Inference Providers NEW
This model is not currently available via any of the supported third-party Inference Providers, and HF Inference API was unable to determine this model's library.