Why does this take so long to load?
Every time I try to run it, it spends so long loading the model that it times out.
If you're referring to the Hosted Inference API, that's because no GPU is provisioned for this model and the model is huge, so loading takes a very long time.
If you want to run it, you need to download the model and run it on your own hardware, sorry :(
Here are some guidelines for running it: https://huggingface.co/bigscience/bloomz/discussions/18#636b6ad958a8f9348d0ab82c
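For reference, a minimal sketch of what running it locally might look like with the `transformers` library, assuming you have the weights downloaded and enough GPU/CPU memory (bloomz is ~176B parameters, so the checkpoint is hundreds of GB); the `device_map="auto"` sharding requires `accelerate` to be installed, and the prompt here is just an illustrative example:

```python
# Hypothetical sketch: loading bigscience/bloomz locally with transformers.
# NOTE: this downloads/loads an enormous checkpoint; it is only feasible on
# hardware with very large amounts of memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "bigscience/bloomz"  # ~176B parameters


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model lazily and generate a completion for `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL,
        device_map="auto",   # shard layers across available GPUs/CPU (needs accelerate)
        torch_dtype="auto",  # keep the checkpoint's native dtype
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt only; any instruction-style input works with bloomz.
    print(generate("Translate to English: Je t'aime."))
```

The model load is deferred into the function so importing the script stays cheap; the actual load is the expensive step the thread above is describing.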
We should probably disable the widget then, as it may be confusing.
Ah, got it.
Thanks. I was just attempting to try it out. If it does what I think it does, then it could be just as good if not better than GPT-3.
Nice work, I love open source.
Even if I can’t run it :(
FYI removed the widget to prevent more confusion about this: https://huggingface.co/bigscience/bloomz-p3/commit/51f3d0d7079a37501554eb7ce2558012bb96d062