Can someone please help me understand how I can use this model?

#15
by arkaprovob - opened

I would like to download the model file named "gpt4-x-alpaca-30b-ggml-q5_1.bin" to my computer, use Python to make queries against it, and then train the model on my own data. I am new to the field of large language models (LLMs) and don't have any previous experience. Can someone please guide me on how to get started with these requirements?

I understand this is probably not the right place to ask, but right now I feel helpless, having tried every possible way to find an answer to my questions.

https://github.com/oobabooga/text-generation-webui

Follow the instructions there and it will work, but you must account for the hardware requirements: this model needs 32 GB+ of RAM to run on CPU, or roughly 23 GB of VRAM to run on a GPU.
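If you would rather query the model directly from Python instead of going through the webui, one option is the llama-cpp-python bindings. The sketch below is only a rough example under a few assumptions: a llama-cpp-python release old enough to still load GGML q5_1 files (recent versions only accept GGUF), the .bin file sitting in the working directory, and the Alpaca-style instruction prompt this model family usually expects.

```python
# Minimal sketch: querying a local GGML model with llama-cpp-python.
# Assumes an older llama-cpp-python that still loads GGML q5_1 files
# (newer releases only read GGUF) and the .bin in the current directory.
from llama_cpp import Llama

llm = Llama(
    model_path="gpt4-x-alpaca-30b-ggml-q5_1.bin",
    n_ctx=2048,    # context window
    n_threads=8,   # adjust to your CPU
)

prompt = (
    "### Instruction:\n"
    "Explain what a large language model is in two sentences.\n\n"
    "### Response:\n"
)

out = llm(prompt, max_tokens=200, stop=["### Instruction:"], echo=False)
print(out["choices"][0]["text"].strip())
```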

@fblgit Thank you for your reply. I am aware of oobabooga and have been experimenting with it for the past couple of days. My plan is to write a program that uses this model, add more domain-specific knowledge to it through additional training data, query it (customizing it for my own needs), and share it with others if it works out.
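One caveat on the training part, in case it is useful: the quantized GGML .bin itself cannot be fine-tuned with the usual Hugging Face tooling. The common workflow is to fine-tune the full-precision base checkpoint (for example with a LoRA adapter via PEFT) and then re-convert and re-quantize it to GGML. Below is only a rough sketch of that LoRA step; the base-model path, dataset file, and hyperparameters are placeholders, and 8-bit loading assumes bitsandbytes is installed.

```python
# Rough sketch: LoRA fine-tuning of the full-precision base model with PEFT.
# The GGML q5_1 file itself is not trainable; train the base checkpoint,
# then merge the adapter and re-quantize. All paths and hyperparameters
# below are placeholders.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from datasets import load_dataset

base = "path/to/gpt4-x-alpaca-base"  # placeholder: the fp16 checkpoint this GGML was quantized from

tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # LLaMA tokenizers ship without a pad token

model = AutoModelForCausalLM.from_pretrained(base, load_in_8bit=True, device_map="auto")
model = prepare_model_for_kbit_training(model)
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # typical LLaMA attention projections
    task_type="CAUSAL_LM",
))

# Expects a JSON lines file with a "text" field holding your domain examples.
data = load_dataset("json", data_files="my_domain_data.jsonl")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512))

trainer = Trainer(
    model=model,
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    args=TrainingArguments(
        output_dir="lora-out",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        fp16=True,
        logging_steps=10,
    ),
)
trainer.train()
model.save_pretrained("lora-out")  # saves only the small LoRA adapter
```

Even with LoRA and 8-bit loading, a 30B model is heavy to train; many people validate their data pipeline on a smaller base (7B or 13B) first.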

arkaprovob changed discussion status to closed
