---
license: apache-2.0
tags:
- Composer
- MosaicML
- llm-foundry
- AnimusOG
- Oobabooga
- KoboldAI
- Text-Generation
- Conversational
- Uncensored
---

**THIS MODEL IS NOT FULLY FINISHED OR TESTED. PLEASE TAKE THIS INTO CONSIDERATION.**

# MPT-7B-StoryWriter-65k+

Quantized for [KoboldAI (4bit-fork)](https://github.com/0cc4m/koboldAI).

## How to Use

This model is meant to be used with the [oobabooga text-generation-webui](https://github.com/oobabooga/text-generation-webui).

Pass the following `webui.py` command flags when starting Oobabooga:

```
--trust-remote-code --model-type llama
```

MPT-7B-StoryWriter-65k+ can extrapolate even beyond 65k tokens.

## Model Date

May 15, 2023

## Model License

Apache-2.0 (commercial use permitted)

## Documentation

* [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-1btms90mc-GipE2ufuPkKY0QBrmF3LSA)!

## Citation

Please cite this model using the following format:

```
@online{MosaicML2023Introducing,
    author  = {MosaicML NLP Team},
    title   = {Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs},
    year    = {2023},
    url     = {www.mosaicml.com/blog/mpt-7b},
    note    = {Accessed: 2023-03-28}, % change this date
    urldate = {2023-03-28}  % change this date
}
```
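A full launch command might look like the sketch below. This is a non-authoritative example: the directory names and the model folder name (`mpt-7b-storywriter-4bit`) are assumptions, so adjust them to match your own installation and the folder you downloaded this model into under `models/`.

```shell
# Sketch only; paths and model folder name are assumptions.
cd text-generation-webui   # your local clone of the oobabooga webui

python webui.py \
  --trust-remote-code \
  --model-type llama \
  --model mpt-7b-storywriter-4bit   # folder name under models/ (hypothetical)
```

`--trust-remote-code` is required because MPT models ship custom modeling code that the loader must be allowed to execute.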