---
language:
  - de
pipeline_tag: text-generation
tags:
  - bloom
  - lora
  - LLM
---

GitHub: https://github.com/abdullahalzubaer/bloom-6b4-clp-german-lora-inference

This LoRA adapter is from `asprenger/bloom-6b4-clp-german-instruct-lora`. Thanks for the adapter!

I thought I was uploading the complete bloom-6b4-clp-german model together with the adapter I got working, but after pushing I realized that only the adapter had been uploaded. I am still exploring how PEFT works with LoRA :)
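For anyone else surprised by this, here is a minimal sketch of why saving (or pushing) a `PeftModel` produces only the adapter files rather than the full model. The checkpoint and LoRA hyperparameters below are just illustrative, not the exact ones used for this adapter:

```python
# Minimal sketch (illustrative checkpoint and LoRA hyperparameters,
# not the exact ones used for this adapter).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

# Load a small BLOOM base model just for demonstration.
base = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

# Wrap it with a LoRA config; PEFT injects small trainable low-rank matrices
# into the attention layers while the original weights stay frozen.
config = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16, lora_dropout=0.05)
peft_model = get_peft_model(base, config)

# Saving (or pushing) the PeftModel writes only adapter_config.json and
# adapter_model.bin -- the LoRA matrices, not the base model weights.
peft_model.save_pretrained("my-lora-adapter")
```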

Strict requirement for PEFT:

```
peft==0.2.0
```

Installation:

```bash
pip install transformers accelerate bitsandbytes peft==0.2.0
```

The latest PEFT releases have breaking changes with bloom-6b4-clp-german and this LoRA adapter; the only way I know of to get the two working together is (I think) to retrain the base model or the adapter (I am not sure yet).
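With `peft==0.2.0` pinned, inference should look roughly like the sketch below. The base model repo ID and the prompt format are assumptions on my part; see the GitHub repo linked above for the actual inference code.

```python
# Rough inference sketch with peft==0.2.0 (repo IDs and prompt are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "malteos/bloom-6b4-clp-german"               # assumed base model repo
ADAPTER = "asprenger/bloom-6b4-clp-german-instruct-lora"  # the LoRA adapter

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_8bit=True,   # needs bitsandbytes; drop this to load in full precision
    device_map="auto",
)

# PeftModel.from_pretrained downloads only the adapter weights and injects
# them into the (frozen) base model.
model = PeftModel.from_pretrained(base, ADAPTER)
model.eval()

prompt = "Frage: Was ist die Hauptstadt von Deutschland?\nAntwort:"
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```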

Reference: