---
library_name: peft
---

## Introduction

Krakowiak-7B is a fine-tuned version of Meta's Llama 2. It was trained on a modified and updated dataset originally created by Chris Ociepa, containing ~50K instructions, making it one of the biggest and best openly available Polish LLMs. The name Krakowiak refers to one of the most popular and characteristic Polish folk dances, known for its lively, even wild tempo and long, easy strides, combining spirited abandon with elegance.

## How to test it?

The model can be run using the Hugging Face transformers and peft libraries, or in the browser using this Google Colab.

## Training procedure

The following bitsandbytes quantization config was used during training:

- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16

## Framework versions

- PEFT 0.4.0