The original model is here. A tagger for local environments is here.

# Recipe

```python
from transformers import AutoModelForCausalLM, AutoProcessor, BitsAndBytesConfig
import torch
import json

model_id = 'gokaygokay/Florence-2-SD3-Captioner'
save_path = 'gokaygokay-Florence-2-SD3-Captioner-8bit'

# Load the processor and quantize the model to 8-bit with bitsandbytes,
# keeping lm_head unquantized and allowing fp32 CPU offload.
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.float32,
    low_cpu_mem_usage=True,
    quantization_config=BitsAndBytesConfig(
        load_in_8bit=True,
        llm_int8_threshold=6.0,
        llm_int8_enable_fp32_cpu_offload=True,
        llm_int8_skip_modules=['lm_head'],
    ),
)

# Save the quantized checkpoint and its processor as safetensors.
processor.save_pretrained(save_path)
model.save_pretrained(save_path, safe_serialization=True)

# Restore the vision tower's model_type to 'davit', which can be lost
# when save_pretrained re-serializes config.json.
with open(f'{save_path}/config.json') as f:
    config = json.load(f)
config['vision_config']['model_type'] = 'davit'
with open(f'{save_path}/config.json', 'w') as f:
    json.dump(config, f, indent=2)
```
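Once saved, the checkpoint can be loaded back for captioning. The following is a minimal sketch, not part of the original recipe: it assumes bitsandbytes and a CUDA device are available so the 8-bit weights reload from the saved config, and it assumes the '<DESCRIPTION>' task prompt used by the original captioner; the image URL is only a placeholder.

```python
from transformers import AutoModelForCausalLM, AutoProcessor
from PIL import Image
import requests
import torch

save_path = 'gokaygokay-Florence-2-SD3-Captioner-8bit'

# The quantization settings are read from the saved config.json;
# bitsandbytes and a GPU are required for the 8-bit weights to load.
processor = AutoProcessor.from_pretrained(save_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(save_path, trust_remote_code=True)

# Placeholder image; any RGB image works.
url = 'https://example.com/image.jpg'
image = Image.open(requests.get(url, stream=True).raw).convert('RGB')

# Assumed task prompt for the SD3 captioner; adjust if your model uses another.
prompt = '<DESCRIPTION>Describe this image in great detail.'
inputs = processor(text=prompt, images=image, return_tensors='pt').to(model.device)

with torch.no_grad():
    generated_ids = model.generate(
        input_ids=inputs['input_ids'],
        pixel_values=inputs['pixel_values'],
        max_new_tokens=256,
        num_beams=3,
    )
caption = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(caption)
```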
Model size: 271M parameters (safetensors), tensor types F32 / I8.