kk (kk3dmax)
AI & ML interests: None yet
Recent Activity
liked a model 5 days ago: Skywork/Skywork-OR1-32B-Preview
liked a model 22 days ago: alibaba-pai/Wan2.1-Fun-14B-Control
liked a model 24 days ago: bartowski/deepseek-ai_DeepSeek-V3-0324-GGUF
Organizations: None yet
kk3dmax's activity
The hyper-FLUX.1-dev-8steps-lora cannot be used in diffusers
3 · 1 · #74 opened 4 months ago by LHJ0
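For context on the thread above, here is a minimal sketch of how an 8-step Hyper LoRA is typically attached to FLUX with diffusers. The base model id, LoRA repo, weight filename, and fuse scale are assumptions drawn from the thread title and common Hyper-SD usage, not confirmed from the thread itself.

    import torch
    from diffusers import FluxPipeline

    # Base FLUX.1-dev pipeline (assumed base model for this LoRA).
    pipe = FluxPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
    )

    # Attach the 8-step Hyper LoRA; repo and filename are assumptions.
    pipe.load_lora_weights(
        "ByteDance/Hyper-SD",
        weight_name="Hyper-FLUX.1-dev-8steps-lora.safetensors",
    )
    pipe.fuse_lora(lora_scale=0.125)  # commonly suggested scale, also an assumption
    pipe.to("cuda")

    # The point of the LoRA is to make 8-step sampling usable.
    image = pipe(
        "a photo of a cat", num_inference_steps=8, guidance_scale=3.5
    ).images[0]
    image.save("cat.png")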
Diffusers missing config.json file
13 · #3 opened 8 months ago by jspaun
Crash while loading tokenizer
5 · #1 opened 7 months ago by legraphista

Will there be a 2048-token-length version?
4 · #7 opened over 1 year ago by kk3dmax
Sometimes it spits out training-set text.
10 · #2 opened over 1 year ago by kk3dmax
What is the ETA for Mistral-embed, and what is the max token length of this embedding model?
3 · #8 opened over 1 year ago by kk3dmax
Is there an ETA for the large version?
1 · 6 · #31 opened over 1 year ago by kk3dmax
Is there a way to support token lengths above 512?
7 · #5 opened over 1 year ago by kk3dmax
TypeError: Pooling.__init__() got an unexpected keyword argument 'pooling_mode_weightedmean_tokens'
2 · #4 opened over 1 year ago by kk3dmax
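The TypeError above usually means the installed sentence-transformers predates the weighted-mean pooling mode, so models saved with that option fail to load; upgrading (pip install -U sentence-transformers) is the usual fix. A minimal sketch of the construction that newer releases accept; the embedding dimension here is a placeholder.

    from sentence_transformers import models

    # Requires a sentence-transformers release that knows this kwarg;
    # older versions raise the TypeError from the thread title.
    pooling = models.Pooling(
        word_embedding_dimension=1024,  # placeholder dimension
        pooling_mode_mean_tokens=False,
        pooling_mode_weightedmean_tokens=True,
    )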
How to set the max length from 512 to 2048?
3 · #11 opened over 1 year ago by kk3dmax
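A sketch of the usual knob for this, assuming a Sentence Transformers model: max_seq_length controls input truncation, but raising it past what the backbone's position embeddings cover adds no real capacity. The model id below is a placeholder.

    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("intfloat/e5-large-v2")  # placeholder model id
    print(model.max_seq_length)  # typically 512 for BERT-style encoders

    # Raise the truncation limit; only meaningful if the underlying
    # model's position embeddings actually cover 2048 tokens.
    model.max_seq_length = 2048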
ModuleNotFoundError: No module named 'transformers_modules.TheBloke.WizardLM-33B-V1'
4 · #2 opened almost 2 years ago by kk3dmax
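This ModuleNotFoundError typically comes from a stale copy of remote code cached under transformers_modules. A common remedy is sketched below under two assumptions: the cache path is the default location (different if HF_HOME or HF_MODULES_CACHE is set), and the repo id is a placeholder inferred from the module name in the error.

    import shutil
    from pathlib import Path
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Drop the cached dynamic modules so they are re-fetched on load.
    # Default cache location; an assumption if HF_HOME is customized.
    cache = Path.home() / ".cache" / "huggingface" / "modules" / "transformers_modules"
    shutil.rmtree(cache, ignore_errors=True)

    # Placeholder repo id; substitute the actual model repository.
    model_id = "TheBloke/WizardLM-33B-V1"
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)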