Scott Vachalek
svachalek
AI & ML interests
None yet
Recent Activity
liked a Space 28 days ago: tencent/Hunyuan3D-2
liked a Space about 2 months ago: black-forest-labs/FLUX.1-schnell
liked a model 2 months ago: strangerzonehf/Flux-Isometric-3D-LoRA
Organizations
None yet
svachalek's activity
What context size when using a 24GB VRAM card (4090) is best? (3)
#1 opened 3 months ago by clevnumb