- CADCrafter: Generating Computer-Aided Design Models from Unconstrained Images
  Paper • 2504.04753 • Published • 1
- Text2CAD: Generating Sequential CAD Models from Beginner-to-Expert Level Text Prompts
  Paper • 2409.17106 • Published • 5
- Neural Kernel Surface Reconstruction
  Paper • 2305.19590 • Published
- FlexiDreamer: Single Image-to-3D Generation with FlexiCubes
  Paper • 2404.00987 • Published • 23
BarryAdams
ChessWarrior
AI & ML interests
None yet
Recent Activity
- liked a model 25 days ago: prithivMLmods/Nanbeige4.1-3B-f32-GGUF
- liked a model 26 days ago: Nanbeige/Nanbeige4.1-3B
- reacted to marksverdhei's post with 👍 about 1 month ago
Poll: Will 2026 be the year of subquadratic attention?
The transformer architecture is cursed by its computational complexity.
It is why you run out of tokens and have to compact. But some would argue that this is a feature, not a bug, and that it is also why these models are so good. We've been doing a lot of research on trying to make equally good models that are computationally cheaper, but so far none of the approaches has stood the test of time. Or so it seems.
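The quadratic complexity the post refers to comes from the attention score matrix: every query position attends to every key position. A minimal NumPy sketch of single-head scaled dot-product attention (illustrative names and sizes, not any particular model's implementation) makes the n × n term visible:

```python
import numpy as np

def sdp_attention(Q, K, V):
    """Scaled dot-product attention for a single head."""
    # scores is an (n, n) matrix: every query attends to every key,
    # so time and memory grow quadratically with sequence length n.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

n, d = 8, 4  # tiny illustrative sizes
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, n, d))
out, weights = sdp_attention(Q, K, V)
# out is (n, d); weights is (n, n), the quadratic part
```

Doubling the context length quadruples the size of `weights`, which is why subquadratic alternatives (linear attention, state-space models, sliding windows) keep being proposed.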
Please vote, don't be shy. Remember that the Dunning-Kruger effect is very real, so the person who knows less about transformers than you is going to vote. We want everyone's opinion, no matter your confidence.
👍 if you think at least one frontier model* will have no O(n^2) attention by the end of 2026
🔥 if you disagree
* Frontier models: models that match or outperform the flagship Claude, Gemini, or ChatGPT at the time on multiple popular benchmarks

Organizations
None yet