lamm-mit/Llama-3.2-3B-Instruct-Sparse-GIN-orca-math-word-problems
We present an approach to modifying Transformer architectures by integrating graph-aware relational reasoning into the attention mechanism.
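As a rough illustration of this idea (not the released implementation), the sketch below treats an attention map as a sparsified adjacency matrix over the token sequence and applies a GIN-style update to the hidden states. The module name `SparseGINAdapter`, the head averaging, the sparsification threshold, and the MLP shape are all assumptions made for illustration only.

```python
# Minimal sketch, assuming attention scores can be read out per layer.
# This is NOT the model's actual code; names and hyperparameters are hypothetical.
import torch
import torch.nn as nn


class SparseGINAdapter(nn.Module):
    """Hypothetical adapter: GIN update over a graph derived from attention scores."""

    def __init__(self, hidden_dim: int, eps: float = 0.0, threshold: float = 0.1):
        super().__init__()
        self.eps = nn.Parameter(torch.tensor(eps))  # learnable GIN epsilon
        self.threshold = threshold                  # sparsification cutoff (assumed value)
        self.mlp = nn.Sequential(                   # GIN update MLP (assumed shape)
            nn.Linear(hidden_dim, hidden_dim),
            nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, hidden: torch.Tensor, attn: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, hidden_dim); attn: (batch, heads, seq, seq)
        adj = attn.mean(dim=1)  # average over heads -> (batch, seq, seq) adjacency
        adj = torch.where(adj > self.threshold, adj, torch.zeros_like(adj))  # sparsify
        neighbors = torch.bmm(adj, hidden)  # aggregate neighbor token features
        # Standard GIN node update with a residual connection back to the Transformer stream
        return hidden + self.mlp((1 + self.eps) * hidden + neighbors)
```

In this reading, each token is a node, attention weights above a threshold define weighted edges, and the GIN update injects an explicit relational-reasoning step alongside the usual attention output.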