I’ve been diving into the iRoPE architecture from Llama 4—a game-changer for long-context models! It interleaves local attention layers (with RoPE) for short-range context with global attention layers (no positional embedding, plus inference-time temperature scaling) for long-range reasoning, aiming for effectively unbounded context. I’m going to try writing iRoPE—who wants to help?
Code: https://github.com/wassemgtk/iRoPE-try/blob/main/iRoPE.ipynb
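To make the interleaving idea concrete, here is a minimal NumPy sketch of the two layer types: a local layer that applies RoPE and restricts attention to a causal window, and a global layer that attends over the full causal context with no positional embedding, using a softmax temperature knob at inference. The window size, temperature handling, and function names are my own assumptions for illustration, not the actual Llama 4 implementation (which, e.g., interleaves these layers at a fixed ratio and ties the temperature to context length).

```python
import numpy as np

def rope(x, base=10000.0):
    # Rotary position embedding: rotate feature pairs by position-dependent angles.
    seq, dim = x.shape
    half = dim // 2
    freqs = 1.0 / (base ** (np.arange(half) / half))
    angles = np.outer(np.arange(seq), freqs)          # (seq, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention(q, k, v, mask, temp=1.0):
    # Scaled dot-product attention; temp > 1 flattens the softmax.
    d = q.shape[-1]
    scores = (q @ k.T) / (np.sqrt(d) * temp)
    scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ v

def local_layer(q, k, v, window=4):
    # Local attention WITH RoPE: causal, limited to a sliding window.
    seq = q.shape[0]
    pos = np.arange(seq)
    mask = (pos[None, :] <= pos[:, None]) & (pos[:, None] - pos[None, :] < window)
    return attention(rope(q), rope(k), v, mask)

def global_layer(q, k, v, temp=1.0):
    # Global attention WITHOUT positional embedding (NoPE), full causal context.
    # temp would be raised at inference for very long sequences (assumed knob).
    seq = q.shape[0]
    pos = np.arange(seq)
    mask = pos[None, :] <= pos[:, None]
    return attention(q, k, v, mask, temp=temp)
```

A full model would stack these in an interleaved pattern (e.g. several local layers per global layer), which is the "i" in iRoPE; the sketch above only shows the two building blocks.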