
Waseem AlShikh

wassemgtk

AI & ML interests

Multi-modal, Palmyra LLMs, Knowledge Graph

Organizations

Writer · Social Post Explorers

Posts 5

I’ve been diving into the iRoPE architecture from Llama 4—a game-changer for long-context models! It interleaves local attention layers (with RoPE) for short-range context and global attention layers (no positional encoding, with inference-time temperature scaling) for long-range reasoning, aiming toward unbounded context length. I’m going to try writing iRoPE—who wants to help?

Code: https://github.com/wassemgtk/iRoPE-try/blob/main/iRoPE.ipynb
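To make the interleaving concrete, here is a minimal NumPy sketch of the idea as described above: RoPE-equipped local (windowed, causal) attention layers alternating with global attention layers that skip positional encoding and accept an inference-time temperature on the query logits. All names, the 3:1 interleave ratio, and the single-head shapes are illustrative assumptions for this sketch, not the Llama 4 implementation.

```python
import numpy as np

def rope(x, base=10000.0):
    # Rotary position embedding on x of shape (seq, dim); dim must be even.
    seq, dim = x.shape
    pos = np.arange(seq)[:, None]                        # (seq, 1)
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)     # (dim/2,)
    ang = pos * inv_freq                                 # (seq, dim/2)
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

def softmax(a):
    a = a - a.max(axis=-1, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=-1, keepdims=True)

def local_attention(q, k, v, window=4):
    # RoPE'd attention restricted to a causal window of `window` tokens.
    q, k = rope(q), rope(k)
    seq, dim = q.shape
    scores = q @ k.T / np.sqrt(dim)
    i = np.arange(seq)
    blocked = (i[None, :] > i[:, None]) | (i[:, None] - i[None, :] >= window)
    scores[blocked] = -np.inf
    return softmax(scores) @ v

def global_attention(q, k, v, temp=1.0):
    # NoPE global attention: causal mask only, no positional encoding.
    # `temp` scales query logits at inference to sharpen long-context attention.
    seq, dim = q.shape
    scores = (q * temp) @ k.T / np.sqrt(dim)
    i = np.arange(seq)
    scores[i[None, :] > i[:, None]] = -np.inf
    return softmax(scores) @ v

def irope_block(x, Wq, Wk, Wv, layer_idx, local_ratio=3, window=4, temp=1.0):
    # Interleave: `local_ratio` local (RoPE) layers per one global (NoPE) layer.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    if (layer_idx + 1) % (local_ratio + 1) == 0:
        return x + global_attention(q, k, v, temp=temp)
    return x + local_attention(q, k, v, window=window)
```

Usage: call `irope_block` per layer with an increasing `layer_idx`; with `local_ratio=3`, layers 3, 7, 11, … run global NoPE attention and the rest run windowed RoPE attention.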

Articles 1


Leveraging Hugging Face for complex generative AI use cases