Z Li

csroyli
AI & ML interests

None yet

Recent Activity

replied to SeanLee97's post about 9 hours ago
Our lab recently released a paper introducing ShadowPEFT, a new Parameter-Efficient Fine-Tuning (PEFT) paradigm tailored for edge computing scenarios. Unlike traditional approaches such as LoRA and its variants, which inject trainable parameters directly into the Transformer's weights and therefore couple tightly with the backbone, ShadowPEFT enhances the frozen large base model by adding a lightweight, centralized, pretrainable, and detachable shadow network. This shadow network operates in parallel with the base model, delivering learned corrections to each decoder layer. Because the shadow module is architecturally decoupled from the backbone, it can be independently trained, stored, and deployed, which benefits edge computing and edge-cloud collaborative computing.
- HF Paper: https://huggingface.co/papers/2604.19254
- GitHub: https://github.com/ShadowLLM/shadow-peft
- HF Collection: https://huggingface.co/collections/shadow-llm/shadow-peft-models
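To make the idea concrete, here is a minimal PyTorch sketch of the architecture the post describes: a frozen backbone whose decoder layers each receive an additive correction from a separate, independently trainable shadow module. All class and parameter names (`ShadowNetwork`, `ShadowedDecoder`, `shadow_dim`) are hypothetical illustrations, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class ShadowNetwork(nn.Module):
    """Hypothetical lightweight shadow module: one small bottleneck MLP
    per decoder layer, held in a single detachable container."""
    def __init__(self, hidden_dim: int, num_layers: int, shadow_dim: int = 16):
        super().__init__()
        self.correctors = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_dim, shadow_dim),
                nn.ReLU(),
                nn.Linear(shadow_dim, hidden_dim),
            )
            for _ in range(num_layers)
        )

    def forward(self, layer_idx: int, hidden: torch.Tensor) -> torch.Tensor:
        # Correction for one decoder layer, computed from its input state.
        return self.correctors[layer_idx](hidden)

class ShadowedDecoder(nn.Module):
    """Frozen base decoder stack plus a parallel, trainable shadow network."""
    def __init__(self, base_layers: nn.ModuleList, shadow: ShadowNetwork):
        super().__init__()
        self.base_layers = base_layers
        self.shadow = shadow
        for p in self.base_layers.parameters():
            p.requires_grad_(False)  # backbone stays frozen

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for i, layer in enumerate(self.base_layers):
            # Base layer and shadow corrector both read the same input,
            # so the shadow runs in parallel rather than inside the layer.
            x = layer(x) + self.shadow(i, x)
        return x
```

Because only `self.shadow` holds trainable parameters, the shadow's state dict can be saved, shipped, and attached on its own, which is the decoupling property the post highlights for edge-cloud deployment.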
commented on a paper about 9 hours ago
ShadowPEFT: Shadow Network for Parameter-Efficient Fine-Tuning
commented on a paper 1 day ago
ShadowPEFT: Shadow Network for Parameter-Efficient Fine-Tuning

Organizations

WhereIsAI · Apocalypse-AGI-DAO · ShadowLLM

csroyli's datasets

None public yet