Younes Belkada’s Post


Machine Learning Engineer

Did you know that you can use the Hugging Face PEFT library to inject LoRA/AdaLoRA/IA3 into any PyTorch module? Simply call `inject_adapter_in_model`, passing the corresponding PEFT config and the model. As simple as that! Read more about the "Low Level API" of PEFT in the dedicated section of the documentation: https://lnkd.in/eGQPKGhV

Byamasu Patrick Paul

Founder | AI and ML Engineer | MSc

10mo

I have never tried this before; I can't wait to try it and have a LoRA adapter injected into a PyTorch module.

Is it possible to target layers by name? For instance, if I want to target only the query and value linear layers. Or is it something like "monkey patching", in the sense that it replaces all linear layers with LoRA variants?


That's cool, Younes! Any chance PEFT will support JAX? ;)

Leon Lahoud

Cloud Architect - MBA - Deep Learner and aspiring writer

10mo

Can’t wait to try this
