The NLP Course is becoming the LLM Course!

Education has always been at the heart of Hugging Face’s mission to democratize AI, and we’re doubling down on that by giving hf.co/learn a big upgrade! Our NLP course has been a go-to resource for the open-source AI community for the past three years, and it’s now time for a refresh. We’re updating and expanding it to keep up with everything exciting happening in AI (which is not easy when there are breakthroughs every week!).
We felt the excitement during the experimental smol-course and the massive agents course, where 100k students registered to learn about AI agents in a new, fun, and interactive way!
Over the last few months we’ve expanded the NLP course with new chapters, including fine-tuning LLMs and building reasoning models like DeepSeek R1. These newer chapters don’t fit under the banner of “NLP”, so we searched for a more relevant, modern title and landed on the LLM Course.
What’s going to happen to the NLP course material?
We will maintain the existing material that focuses on classic NLP tasks like classification, named entity recognition, and retrieval. These topics are important because:
- We don’t need LLMs for everything!
- Students still benefit from these simpler tasks that run locally and are easy to interpret.
In fact, over the coming months we will update and modernize these classic chapters to include approaches like Sentence Transformers, updates in zero-shot classification, and ModernBERT.
Will there be new chapters?
Yes. We are adding new chapters that make state-of-the-art research accessible to a broader audience and compatible with tools like transformers, Spaces, and the Hugging Face Hub. For example, we’ll add chapters on fine-tuning, inference, and retrieval.
We want new chapters to be even more open source and closer to the community, instead of focusing solely on Hugging Face libraries. The transformers library has become the de facto reference for LLM modeling code, but models are consumed, fine-tuned, and adopted across a variety of frameworks, which is great! For example, in Chapter 11 on fine-tuning LLMs and Chapter 12 on reasoning models, we collaborated with libraries and tools that greatly complement the Hugging Face libraries. This includes:
Over the coming months, we plan to expand our collaboration to include more authors, maintainers, and companies. We want our materials to cover the best tools you’re using!
Will there be interactive exercises and live sessions?
Yes, when they’re useful to students. We will focus on reliable written material and coding exercises that will remain useful for years to come. But for the popular topics where students are engaging synchronously, we’ll build interactive exercises and host live sessions.
What’s next?
To get involved with the course, follow the organization on the Hub and start a discussion if you’d like to join, suggest interactive units, or propose a live session.