---
title: Frugal AI Agent
emoji: 🌍
colorFrom: green
colorTo: blue
sdk: gradio
sdk_version: 5.33.0
app_file: app.py
pinned: true
short_description: Estimate carbon footprint and get frugal AI code suggestions
tags:
  - agent-demo-track
  - mcp-server-track
thumbnail: >-
  https://cdn-uploads.huggingface.co/production/uploads/675973b3138cfdc3f3f4f85b/7Iw5hINn5IJRWMOEo3yY_.jpeg
---
# Frugalize it
This AI agent takes a Python code snippet, uses CodeCarbon to estimate its CO2 emissions, and then suggests modifications using frugal AI techniques such as pruning. It also provides general advice on how to make your code more energy-efficient.
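To give a feel for the kind of suggestion the agent makes, here is a toy sketch of magnitude pruning (a hypothetical helper for illustration only, not the agent's actual code): the smallest-magnitude weights are zeroed out so that a sparse model needs fewer multiplications.

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the `sparsity` fraction of smallest-magnitude weights.

    A toy illustration of frugal-AI pruning on a plain list;
    real pruning would operate on model tensors.
    """
    k = int(len(weights) * sparsity)  # number of weights to drop
    if k <= 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [w if abs(w) > threshold else 0.0 for w in weights]

print(magnitude_prune([0.1, -0.4, 0.05, 0.8], sparsity=0.5))
```

With `sparsity=0.5`, half of the weights (the two smallest in magnitude) are replaced with zeros while the larger ones are kept unchanged.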
Learn more about the principles behind this project by visiting the Frugal AI Challenge.
Special thanks to Anthropic for the free credits, which helped me run more tests and improve my agent!
## Getting Started
Add an Anthropic API key to the repository's secrets (in the Space settings) to enable the agent.
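Space secrets are exposed to the running app as environment variables, so app.py can presumably pick the key up along these lines (a minimal sketch; the variable handling shown here is an assumption, not the Space's actual code):

```python
import os

# Hypothetical sketch: read the Anthropic key from the environment,
# where Hugging Face injects the Space's secrets.
api_key = os.environ.get("ANTHROPIC_API_KEY")
agent_enabled = bool(api_key)
if not agent_enabled:
    print("ANTHROPIC_API_KEY is missing; add it to the Space's secrets to enable the agent.")
```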
Use the example prompts provided:
Both examples use a context prompt beforehand to explain to the agent, Fruggy, what is expected from it. This context prompt is followed by a sample user query:
- The first example can be used directly:
  "What are you capable of?"
  (Video demo: agent-demo-track 1.) It runs a sample piece of code (from the Frugal AI Challenge), estimates its energy consumption, and suggests optimizations using pruning and quantization.
- The second example:
  "Here is my code, {code}, please give me frugal alternatives"
  (Video demo: agent-demo-track 2.) This one cannot be used directly: replace {code} with your own Python code. The agent will then return frugal alternatives and optimization suggestions.
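Filling in the second example's placeholder can also be done programmatically; a minimal sketch (the `build_prompt` helper is hypothetical, not part of the Space):

```python
# Template from the second example prompt; {code} is the placeholder
# to substitute with your own Python snippet before sending it to Fruggy.
PROMPT_TEMPLATE = "Here is my code, {code}, please give me frugal alternatives"

def build_prompt(code: str) -> str:
    """Insert a user's code snippet into the example prompt."""
    return PROMPT_TEMPLATE.format(code=code)

print(build_prompt("x = sum(range(10))"))
```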
You can also call the agent through an MCP client. (Video demo: mcp-server-track.)
⚠️ Note: This agent is a proof of concept; many improvements and features could still be added. The example code snippets are also deliberately basic, both to keep CPU usage low when generating the results and because, in these examples, they run directly in the cloud rather than locally.
## Contribute
All types of contributions are, of course, welcome! Feel free to submit pull requests or contact me with any questions or feedback via the Hugging Face Discord (@cindydelage_51846).
## What's next?
As mentioned, there's still a lot to do, but the first improvements could include:
- Adding the context prompt used in the examples as a system prompt for the agent
- Enabling the Gradio interface to import a code file
## CodeCarbon citation
    @software{benoit_courty_2024_11171501,
      author    = {Benoit Courty and Victor Schmidt and Sasha Luccioni and Goyal-Kamal and MarionCoutarel and Boris Feld and Jérémy Lecourt and LiamConnell and Amine Saboni and Inimaz and supatomic and Mathilde Léval and Luis Blanche and Alexis Cruveiller and ouminasara and Franklin Zhao and Aditya Joshi and Alexis Bogroff and Hugues de Lavoreille and Niko Laskaris and Edoardo Abati and Douglas Blank and Ziyao Wang and Armin Catovic and Marc Alencon and Michał Stęchły and Christian Bauer and Lucas Otávio N. de Araújo and JPW and MinervaBooks},
      title     = {mlco2/codecarbon: v2.4.1},
      month     = may,
      year      = 2024,
      publisher = {Zenodo},
      version   = {v2.4.1},
      doi       = {10.5281/zenodo.11171501},
      url       = {https://doi.org/10.5281/zenodo.11171501}
    }