# CRAWLGPT
A powerful web content crawler with GPT-powered summarization and chat capabilities. CRAWLGPT extracts content from URLs, stores it in a vector database (FAISS), and enables natural language querying of the stored content. It combines modern web crawling technology with advanced language models to help you extract, analyze, and interact with web content intelligently.
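The crawling layer is built on crawl4ai and Playwright (both pinned in the dependency list below). As a rough orientation sketch of fetching a single page with crawl4ai itself, not CRAWLGPT's own `LLMBasedCrawler`, extraction might look like the following; the URL and output handling are illustrative assumptions:

```python
# Minimal crawl4ai usage sketch, for orientation only; CRAWLGPT's
# LLMBasedCrawler adds rate limiting, validation, and chunking on top.
import asyncio
from crawl4ai import AsyncWebCrawler

async def main() -> None:
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(url="https://example.com")  # crawl one URL
        print(result.markdown)  # extracted page content as markdown text

if __name__ == "__main__":
    asyncio.run(main())
```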
## Features

**Web Crawling**
- Async-based crawling powered by crawl4ai and Playwright.
- Includes configurable rate limiting and content validation.

**Content Processing**
- Automatically chunks large texts, generates embeddings, and summarizes text via the Groq API.

**Chat Interface**
- Streamlit-based UI with a user-friendly chat panel.
- Supports summarized or full-text retrieval (RAG) for context injection.

**Data Management**
- Stores content in a local or in-memory vector database (FAISS) for efficient retrieval (see the sketch after this list).
- Tracks usage metrics and supports import/export of system state.

**Testing**
- Comprehensive unit and integration tests using Python's unittest framework.
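Since `sentence-transformers` and `faiss-cpu` both appear in the dependency list, the chunk, embed, store, and retrieve flow described above can be sketched roughly as below. This is a simplified illustration, not the project's `DatabaseHandler`; the embedding model, chunk size, and index type are assumptions.

```python
# Hedged sketch of the chunk -> embed -> store -> retrieve flow using the
# pinned sentence-transformers and faiss-cpu packages. Model name, chunk
# size, and index type are assumptions, not CRAWLGPT's DatabaseHandler.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

def chunk_text(text, size=500):
    """Naive fixed-size chunking; the real project may split differently."""
    return [text[i:i + size] for i in range(0, len(text), size)]

chunks = chunk_text("...crawled page content goes here..." * 50)
embeddings = model.encode(chunks, normalize_embeddings=True)

# In-memory FAISS index; inner product on normalized vectors approximates cosine similarity.
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype="float32"))

# Retrieve the chunks most relevant to a user question (RAG context injection).
query = model.encode(["What does the page say about pricing?"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query, dtype="float32"), k=3)
context = [chunks[i] for i in ids[0] if i != -1]
print(context)
```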
## Demo
Example of CRAWLGPT in action!
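The demo recording itself is not reproduced here. For a rough idea of what a Streamlit chat panel like the one described in the Features section involves, here is a generic sketch; it is not CRAWLGPT's `chat_app.py`, and the placeholder answer stands in for the real retrieval-plus-LLM step:

```python
# Generic Streamlit chat-panel sketch (not CRAWLGPT's chat_app.py).
import streamlit as st

st.title("Chat with crawled content (sketch)")

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask about the crawled content..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)
    # A real app would retrieve FAISS context and call the LLM here.
    answer = f"(placeholder answer to: {prompt})"
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```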
## Requirements
- Python >= 3.8
- Operating System: OS Independent
- Required packages are handled by the setup script.
## Quick Start
1. Clone the Repository and change into it:

   ```bash
   cd CRAWLGPT
   ```

2. Run the Setup Script:

   ```bash
   python -m setup_env
   ```

   This script installs dependencies, creates a virtual environment, and prepares the project.

3. Update Your Environment Variables:

   Create or modify the `.env` file and add your Groq API key and Ollama API key. Learn how to get API keys.

   ```
   GROQ_API_KEY=your_groq_api_key_here
   OLLAMA_API_TOKEN=your_ollama_api_key_here
   ```

4. Activate the Virtual Environment:

   ```bash
   source .venv/bin/activate   # On Unix/macOS
   .venv\Scripts\activate      # On Windows
   ```

5. Run the Application:

   ```bash
   python -m streamlit run src/crawlgpt/ui/chat_app.py
   ```
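The steps above only set the two keys; how they are consumed is up to the application code. Since `python-dotenv` and `groq` are both core dependencies, a plausible but purely illustrative way to load the `.env` values and hand the Groq key to a client looks like this (the model name and prompt are assumptions, and this is not CRAWLGPT's actual startup code):

```python
# Hedged sketch: reading .env keys with python-dotenv and creating a Groq
# client. Illustrative only; not CRAWLGPT's actual startup code.
import os
from dotenv import load_dotenv
from groq import Groq

load_dotenv()  # reads GROQ_API_KEY and OLLAMA_API_TOKEN from the .env file

groq_key = os.getenv("GROQ_API_KEY")
if not groq_key:
    raise RuntimeError("GROQ_API_KEY is missing; add it to your .env file.")

client = Groq(api_key=groq_key)

# A one-off summarization call; the model name is an assumption.
reply = client.chat.completions.create(
    model="llama-3.1-8b-instant",
    messages=[
        {"role": "system", "content": "Summarize the user's text in two sentences."},
        {"role": "user", "content": "Paste some crawled page text here."},
    ],
)
print(reply.choices[0].message.content)
```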
## Dependencies

### Core Dependencies

```
streamlit==1.41.1
groq==0.15.0
sentence-transformers==3.3.1
faiss-cpu==1.9.0.post1
crawl4ai==0.4.247
python-dotenv==1.0.1
pydantic==2.10.5
aiohttp==3.11.11
beautifulsoup4==4.12.3
numpy==2.2.0
tqdm==4.67.1
playwright>=1.41.0
asyncio>=3.4.3
```

### Development Dependencies

```
pytest==8.3.4
pytest-mockito==0.0.4
black==24.2.0
isort==5.13.0
flake8==7.0.0
```

## Project Structure

```
crawlgpt/
├── src/
│   └── crawlgpt/
│       ├── core/
│       │   ├── DatabaseHandler.py
│       │   ├── LLMBasedCrawler.py
│       │   └── SummaryGenerator.py
│       ├── ui/
│       │   ├── chat_app.py
│       │   └── chat_ui.py
│       └── utils/
│           ├── content_validator.py
│           ├── data_manager.py
│           ├── helper_functions.py
│           ├── monitoring.py
│           └── progress.py
├── tests/
│   └── test_core/
│       ├── test_database_handler.py
│       ├── test_integration.py
│       ├── test_llm_based_crawler.py
│       └── test_summary_generator.py
├── .gitignore
├── LICENSE
├── README.md
├── Docs
├── pyproject.toml
├── pytest.ini
└── setup_env.py
```
## Testing

Run all tests:

```bash
python -m pytest
```
The tests include unit tests for core functionality and integration tests for end-to-end workflows.
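For contributors unfamiliar with the layout, the tests live under `tests/test_core/` (see the project structure above). The snippet below is a generic, hypothetical example of what such a test can look like; the `chunk_text` helper is defined inline for illustration and is not taken from CRAWLGPT's code or its real test modules:

```python
# Hypothetical test sketch; chunk_text is defined inline for illustration
# and is not part of CRAWLGPT's actual test suite.
def chunk_text(text, size=500):
    return [text[i:i + size] for i in range(0, len(text), size)]

def test_chunk_text_splits_long_input():
    chunks = chunk_text("a" * 1200, size=500)
    assert len(chunks) == 3
    assert all(len(c) <= 500 for c in chunks)

def test_chunk_text_handles_empty_input():
    assert chunk_text("") == []
```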
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Links
## Acknowledgments
- Inspired by the potential of GPT models for intelligent content processing.
- Special thanks to the creators of Crawl4ai, Groq, FAISS, and Playwright for their powerful tools.
## Author
- Jatin Mehra ([email protected])
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, open an issue first to discuss your proposal.
- Fork the Project.
- Create your Feature Branch: `git checkout -b feature/AmazingFeature`
- Commit your Changes: `git commit -m 'Add some AmazingFeature'`
- Push to the Branch: `git push origin feature/AmazingFeature`
- Open a Pull Request.