Create README.md

README.md CHANGED
@@ -1,231 +1,10 @@
<div align="center">

[PyPI](https://badge.fury.io/py/gpt-researcher)
[Open In Colab](https://colab.research.google.com/github/assafelovic/gpt-researcher/blob/master/docs/docs/examples/pip-run.ipynb)
[Docker Hub](https://hub.docker.com/r/gptresearcher/gpt-researcher)
[Twitter](https://twitter.com/assaf_elovic)

[English](README.md) | [δΈζ](README-zh_CN.md) | [ζ₯ζ¬θͺ](README-ja_JP.md) | [νκ΅μ΄](README-ko_KR.md)

</div>

# π GPT Researcher

**GPT Researcher is an autonomous agent designed for comprehensive web and local research on any given task.**

The agent produces detailed, factual, and unbiased research reports with citations. GPT Researcher provides a full suite of customization options for creating tailor-made, domain-specific research agents. Inspired by the recent [Plan-and-Solve](https://arxiv.org/abs/2305.04091) and [RAG](https://arxiv.org/abs/2005.11401) papers, GPT Researcher addresses misinformation, speed, determinism, and reliability by offering stable performance and increased speed through parallelized agent work.

**Our mission is to empower individuals and organizations with accurate, unbiased, and factual information through AI.**

## Why GPT Researcher?

- Forming objective conclusions through manual research can take weeks and requires vast resources and time.
- LLMs trained on outdated information can hallucinate, becoming irrelevant for current research tasks.
- Current LLMs have token limits that are insufficient for generating long research reports.
- Limited web sources in existing services lead to misinformation and shallow results.
- Selective web sources can introduce bias into research tasks.

## Demo

https://github.com/user-attachments/assets/2cc38f6a-9f66-4644-9e69-a46c40e296d4

## Architecture

The core idea is to use 'planner' and 'execution' agents. The planner generates research questions, while the execution agents gather relevant information. The publisher then aggregates all findings into a comprehensive report (a minimal sketch of this flow follows the steps below).

<div align="center">
<img align="center" height="600" src="https://github.com/assafelovic/gpt-researcher/assets/13554167/4ac896fd-63ab-4b77-9688-ff62aafcc527">
</div>

Steps:
* Create a task-specific agent based on the research query.
* Generate questions that collectively form an objective opinion on the task.
* Use a crawler agent to gather information for each question.
* Summarize each resource and keep track of its source.
* Filter and aggregate the summaries into a final research report.
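
To make the planner / execution / publisher split concrete, here is a minimal, hypothetical sketch of the flow in plain Python. The function names and data shapes are illustrative assumptions only, not GPT Researcher's actual internals (which call LLMs and web scrapers at each step):

```python
# Hypothetical sketch of the planner -> execution agents -> publisher flow described above.
# Names and data shapes are illustrative; this is not GPT Researcher's real implementation.
from dataclasses import dataclass


@dataclass
class Finding:
    question: str
    summary: str
    source_url: str


def plan_research_questions(query: str) -> list[str]:
    # The "planner" agent would call an LLM here to derive sub-questions for the query.
    return [
        f"{query} - background",
        f"{query} - recent developments",
        f"{query} - expert opinions",
    ]


def execute_search(question: str) -> list[Finding]:
    # An "execution" agent would crawl the web (or local documents), summarize each
    # resource, and keep track of its source so the report can cite it.
    return [Finding(question=question, summary="summary of a scraped page", source_url="https://example.com")]


def publish_report(query: str, findings: list[Finding]) -> str:
    # The "publisher" filters and aggregates all summaries into a single report with citations.
    lines = [f"# Research report: {query}", ""]
    for finding in findings:
        lines.append(f"- {finding.summary} (source: {finding.source_url})")
    return "\n".join(lines)


def research(query: str) -> str:
    findings: list[Finding] = []
    for question in plan_research_questions(query):
        findings.extend(execute_search(question))
    return publish_report(query, findings)


if __name__ == "__main__":
    print(research("why is Nvidia stock going up?"))
```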

## Tutorials

- [How it Works](https://docs.gptr.dev/blog/building-gpt-researcher)
- [How to Install](https://www.loom.com/share/04ebffb6ed2a4520a27c3e3addcdde20?sid=da1848e8-b1f1-42d1-93c3-5b0b9c3b24ea)
- [Live Demo](https://www.loom.com/share/6a3385db4e8747a1913dd85a7834846f?sid=a740fd5b-2aa3-457e-8fb7-86976f59f9b8)

## Features

- π Generate detailed research reports using web and local documents.
- πΌοΈ Smart image scraping and filtering for reports.
- π Generate detailed reports exceeding 2,000 words.
- π Aggregate over 20 sources for objective conclusions.
- π₯οΈ Frontend available in lightweight (HTML/CSS/JS) and production-ready (NextJS + Tailwind) versions.
- π JavaScript-enabled web scraping.
- π Maintains memory and context throughout research.
- π Export reports to PDF, Word, and other formats.

## π Documentation

See the [Documentation](https://docs.gptr.dev/docs/gpt-researcher/getting-started/getting-started) for:

- Installation and setup guides
- Configuration and customization options
- How-To examples
- Full API references

## βοΈ Getting Started

### Installation

1. Install Python 3.11 or later. [Guide](https://www.tutorialsteacher.com/python/install-python).
2. Clone the project and navigate to the directory:

```bash
git clone https://github.com/assafelovic/gpt-researcher.git
cd gpt-researcher
```

3. Set up API keys by exporting them or storing them in a `.env` file (a sketch of the `.env` approach follows the export commands below).

```bash
export OPENAI_API_KEY={Your OpenAI API Key here}
export TAVILY_API_KEY={Your Tavily API Key here}
```
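
If you go the `.env` route, the keys simply need to reach the process environment before the server or the pip package reads them. Below is a minimal sketch using `python-dotenv`; treating it as an external helper is an assumption here, not necessarily how this project loads the file:

```python
# Minimal sketch: load OPENAI_API_KEY and TAVILY_API_KEY from a local .env file.
# Assumes python-dotenv is installed (pip install python-dotenv).
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from ./.env into os.environ

missing = [k for k in ("OPENAI_API_KEY", "TAVILY_API_KEY") if not os.getenv(k)]
if missing:
    raise RuntimeError(f"Missing API keys in .env: {', '.join(missing)}")
```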

4. Install dependencies and start the server:

```bash
pip install -r requirements.txt
python -m uvicorn main:app --reload
```

Visit [http://localhost:8000](http://localhost:8000) to start.

For other setups (e.g., Poetry or virtual environments), check the [Getting Started page](https://docs.gptr.dev/docs/gpt-researcher/getting-started/getting-started).

## Run as PIP package

```bash
pip install gpt-researcher
```

### Example Usage:

```python
import asyncio

from gpt_researcher import GPTResearcher


async def main():
    query = "why is Nvidia stock going up?"
    researcher = GPTResearcher(query=query, report_type="research_report")
    # Conduct research on the given query
    research_result = await researcher.conduct_research()
    # Write the report
    report = await researcher.write_report()
    return report


print(asyncio.run(main()))
```

**For more examples and configurations, please refer to the [PIP documentation](https://docs.gptr.dev/docs/gpt-researcher/gptr/pip-package) page.**

## Run with Docker

> **Step 1** - [Install Docker](https://docs.gptr.dev/docs/gpt-researcher/getting-started/getting-started-with-docker)

> **Step 2** - Duplicate the `.env.example` file, add your API keys to the copy, and save it as `.env`.

> **Step 3** - In the docker-compose file, comment out any services that you don't want to run with Docker.

```bash
docker-compose up --build
```

If that doesn't work, try running it without the dash:

```bash
docker compose up --build
```

> **Step 4** - By default, if you haven't commented anything out in your docker-compose file, this flow will start two processes:
- the Python server running on localhost:8000
- the React app running on localhost:3000

Visit localhost:3000 in any browser and enjoy researching!

## π Research on Local Documents

You can instruct GPT Researcher to run research tasks based on your local documents. Currently supported file formats are: PDF, plain text, CSV, Excel, Markdown, PowerPoint, and Word documents.

Step 1: Add the env variable `DOC_PATH` pointing to the folder where your documents are located.

```bash
export DOC_PATH="./my-docs"
```

Step 2:
- If you're running the frontend app on localhost:8000, simply select "My Documents" from the "Report Source" dropdown options.
- If you're running GPT Researcher with the [PIP package](https://docs.tavily.com/docs/gpt-researcher/pip-package), pass the `report_source` argument as `"local"` when you instantiate the `GPTResearcher` class ([code sample here](https://docs.gptr.dev/docs/gpt-researcher/context/tailored-research)); see the sketch after this list.
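
For the PIP route, here is a minimal sketch of the local-documents setup described above. The query string is purely illustrative, and it assumes `DOC_PATH` has already been exported as in Step 1:

```python
import asyncio

from gpt_researcher import GPTResearcher


async def main() -> str:
    researcher = GPTResearcher(
        query="What do my documents say about our Q3 priorities?",  # illustrative query
        report_type="research_report",
        report_source="local",  # research your DOC_PATH documents instead of the web
    )
    await researcher.conduct_research()
    return await researcher.write_report()


if __name__ == "__main__":
    print(asyncio.run(main()))
```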

## πͺ Multi-Agent Assistant

As AI evolves from prompt engineering and RAG to multi-agent systems, we're excited to introduce our new multi-agent assistant built with [LangGraph](https://python.langchain.com/v0.1/docs/langgraph/).

By using LangGraph, the research process can be significantly improved in depth and quality by leveraging multiple agents with specialized skills. Inspired by the recent [STORM](https://arxiv.org/abs/2402.14207) paper, this project showcases how a team of AI agents can work together to conduct research on a given topic, from planning to publication.

An average run generates a 5-6 page research report in multiple formats such as PDF, Docx and Markdown.

Check it out [here](https://github.com/assafelovic/gpt-researcher/tree/master/multi_agents) or head over to our [documentation](https://docs.gptr.dev/docs/gpt-researcher/multi_agents/langgraph) for more information.

## π₯οΈ Frontend Applications

GPT-Researcher now features an enhanced frontend to improve the user experience and streamline the research process. The frontend offers:

- An intuitive interface for inputting research queries
- Real-time progress tracking of research tasks
- Interactive display of research findings
- Customizable settings for tailored research experiences

Two deployment options are available:
1. A lightweight static frontend served by FastAPI
2. A feature-rich NextJS application for advanced functionality

For detailed setup instructions and more information about the frontend features, please visit our [documentation page](https://docs.gptr.dev/docs/gpt-researcher/frontend/frontend).

## π Contributing

We highly welcome contributions! Please check out [contributing](https://github.com/assafelovic/gpt-researcher/blob/master/CONTRIBUTING.md) if you're interested.

Please check out our [roadmap](https://trello.com/b/3O7KBePw/gpt-researcher-roadmap) page and reach out to us via our [Discord community](https://discord.gg/QgZXvJAccX) if you're interested in joining our mission.

<a href="https://github.com/assafelovic/gpt-researcher/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=assafelovic/gpt-researcher" />
</a>

## βοΈ Support / Contact us

- [Community Discord](https://discord.gg/spBgZmm3Xe)
- Author Email: [email protected]

## π‘ Disclaimer

This project, GPT Researcher, is an experimental application and is provided "as-is" without any warranty, express or implied. We are sharing code for academic purposes under the Apache 2 license. Nothing herein is academic advice, nor a recommendation for use in academic or research papers.

Our view on unbiased research claims:

1. The main goal of GPT Researcher is to reduce incorrect and biased facts. How? We assume that the more sites we scrape, the lower the chance of incorrect data. By scraping multiple sites per research task and choosing the most frequent information, the chance that all of it is wrong is extremely low.
2. We do not aim to eliminate biases; we aim to reduce them as much as possible. **We are here as a community to figure out the most effective human/LLM interactions.**
3. In research, people also tend towards biases, as most already hold opinions about the topics they research. This tool scrapes many opinions and will evenly present diverse views that a biased reader would likely never have encountered.

---

<p align="center">
  <a href="https://star-history.com/#assafelovic/gpt-researcher">
    <picture>
      <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=assafelovic/gpt-researcher&type=Date&theme=dark" />
      <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=assafelovic/gpt-researcher&type=Date" />
      <img alt="Star History Chart" src="https://api.star-history.com/svg?repos=assafelovic/gpt-researcher&type=Date" />
    </picture>
  </a>
</p>

<p align="right">
  <a href="#top">β¬οΈ Back to Top</a>
</p>

---
title: GPT Researcher
emoji: π§
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: "3.0.0"
app_file: app.py
pinned: false
---
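
The frontmatter above configures a Hugging Face Space that serves `app.py` with the Gradio SDK. That app file is not part of this change; purely as an illustration, here is a minimal, hypothetical sketch of what such an `app.py` could look like when it wraps the `GPTResearcher` pip package shown earlier. The component choices and wiring are assumptions, not the Space's actual code:

```python
# Hypothetical app.py sketch for the Space described by the frontmatter above.
# Not the Space's actual code; it only illustrates wiring GPTResearcher into Gradio.
import asyncio

import gradio as gr
from gpt_researcher import GPTResearcher


def run_research(query: str) -> str:
    async def _run() -> str:
        researcher = GPTResearcher(query=query, report_type="research_report")
        await researcher.conduct_research()
        return await researcher.write_report()

    return asyncio.run(_run())


demo = gr.Interface(
    fn=run_research,
    inputs=gr.Textbox(label="Research query"),
    outputs=gr.Textbox(label="Report"),
    title="GPT Researcher",
)

if __name__ == "__main__":
    demo.launch()
```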