# Retrieval Augmented QA

This example demonstrates Retrieval Augmented QA using a vector database. We first load source documents about black holes, chunk them, and send each chunk to OpenAI's text-embedding-ada-002 embedding endpoint. Next, we define a local vector database that stores the text/vector representations of the chunks as key/value pairs. Using custom role prompts, we force the LLM to reply "I don't know." when it lacks sufficient context for a user query. When the query is about black holes, however, the LLM is given relevant passages from the source documents and answers based on them.
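The pipeline described above can be sketched as follows. This is a minimal, offline illustration: `toy_embed` is a hypothetical stand-in for the text-embedding-ada-002 endpoint (a deterministic hashed bag-of-words vector), and `VectorDB`, `chunk`, and `answer` are illustrative names, not part of any specific library. In the real pipeline the embeddings come from the OpenAI API and the retrieved passage is handed to the LLM as context.

```python
import hashlib
import math

DIM = 1024  # embedding dimensionality for the toy embedder

def toy_embed(text):
    # Stand-in for the text-embedding-ada-002 endpoint: a deterministic
    # hashed bag-of-words embedding so this sketch runs offline.
    vec = [0.0] * DIM
    for word in text.lower().split():
        bucket = int(hashlib.md5(word.strip(".,?!").encode()).hexdigest(), 16) % DIM
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def chunk(document, size=15):
    # Naive fixed-size word chunking; real loaders often split on
    # sentences or tokens, with overlap between chunks.
    words = document.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

class VectorDB:
    """Minimal local store mapping chunk text (key) to its vector (value)."""

    def __init__(self):
        self.store = {}

    def add(self, text):
        self.store[text] = toy_embed(text)

    def top_match(self, question):
        # Vectors are unit-normalized, so the dot product is cosine similarity.
        q = toy_embed(question)
        return max(
            ((sum(a * b for a, b in zip(q, v)), t) for t, v in self.store.items()),
            default=(0.0, None),
        )

def answer(question, db, threshold=0.25):
    # Mimics the custom role prompt: without a sufficiently similar
    # passage, the model must reply "I don't know."
    score, passage = db.top_match(question)
    if passage is None or score < threshold:
        return "I don't know."
    return passage  # in the full pipeline, this passage is sent to the LLM

# Usage:
db = VectorDB()
doc = ("A black hole is a region of spacetime where gravity is so strong "
       "that nothing can escape. The boundary of no escape is called the "
       "event horizon.")
for c in chunk(doc):
    db.add(c)
answer("what is the event horizon", db)        # -> chunk mentioning the event horizon
answer("sourdough recipe", db, threshold=0.9)  # -> "I don't know."
```

The similarity threshold is a heuristic: too low and unrelated queries retrieve noise, too high and valid black-hole questions fall through to "I don't know."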