---
title: langchain-streamlit-demo
emoji: 🦜
colorFrom: green
colorTo: red
sdk: docker
app_port: 7860
pinned: true
tags:
  - langchain
  - streamlit
  - docker
---

langchain-streamlit-demo

License: MIT. Code quality checks: bandit, Ruff, black, pre-commit, mypy. Published to Docker Hub and available as a HuggingFace Space.

This project shows how to build a simple chatbot UI with Streamlit and LangChain.

This README was written by Claude 2, an LLM from Anthropic.

Features

  • Chat interface for talking to an AI assistant
  • Supports models from
    • OpenAI
      • gpt-3.5-turbo
      • gpt-4
    • Anthropic
      • claude-instant-v1
      • claude-2
    • Anyscale Endpoints
      • meta-llama/Llama-2-7b-chat-hf
      • meta-llama/Llama-2-13b-chat-hf
      • meta-llama/Llama-2-70b-chat-hf
  • Streaming output of assistant responses
  • Leverages LangChain for dialogue and memory management
  • Integrates with LangSmith for tracing conversations
  • Allows giving feedback on the assistant's responses
  • Tries reading API keys and default values from environment variables
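The last feature above, reading API keys and default values from the environment, can be sketched roughly as follows. The helper name `from_env` and the variable names are illustrative, not the app's actual code:

```python
import os

# Illustrative helper: read a setting from the environment,
# falling back to a default so the sidebar can still prompt for it.
def from_env(name: str, default: str = "") -> str:
    return os.environ.get(name, default)

# e.g. pre-populate sidebar fields from the environment when available
openai_api_key = from_env("OPENAI_API_KEY")
default_model = from_env("DEFAULT_MODEL", "gpt-3.5-turbo")
```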

Usage

Run on HuggingFace Spaces

Use the hosted demo on the project's HuggingFace Space; no local setup is required.

With Docker (pull from Docker Hub)

  1. Optional: Create a .env file based on .env-example
  2. Run in terminal:

docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest

or, to load environment variables from your .env file:

docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest

  3. Open http://localhost:7860 in your browser

Docker Compose

  1. Clone the repo. Navigate to cloned repo directory
  2. Optional: Create a .env file based on .env-example
  3. Run in terminal:

docker compose up

or, to load environment variables from a .env file (note that the --env-file flag goes before the subcommand):

docker compose --env-file .env up

  4. Open http://localhost:7860 in your browser

Kubernetes

  1. Clone the repo. Navigate to cloned repo directory
  2. Create a .env file based on .env-example
  3. Run in terminal: cd kubernetes && kubectl apply -f resources.yaml
  4. Get the IP address for your new service: kubectl get service langchain-streamlit-demo
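If the service does not receive an external IP in your cluster (for example, on a local cluster without a load balancer), a standard kubectl port-forward is one way to reach the app; the service name below matches step 4:

```shell
# Forward local port 7860 to the service's port 7860
kubectl port-forward service/langchain-streamlit-demo 7860:7860
# then open http://localhost:7860 in your browser
```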

Configuration

  • Select a model from the dropdown
  • Provide an API key for the relevant provider, either:
    • in a .env file based on .env-example, or
    • directly in the sidebar
  • Optionally enter a LangSmith API key to enable conversation tracing
  • Customize the assistant prompt and temperature
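As an illustration of the configurable parameters above, the sidebar settings might be carried in a structure like this. The class name, field names, and defaults are hypothetical, not the app's actual code:

```python
from dataclasses import dataclass

# Hypothetical container for the sidebar settings described above.
@dataclass
class SidebarConfig:
    model: str = "gpt-3.5-turbo"        # chosen from the dropdown
    provider_api_key: str = ""          # OpenAI/Anthropic/Anyscale key
    langsmith_api_key: str = ""         # optional, enables tracing
    assistant_prompt: str = "You are a helpful chatbot."
    temperature: float = 0.7            # higher = more varied responses

config = SidebarConfig()
```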

Code Overview

  • langchain-streamlit-demo/app.py: Main Streamlit app definition
  • langchain-streamlit-demo/llm_stuff.py: LangChain helper functions
  • Dockerfile, docker-compose.yml: Docker deployment
  • kubernetes/: Kubernetes deployment files
  • .github/workflows/: CI/CD workflows

Deployment

The app is packaged as a Docker image for easy deployment. It is published to Docker Hub and Hugging Face Spaces.

CI/CD workflows in .github/workflows handle building and publishing the image.


TODO

  1. More customization / parameterization in sidebar