Joshua Sundance Bailey committed on
Commit
a95f086
2 Parent(s): 8f8542a 67e3a9b

python:3.11-slim-bookworm


python:3.11-slim-bookworm, env file use, kubernetes, misc

.dockerignore ADDED
@@ -0,0 +1,13 @@
+.env
+.env-example
+.git/
+.github
+.gitignore
+.idea
+.mypy_cache
+.pre-commit-config.yaml
+.ruff_cache
+Dockerfile
+kubernetes
+docker-compose.yml
+junk/
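The exclusion rules above can be sketched with Python's `fnmatch`. This is a simplified matcher, not Docker's actual implementation (real `.dockerignore` semantics also cover `**` and `!` negations), and the sample paths are hypothetical:

```python
from fnmatch import fnmatch

# Simplified sketch of .dockerignore filtering (Docker's real matcher
# also handles **, ! negations, and trailing-slash directory patterns).
IGNORE_PATTERNS = [".env", ".env-example", ".git", ".github", "Dockerfile", "kubernetes", "junk"]

def excluded(path: str) -> bool:
    # A path is excluded if any of its leading components matches a pattern.
    parts = path.split("/")
    prefixes = ["/".join(parts[: i + 1]) for i in range(len(parts))]
    return any(fnmatch(p, pat) for p in prefixes for pat in IGNORE_PATTERNS)

# Hypothetical build-context paths:
print(excluded(".env"))                             # True  -> kept out of the image
print(excluded("junk/scratch.txt"))                 # True
print(excluded("langchain-streamlit-demo/app.py"))  # False -> sent to the builder
```

Keeping `.env` out of the build context matters here because the commit moves secrets out of the image and into runtime `--env-file` / Kubernetes Secret injection.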
.gitignore CHANGED
@@ -21,7 +21,6 @@
 .coverage
 .coverage.*
 .dmypy.json
-.dockerignore
 .eggs/
 .env
 .hypothesis/
.pre-commit-config.yaml CHANGED
@@ -32,7 +32,7 @@ repos:
 - id: check-docstring-first
 - id: check-executables-have-shebangs
 - id: check-json
-- id: check-yaml
+# - id: check-yaml
 - id: debug-statements
 - id: fix-byte-order-marker
 - id: detect-private-key
Dockerfile CHANGED
@@ -1,4 +1,4 @@
-FROM python:3.11-slim-buster
+FROM python:3.11-slim-bookworm
 
 RUN adduser --uid 1000 --disabled-password --gecos '' appuser
 USER 1000
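For reference, the base-image bump moves from Debian 10 to Debian 12 (the codename-to-version mapping is standard Debian fact; the lookup itself is just an illustration):

```python
# Debian releases behind the python:3.11-slim-* tags.
DEBIAN_CODENAMES = {"buster": 10, "bullseye": 11, "bookworm": 12}

old, new = "buster", "bookworm"
print(f"{old} = Debian {DEBIAN_CODENAMES[old]} -> {new} = Debian {DEBIAN_CODENAMES[new]}")
```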
README.md CHANGED
@@ -42,40 +42,57 @@ This `README` was written by [Claude 2](https://www.anthropic.com/index/claude-2
 - `meta-llama/Llama-2-13b-chat-hf`
 - `meta-llama/Llama-2-70b-chat-hf`
 - Streaming output of assistant responses
-- Leverages LangChain for dialogue management
+- Leverages LangChain for dialogue and memory management
 - Integrates with [LangSmith](https://smith.langchain.com) for tracing conversations
 - Allows giving feedback on assistant's responses
+- Tries reading API keys and default values from environment variables
+- Parameters in sidebar can be customized
+
+# Code Overview
+- `langchain-streamlit-demo/app.py` - Main Streamlit app definition
+- `langchain-streamlit-demo/llm_stuff.py` - LangChain helper functions
+- `Dockerfile`, `docker-compose.yml`: Docker deployment
+- `kubernetes/`: Kubernetes deployment files
+- `.github/workflows/`: CI/CD workflows
+
+# Deployment
+`langchain-streamlit-demo` is deployed as a [Docker image](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo) based on the [`python:3.11-slim-bookworm`](https://github.com/docker-library/python/blob/81b6e5f0643965618d633cd6b811bf0879dee360/3.11/slim-bookworm/Dockerfile) image.
+CI/CD workflows in `.github/workflows` handle building and publishing the image as well as pushing it to Hugging Face.
 
-# Usage
 ## Run on HuggingFace Spaces
 [![Open HuggingFace Space](https://huggingface.co/datasets/huggingface/badges/raw/main/open-in-hf-spaces-sm.svg)](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)
 
 ## With Docker (pull from Docker Hub)
-1. Run in terminal: `docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest`
-2. Open http://localhost:7860 in your browser.
 
+1. _Optional_: Create a `.env` file based on `.env-example`
+2. Run in terminal:
+
+   `docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest`
+
+   or
+
+   `docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest`
+
+3. Open http://localhost:7860 in your browser
 
-## Docker Compose
-1. Clone the repo. Navigate to cloned repo directory.
-2. Run in terminal: `docker compose up`
-3. Then open http://localhost:7860 in your browser.
+## Docker Compose (build locally)
+1. Clone the repo. Navigate to cloned repo directory
+2. _Optional_: Create a `.env` file based on `.env-example`
+3. Run in terminal:
+
+   `docker compose up`
+
+   or
+
+   `docker compose --env-file .env up`
+
+4. Open http://localhost:7860 in your browser
 
-# Configuration
-- Select a model from the dropdown
-- Enter an API key for the relevant provider
-- Optionally enter a LangSmith API key to enable conversation tracing
-- Customize the assistant prompt and temperature
-
-# Code Overview
-- `app.py` - Main Streamlit app definition
-- `llm_stuff.py` - LangChain helper functions
-
-# Deployment
-The app is packaged as a Docker image for easy deployment. It is published to Docker Hub and Hugging Face Spaces:
-
-- [DockerHub](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo)
-- [HuggingFace Spaces](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)
-
-CI workflows in `.github/workflows` handle building and publishing the image.
+## Kubernetes
+1. Clone the repo. Navigate to cloned repo directory
+2. Create a `.env` file based on `.env-example`
+3. Run in terminal: `cd kubernetes && kubectl apply -f resources.yaml`
+4. Get the IP address for your new service: `kubectl get service langchain-streamlit-demo-service`
 
 # Links
 - [Streamlit](https://streamlit.io)
@@ -84,6 +101,3 @@ CI workflows in `.github/workflows` handle building and publishing the image.
 - [OpenAI](https://openai.com/)
 - [Anthropic](https://www.anthropic.com/)
 - [Anyscale Endpoints](https://endpoints.anyscale.com/)
-
-# TODO
-1. More customization / parameterization in sidebar
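The new "Tries reading API keys and default values from environment variables" feature can be sketched like this. The helper is hypothetical (not code from this commit); only the variable names follow the Kubernetes manifest added here:

```python
import os

# Hypothetical helper: seed app defaults from environment variables,
# falling back to a supplied default the user can override in the sidebar.
def get_default(name: str, fallback: str = "") -> str:
    return os.environ.get(name, fallback)

os.environ.pop("OPENAI_API_KEY", None)                         # ensure unset for the demo
os.environ["LANGCHAIN_PROJECT"] = "langchain-streamlit-demo"   # simulate an .env entry

print(get_default("OPENAI_API_KEY"))                # empty: user enters it in the sidebar
print(get_default("LANGCHAIN_PROJECT", "default"))  # value read from the environment
```

This is why the same `.env` file can drive all three deployment paths: `docker run --env-file`, Compose, and the Kubernetes Secret.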
 
 
 
docker-compose.yml CHANGED
@@ -4,8 +4,6 @@ services:
   langchain-streamlit-demo:
     image: langchain-streamlit-demo:latest
     build: .
-    env_file:
-      - .env
     ports:
       - "${APP_PORT:-7860}:${APP_PORT:-7860}"
     command: [
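The surviving `ports` mapping relies on shell-style parameter expansion: `${APP_PORT:-7860}` falls back to `7860` whenever `APP_PORT` is unset or empty. A minimal illustration of the same expansion in plain `sh`:

```shell
# ${VAR:-default} expands to $VAR when it is set and non-empty,
# otherwise to the default after ":-".
unset APP_PORT
echo "${APP_PORT:-7860}"    # prints 7860

APP_PORT=8501
echo "${APP_PORT:-7860}"    # prints 8501
```

So dropping `env_file` from the compose file does not break the port mapping; Compose still substitutes from the invoking environment (or from `--env-file` on the command line).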
kubernetes/deploy.sh ADDED
@@ -0,0 +1,19 @@
+#!/bin/bash
+set -euo pipefail
+IFS=$'\n\t'
+
+# Create a secret for environment variables
+secretExists=$(kubectl get secret langchain-streamlit-demo-secret --ignore-not-found)
+
+if [ -n "$secretExists" ]; then
+  echo "Secret 'langchain-streamlit-demo-secret' already exists. Deleting and recreating."
+  kubectl delete secret langchain-streamlit-demo-secret
+else
+  echo "Secret 'langchain-streamlit-demo-secret' does not exist. Creating."
+fi
+
+kubectl create secret generic langchain-streamlit-demo-secret --from-env-file=../.env
+
+
+# Deploy to Kubernetes
+kubectl apply -f resources.yaml
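The check-then-recreate flow in `deploy.sh` can be exercised without a cluster by stubbing `kubectl`. The stub below is purely hypothetical and only mimics the non-empty output of `get secret --ignore-not-found` for an existing secret:

```shell
# Hypothetical stand-in for kubectl so the control flow can run anywhere.
kubectl() {
  case "$1" in
    get)    echo "langchain-streamlit-demo-secret   Opaque   5   1d" ;;
    delete) echo "secret \"langchain-streamlit-demo-secret\" deleted" ;;
  esac
}

secretExists=$(kubectl get secret langchain-streamlit-demo-secret --ignore-not-found)

if [ -n "$secretExists" ]; then
  echo "Secret exists. Deleting and recreating."
  kubectl delete secret langchain-streamlit-demo-secret
else
  echo "Secret does not exist. Creating."
fi
```

An idempotent alternative to delete-then-create is `kubectl create secret generic ... --from-env-file=../.env --dry-run=client -o yaml | kubectl apply -f -`, which updates the secret in place.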
kubernetes/resources.yaml ADDED
@@ -0,0 +1,84 @@
+apiVersion: apps/v1
+kind: Deployment
+metadata:
+  name: langchain-streamlit-demo-deployment
+spec:
+  replicas: 1
+  selector:
+    matchLabels:
+      app: langchain-streamlit-demo
+  template:
+    metadata:
+      labels:
+        app: langchain-streamlit-demo
+    spec:
+      containers:
+      - name: langchain-streamlit-demo
+        image: joshuasundance/langchain-streamlit-demo:latest
+        imagePullPolicy: Always  # get latest on restart
+        resources:
+          requests:
+            cpu: "100m"
+            memory: "200Mi"
+          limits:
+            cpu: "500m"
+            memory: "500Mi"
+        env:
+        - name: OPENAI_API_KEY
+          valueFrom:
+            secretKeyRef:
+              name: langchain-streamlit-demo-secret
+              key: OPENAI_API_KEY
+        - name: ANTHROPIC_API_KEY
+          valueFrom:
+            secretKeyRef:
+              name: langchain-streamlit-demo-secret
+              key: ANTHROPIC_API_KEY
+        - name: ANYSCALE_API_KEY
+          valueFrom:
+            secretKeyRef:
+              name: langchain-streamlit-demo-secret
+              key: ANYSCALE_API_KEY
+        - name: LANGCHAIN_API_KEY
+          valueFrom:
+            secretKeyRef:
+              name: langchain-streamlit-demo-secret
+              key: LANGCHAIN_API_KEY
+        - name: LANGCHAIN_PROJECT
+          value: "langchain-streamlit-demo"
+      securityContext:
+        runAsNonRoot: true
+---
+apiVersion: v1
+kind: Service
+metadata:
+  name: langchain-streamlit-demo-service
+  # configure on Azure and uncomment below to use a vnet
+  # annotations:
+  #   service.beta.kubernetes.io/azure-load-balancer-internal: "true"
+  #   service.beta.kubernetes.io/azure-load-balancer-ipv4: vnet.ip.goes.here
+  #   service.beta.kubernetes.io/azure-dns-label-name: "langchain-streamlit-demo"
+spec:
+  selector:
+    app: langchain-streamlit-demo
+  ports:
+  - protocol: TCP
+    port: 80
+    targetPort: 7860
+  type: LoadBalancer
+---
+apiVersion: networking.k8s.io/v1
+kind: NetworkPolicy
+metadata:
+  name: langchain-streamlit-demo-network-policy
+spec:
+  podSelector:
+    matchLabels:
+      app: langchain-streamlit-demo
+  policyTypes:
+  - Ingress
+  ingress:
+  - from: []  # An empty array here means it will allow traffic from all sources.
+    ports:
+    - protocol: TCP
+      port: 7860
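The four `secretKeyRef` entries in the Deployment are structurally identical, differing only in the key name. A small generator can produce them from a key list; this is a convenience sketch, not part of the commit (the manifest above was written by hand):

```python
# Convenience sketch: emit the repetitive secretKeyRef env entries
# for the Deployment from a list of environment-variable names.
SECRET = "langchain-streamlit-demo-secret"
KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "ANYSCALE_API_KEY", "LANGCHAIN_API_KEY"]

def env_entry(key: str) -> list[str]:
    return [
        f"- name: {key}",
        "  valueFrom:",
        "    secretKeyRef:",
        f"      name: {SECRET}",
        f"      key: {key}",
    ]

lines = [line for key in KEYS for line in env_entry(key)]
print("\n".join(lines))
```

Because each entry points at the secret created by `deploy.sh` from `.env`, adding a new provider key means one new line in `.env` and one new entry here.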