Joshua Sundance Bailey committed
Commit 9578b22
1 Parent(s): b281bf3

env file use

Files changed (3):
  1. README.md +39 -11
  2. docker-compose.yml +0 -2
  3. kubernetes/deploy.sh +1 -1
README.md CHANGED
README.md CHANGED
@@ -42,32 +42,60 @@ This `README` was written by [Claude 2](https://www.anthropic.com/index/claude-2
 - `meta-llama/Llama-2-13b-chat-hf`
 - `meta-llama/Llama-2-70b-chat-hf`
 - Streaming output of assistant responses
-- Leverages LangChain for dialogue management
+- Leverages LangChain for dialogue and memory management
 - Integrates with [LangSmith](https://smith.langchain.com) for tracing conversations
 - Allows giving feedback on assistant's responses
+- Tries reading API keys and default values from environment variables
 
 # Usage
 ## Run on HuggingFace Spaces
 [![Open HuggingFace Space](https://huggingface.co/datasets/huggingface/badges/raw/main/open-in-hf-spaces-sm.svg)](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)
 
 ## With Docker (pull from Docker Hub)
-1. Run in terminal: `docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest`
-2. Open http://localhost:7860 in your browser.
+1. _Optional_: Create a `.env` file based on `.env-example`
+2. Run in terminal:
+
+   `docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest`
+
+   or
+
+   `docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest`
+
+3. Open http://localhost:7860 in your browser
 
 ## Docker Compose
-1. Clone the repo. Navigate to cloned repo directory.
-2. Run in terminal: `docker compose up`
-3. Then open http://localhost:7860 in your browser.
+1. Clone the repo. Navigate to cloned repo directory
+2. _Optional_: Create a `.env` file based on `.env-example`
+3. Run in terminal:
+
+   `docker compose up`
+
+   or
+
+   `docker compose --env-file .env up`
+
+4. Open http://localhost:7860 in your browser
+
+## Kubernetes
+1. Clone the repo. Navigate to cloned repo directory
+2. Create a `.env` file based on `.env-example`
+3. Run in terminal: `cd kubernetes && kubectl apply -f resources.yaml`
+4. Get the IP address for your new service: `kubectl get service langchain-streamlit-demo`
 
 # Configuration
 - Select a model from the dropdown
-- Enter an API key for the relevant provider
-- Optionally enter a LangSmith API key to enable conversation tracing
+- _Optional_: Create a `.env` file based on `.env-example`, or
+- Enter an API key for the relevant provider
+- Optionally enter a LangSmith API key to enable conversation tracing
 - Customize the assistant prompt and temperature
 
 # Code Overview
-- `app.py` - Main Streamlit app definition
-- `llm_stuff.py` - LangChain helper functions
+- `langchain-streamlit-demo/app.py` - Main Streamlit app definition
+- `langchain-streamlit-demo/llm_stuff.py` - LangChain helper functions
+- `Dockerfile`, `docker-compose.yml`: Docker deployment
+- `kubernetes/`: Kubernetes deployment files
+- `.github/workflows/`: CI/CD workflows
 
 # Deployment
 The app is packaged as a Docker image for easy deployment. It is published to Docker Hub and Hugging Face Spaces:
@@ -75,7 +103,7 @@ The app is packaged as a Docker image for easy deployment. It is published to Do
 
 - [DockerHub](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo)
 - [HuggingFace Spaces](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)
 
-CI workflows in `.github/workflows` handle building and publishing the image.
+CI/CD workflows in `.github/workflows` handle building and publishing the image.
 
 # Links
 - [Streamlit](https://streamlit.io)
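The `.env` flow added to the README above can be sketched as follows. The key names shown are illustrative placeholders based on the providers the README mentions, not the actual contents of `.env-example`:

```shell
# Create a .env file; these variable names are assumptions,
# not necessarily the ones listed in .env-example.
printf '%s\n' \
    'OPENAI_API_KEY=sk-placeholder' \
    'LANGCHAIN_API_KEY=ls-placeholder' \
    > .env

# Sanity-check the file before handing it to Docker: count KEY=value lines.
grep -c '=' .env

# Pass it to the container at startup (requires Docker; run manually):
# docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest
```

With `--env-file`, Docker injects each `KEY=value` line into the container's environment, so the app can pick the keys up at startup instead of asking for them in the UI.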
docker-compose.yml CHANGED
@@ -4,8 +4,6 @@ services:
   langchain-streamlit-demo:
     image: langchain-streamlit-demo:latest
     build: .
-    env_file:
-      - .env
     ports:
       - "${APP_PORT:-7860}:${APP_PORT:-7860}"
     command: [
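With `env_file` removed from the compose file, `${APP_PORT:-7860}` is now resolved from the caller's environment (or a file passed via `docker compose --env-file`), falling back to 7860. A minimal sketch of that default expansion:

```shell
# Compose resolves ${APP_PORT:-7860} with shell-style default expansion:
# use APP_PORT from the environment if set, otherwise fall back to 7860.
unset APP_PORT
resolved="${APP_PORT:-7860}"
echo "port mapping: ${resolved}:${resolved}"

# Equivalent invocations (require Docker Compose; run manually):
# docker compose up                  # falls back to 7860
# APP_PORT=8080 docker compose up    # maps 8080:8080
```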
kubernetes/deploy.sh CHANGED
@@ -12,7 +12,7 @@ else
   echo "Secret 'langchain-streamlit-demo-secret' does not exist. Creating."
 fi
 
-kubectl create secret generic langchain-streamlit-demo-secret --from-env-file=.env
+kubectl create secret generic langchain-streamlit-demo-secret --from-env-file=../.env
 
 
 # Deploy to Kubernetes
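The switch to `--from-env-file=../.env` suggests `deploy.sh` is run from inside `kubernetes/` while the `.env` file lives at the repo root. A small sketch of that path assumption (the directory layout here is illustrative):

```shell
# Mimic the assumed repo layout: .env at the root, script run from kubernetes/.
mkdir -p demo-repo/kubernetes
printf 'OPENAI_API_KEY=sk-placeholder\n' > demo-repo/.env
cd demo-repo/kubernetes

# From this working directory, ../.env resolves to the repo-root env file.
test -f ../.env && echo "found ../.env"

# Actual secret creation (requires a cluster; run manually):
# kubectl create secret generic langchain-streamlit-demo-secret --from-env-file=../.env
```

`kubectl create secret generic --from-env-file` turns each `KEY=value` line of the file into one entry of the secret, which the deployment can then expose to pods as environment variables.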