Updated some of the text
Home.py CHANGED
@@ -4,9 +4,9 @@ from src.st_helpers import st_setup
 if st_setup("LLM Architectures"):
     st.write("""
 # LLM Architecture Assessment
-This
+This application is an interactive element of the LLM Architecture Assessment project prepared by [Alisdair Fraser](http://www.linkedin.com/alisdairfraser) (alisdairfraser (at) gmail (dot) com), submitted as the final research project for the [Online MSc in Artificial Intelligence](https://info.online.bath.ac.uk/msai/) with the University of Bath. It allows users to browse a synthetic set of "private data" and to interact with systems built to represent different architectural prototypes.
 
-The goal of the project is to
+The goal of the project is to assess architectural patterns for deploying LLMs in conjunction with private data stores. The target audience is IT management, and the aim is to provide the key considerations for choosing one architecture over another.
 
 All the source code for this application and the associated tooling and data can be found in the [project GitHub repo on Hugging Face](https://huggingface.co/spaces/alfraser/llm-arch/tree/main).
 
@@ -19,4 +19,5 @@ if st_setup("LLM Architectures"):
 
 ## Credits
 - This project predominantly uses [Llama 2](https://ai.meta.com/llama/) and derivative models for language inference. Models are made available under the [Meta Llama license](https://ai.meta.com/llama/license/).
+- This project is built on [streamlit](https://streamlit.io).
 """)
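For readers trying the page locally, here is a minimal sketch of the `st_setup` guard that `Home.py` relies on. The real helper lives in `src/st_helpers.py` in the repo and may do more (sidebar setup, session state, backend checks); this sketch only assumes it configures the Streamlit page and returns `True` when setup succeeds.

```python
# Hypothetical stand-in for src/st_helpers.py -- the repo's real implementation may differ.
import streamlit as st


def st_setup(page_title: str) -> bool:
    """Configure the Streamlit page and report whether setup succeeded.

    Assumption: the real st_setup may also initialise session state or
    verify that backend services are reachable; here it only sets the
    page config so the guarded block in Home.py can render.
    """
    st.set_page_config(page_title=page_title, layout="wide")
    return True
```

`Home.py` then guards its content with `if st_setup("LLM Architectures"):` exactly as shown in the diff above, so the page body is only written when setup succeeds.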