import streamlit as st
import os
from groq import Groq
from dotenv import load_dotenv

# Load environment variables (including GROQ_API_KEY) from a local .env file.
load_dotenv()
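# Setup (a minimal sketch; the script filename is an assumption, the package names match the imports above):
#   pip install streamlit groq python-dotenv
#   echo "GROQ_API_KEY=<your key>" > .env
#   streamlit run app.py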
st.title("AI-powered Mindfulness App")
st.write("""
This application is designed to assist you in making informed decisions by providing a structured platform for consultation.
Built around the Panca Sradha values, it ensures that all guidance aligns with the core principles upheld by Kalbe Group, promoting consistency and integrity in decision-making.
Describe the background of the decision you are considering and articulate the specific question for which you need guidance.
The application will generate two outputs: a detailed recommendation and a corresponding risk assessment with suggested mitigation strategies.
""")
# Input text boxes (Streamlit enforces a 68px minimum height for text areas).
input1 = st.text_area("Background", height=200)
input2 = st.text_area("Question", height=68)
# Initialize the Groq client with the API key loaded from the environment.
client = Groq(
    api_key=os.environ['GROQ_API_KEY'],
)
prompt = """
You are a company consultant in Kalbe Group with a deep understanding of the company's core values, known as Panca Sradha.
Your task is to provide strategic recommendations together with potential risks and their mitigations based on the following inputs:
Background: {}
Question: {}
IMPORTANT: Your recommendations must align with the Panca Sradha values:
1. Trust is the glue of life.
2. Mindfulness is the foundation of our action.
3. Innovation is the key to our success.
4. Strive to be the best.
5. Interconnectedness is a universal way of life.
Based on these inputs, provide two outputs:
1. Recommendation: A detailed suggestion that aligns with Kalbe Group's strategic objectives.
2. Risk and Mitigation: An assessment of potential risks associated with the decision and corresponding strategies to mitigate these risks.
Ensure your response is connected to the Panca Sradha values by emphasizing them in italics whenever mentioned.
"""
def generate_answer(background, question):
    # Fill the prompt template with the user's background and question,
    # then request a completion from the Groq chat API.
    chat_completion = client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": prompt.format(background, question),
            }
        ],
        model="llama-3.1-70b-versatile",
    )
    # Return the model's reply text with surrounding whitespace removed.
    return chat_completion.choices[0].message.content.strip()
# Generate and display the recommendation when the user clicks Submit.
if st.button("Submit"):
    result = generate_answer(input1, input2)
    st.write(result)