add chatbot langchain recipe #33

Merged
merged 1 commit on Feb 2, 2024
15 changes: 15 additions & 0 deletions chatbot-langchain/README.md
@@ -0,0 +1,15 @@
# Streamlit + LangChain ChatBot Demo

### Build image
```bash
cd chatbot-langchain
podman build -t stchat . -f builds/Containerfile
```
### Run image locally

Make sure your model service is up and running before starting this container image.
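For example, depending on how the `../playground` model service image was built, starting it might look roughly like this (the image name and model mount below are illustrative placeholders, not the recipe's actual values):

```bash
# Illustrative only: substitute your model-service image and model file
podman run -it -p 8001:8001 -v ./models:/models <model-service-image>
```

Then start the chat UI container: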


```bash
podman run -it -p 8501:8501 -e MODEL_SERVICE_ENDPOINT=http://10.88.0.1:8001/v1 stchat
```
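
Here `10.88.0.1` is typically the gateway address of podman's default bridge network, which lets the container reach a model service published on the host; adjust the endpoint for your environment. To sanity-check that the model service is reachable before launching the UI, assuming it exposes the standard OpenAI-compatible routes, something like:

```bash
# Should return the list of models served by the OpenAI-compatible endpoint
curl http://localhost:8001/v1/models
```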
20 changes: 20 additions & 0 deletions chatbot-langchain/ai-studio.yaml
@@ -0,0 +1,20 @@
application:
  type: language
  name: ChatBot_Streamlit
  description: This is a Streamlit chat demo application.
  containers:
    - name: llamacpp-server
      contextdir: ../playground
      containerfile: Containerfile
      model-service: true
      backend:
        - llama
      arch:
        - arm64
        - amd64
    - name: streamlit-chat-app
      contextdir: .
      containerfile: builds/Containerfile
      arch:
        - arm64
        - amd64
8 changes: 8 additions & 0 deletions chatbot-langchain/builds/Containerfile
@@ -0,0 +1,8 @@
FROM registry.access.redhat.com/ubi9/python-39:latest
WORKDIR /chat
COPY builds/requirements.txt .
RUN pip install --upgrade pip
RUN pip install --no-cache-dir --upgrade -r /chat/requirements.txt
COPY chatbot_ui.py .
EXPOSE 8501
ENTRYPOINT [ "streamlit", "run", "chatbot_ui.py" ]
3 changes: 3 additions & 0 deletions chatbot-langchain/builds/requirements.txt
@@ -0,0 +1,3 @@
langchain
langchain_openai
streamlit
39 changes: 39 additions & 0 deletions chatbot-langchain/chatbot_ui.py
@@ -0,0 +1,39 @@
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain_community.callbacks import StreamlitCallbackHandler
from langchain_core.prompts import ChatPromptTemplate
import streamlit as st
import os


# Endpoint of the OpenAI-compatible model service backing the chat
model_service = os.getenv("MODEL_SERVICE_ENDPOINT",
                          "http://localhost:8001/v1")

st.title("💬 Chatbot")
if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant",
                                     "content": "How can I help you?"}]

# Replay the conversation history on every Streamlit rerun
for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

llm = ChatOpenAI(base_url=model_service,
                 api_key="sk-no-key-required",
                 streaming=True,
                 callbacks=[StreamlitCallbackHandler(st.container(),
                                                     expand_new_thoughts=True,
                                                     collapse_completed_thoughts=True)])
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are world class technical advisor."),
    ("user", "{input}")
])

chain = LLMChain(llm=llm, prompt=prompt)

if prompt := st.chat_input():
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").markdown(prompt)
    response = chain.invoke(prompt)
    st.chat_message("assistant").markdown(response["text"])
    st.session_state.messages.append({"role": "assistant", "content": response["text"]})
    st.rerun()
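
For iterating on the UI without rebuilding the image, a rough sketch of a local run (assuming the packages from `builds/requirements.txt` are installed and a model service is listening on port 8001):

```bash
pip install -r builds/requirements.txt
MODEL_SERVICE_ENDPOINT=http://localhost:8001/v1 streamlit run chatbot_ui.py
```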