ill*_*ato 5 · python · streamlit · langchain
I have a streaming chatbot that works well, but it doesn't remember previous chat history. I tried adding memory with LangChain's ConversationBufferMemory, but it doesn't seem to work.

Here is a sample of the chatbot I created:
import streamlit as st
from streamlit_chat import message
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.chat_models import AzureChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import (
    ChatPromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate
)

prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template("The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know."),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}")
])

def load_chain(prompt):
    """Logic for loading the chain you want to use should go here."""
    llm = AzureChatOpenAI(
        deployment_name='gpt-35-turbo',
        model_name='gpt-35-turbo',
        temperature=0,
        openai_api_key='.....',
        openai_api_base='.....',
        openai_api_version="2023-05-15",
        openai_api_type="azure"
    )
    memory = ConversationBufferMemory(return_messages=True)
    chain = ConversationChain(
        llm=llm,
        verbose=True,
        prompt=prompt,
        memory=memory
    )
    return chain

chain = load_chain(prompt)

# From here down is all the Streamlit UI.
st.set_page_config(page_title="LangChain Demo", page_icon=":robot:")
st.header("LangChain Demo")

if "generated" not in st.session_state:
    st.session_state["generated"] = []
if "past" not in st.session_state:
    st.session_state["past"] = []
if "history" not in st.session_state:
    st.session_state["history"] = []

def get_text():
    input_text = st.text_input("You: ", "Hello, how are you?", key="input")
    return input_text

user_input = get_text()

if user_input:
    output = chain.run(input=user_input, history=st.session_state["history"])
    st.session_state["history"].append((user_input, output))
    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)
    st.write(st.session_state["history"])

if st.session_state["generated"]:
    for i in range(len(st.session_state["generated"]) - 1, -1, -1):
        message(st.session_state["generated"][i], key=str(i))
        message(st.session_state["past"][i], is_user=True, key=str(i) + "_user")
It looks like the bot is ignoring the ConversationBufferMemory for some reason. Any help would be appreciated.
小智 · 1
@hiper2d's answer already explains it well. I spent the last hour facing the same problem; in hindsight, I should have known.

Streamlit re-runs the entire Python script from top to bottom whenever anything on screen needs to update. That means a fresh ConversationBufferMemory is created on every refresh. If we have to store it in session state anyway, I don't see much use for the memory abstraction.
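The rerun behaviour can be illustrated in plain Python (a hypothetical simulation: the dict stands in for `st.session_state`, the list for `ConversationBufferMemory`; no Streamlit involved):

```python
# Streamlit's execution model, simulated: each user interaction re-runs
# the whole script, so module-level objects are rebuilt from scratch,
# while anything kept in session_state survives across reruns.

session_state = {}  # stand-in for st.session_state; persists between "reruns"

def run_script():
    # Rebuilt on every rerun, like a chain created at module level:
    memory = []
    # Survives reruns, like values stored in st.session_state:
    if "history" not in session_state:
        session_state["history"] = []

    memory.append("hi")
    session_state["history"].append("hi")
    return memory, session_state["history"]

mem1, hist1 = run_script()  # first "rerun"
mem2, hist2 = run_script()  # second "rerun": memory reset, history kept
```

After two simulated reruns, `memory` still holds one entry (it was reset), while `history` holds two.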
However, if you do want to do it, you can set up session state like this:
if "messages" not in st.session_state:
    st.session_state.messages = []

# While processing a turn
st.session_state.messages.append({"role": "assistant", "content": output_from_llm})
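A transcript stored as role/content dicts like this can be flattened back into prompt text on each rerun. A minimal sketch (the `to_transcript` helper and sample messages are hypothetical, not part of LangChain's API):

```python
# Hypothetical example messages, in the same {"role": ..., "content": ...}
# shape the answer stores in st.session_state.messages:
messages = [
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing well, thanks!"},
    {"role": "user", "content": "What did I just ask you?"},
]

def to_transcript(messages):
    """Flatten role/content dicts into a Human/AI transcript string
    that can be injected into a prompt as conversation history."""
    labels = {"user": "Human", "assistant": "AI"}
    return "\n".join(f"{labels[m['role']]}: {m['content']}" for m in messages)

print(to_transcript(messages))
```

Because the dicts survive in session state, rebuilding the transcript on every rerun gives the model the full history even though everything else is recreated from scratch.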
Viewed: 2106 times