How to combine ConversationalRetrievalQAChain, agents, and tools in LangChain

Joh*_*rug 9 agent openai-api langchain langchain-js

I'd like to combine a ConversationalRetrievalQAChain with tools in LangChain — for example, SerpAPI.

I use a ConversationalRetrievalQAChain to search product PDFs that were ingested using the OpenAI embeddings API and a local Chroma vector database. That works well. However, the product PDFs don't contain up-to-date pricing information. So when a user asks about pricing, I'd like LangChain to use the SerpAPI tool to Google for the price. I have both parts working separately, but I'd love to combine them.

Here is the document-search part (keep in mind: this is PoC-quality code):

// Imports (paths assume langchain 0.0.x, the Vercel AI SDK ('ai'), and a SvelteKit route)
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { OpenAIEmbeddings } from 'langchain/embeddings/openai';
import { Chroma } from 'langchain/vectorstores/chroma';
import { ConversationalRetrievalQAChain } from 'langchain/chains';
import { PromptTemplate } from 'langchain/prompts';
import { CallbackManager } from 'langchain/callbacks';
import { LangChainStream, StreamingTextResponse, type Message } from 'ai';
import { OPENAI_API_KEY } from '$env/static/private';
import type { RequestHandler } from './$types';

// Prompt used to rephrase/condense the question
const CONDENSE_PROMPT = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.

Chat History:
{chat_history}

Follow Up Input: {question}

Standalone question:`;

// Prompt for the actual question
const QA_PROMPT = `You are a helpful AI assistant for sales reps to answer questions about product features and technical specifications.
Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say you don't know. DO NOT try to make up an answer.
If the question is not related to the context, politely respond that you are tuned to only answer questions that are related to the context.

{context}

Question: {question}
Helpful answer:`;

export const POST: RequestHandler = async ({ request }) => {
  const { messages } = await request.json();
  const { stream, handlers } = LangChainStream();

  const openAIApiKey = OPENAI_API_KEY;
  const embeddings = new OpenAIEmbeddings({ openAIApiKey });

  // This model is used to answer the actual question
  const model = new ChatOpenAI({
    openAIApiKey,
    temperature: 0,
    streaming: true,
  });

  // This model is used to rephrase the question based on the chat history
  const nonStreamingModel = new ChatOpenAI({
    openAIApiKey,
    temperature: 0,
  });

  const store = await Chroma.fromExistingCollection(embeddings, {
    collectionName: 'langchain',
  });

  const chain = ConversationalRetrievalQAChain.fromLLM(
    model,
    store.asRetriever(),
    {
      returnSourceDocuments: true,
      verbose: false,
      qaChainOptions: {
        type: "stuff",
        prompt: PromptTemplate.fromTemplate(QA_PROMPT)
      },
      questionGeneratorChainOptions: {
        template: CONDENSE_PROMPT,
        llm: nonStreamingModel,
      },
    }
  );

  const callbacks = CallbackManager.fromHandlers(handlers);
  const latest = (messages as Message[]).at(-1)!.content;

  const chatHistory = (messages as Message[])
    .map((m) => `${m.role}: ${m.content}`)
    .join('\n');

  chain
    .call({ question: latest, chat_history: chatHistory }, callbacks)
    .catch(console.error);

  return new StreamingTextResponse(stream);
};

And here is the SerpAPI tool code:

// Imports (paths assume langchain 0.0.x, the Vercel AI SDK ('ai'), and a SvelteKit route)
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { SerpAPI } from 'langchain/tools';
import { ChatAgent, AgentExecutor } from 'langchain/agents';
import { CallbackManager } from 'langchain/callbacks';
import { LangChainStream, StreamingTextResponse, type Message } from 'ai';
import { OPENAI_API_KEY, SERPAPI_API_KEY } from '$env/static/private';
import type { RequestHandler } from './$types';

export const POST: RequestHandler = async ({ request }) => {
  const { messages } = await request.json();
  const { stream, handlers } = LangChainStream();

  const openAIApiKey = OPENAI_API_KEY;

  // This model is used to answer the actual question
  const model = new ChatOpenAI({
    openAIApiKey,
    temperature: 0,
    streaming: true,
  });

  // Define the list of tools the agent can use
  const tools = [
    new SerpAPI(SERPAPI_API_KEY, {
      location: "Austin,Texas,United States",
      hl: "en",
      gl: "us",
    }),
  ];

  // Create the agent from the chat model and the tools
  const agent = ChatAgent.fromLLMAndTools(model, tools);

  // Create an executor, which calls to the agent until an answer is found
  const executor = AgentExecutor.fromAgentAndTools({ agent, tools });

  const callbacks = CallbackManager.fromHandlers(handlers);
  const latest = (messages as Message[]).at(-1)!.content;

  executor.call({ input: latest }, callbacks).catch(console.error);

  return new StreamingTextResponse(stream);
};

On their own, both work fine. How do I combine them?

Here is the workflow I have in mind:

  1. A user question comes in.
  2. LangChain decides whether the question requires an internet search.
  3. If it does, use the SerpAPI tool to search and respond.
  4. If no internet search is needed, retrieve similar chunks from the vector database, build the prompt, and ask OpenAI.
  5. Either path should end up in the chat history, and both should probably make use of it, so the user can ask "What does it cost?" and the implied product is known from the chat history.
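Step 5's shared history can be kept as a plain flattened string; a hypothetical helper (`formatChatHistory`, mirroring the `.map(...).join('\n')` expression in the question's code) makes the expected format explicit:

```typescript
// Minimal message shape matching what the Vercel AI SDK's `Message` provides
interface ChatMessage {
  role: string;
  content: string;
}

// Flatten the message array into the {chat_history} string the
// CONDENSE_PROMPT expects: one "role: content" line per turn.
function formatChatHistory(messages: ChatMessage[]): string {
  return messages.map((m) => `${m.role}: ${m.content}`).join('\n');
}

const history = formatChatHistory([
  { role: 'user', content: 'Tell me about the X100 sensor.' },
  { role: 'assistant', content: 'The X100 uses a 20MP sensor.' },
  { role: 'user', content: 'How much does it cost?' },
]);
console.log(history);
// user: Tell me about the X100 sensor.
// assistant: The X100 uses a 20MP sensor.
// user: How much does it cost?
```

Whichever path answers the question, appending its turn to this string gives the condense step enough context to resolve follow-ups like "How much does it cost?".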

Is that possible? Thank you!

Joh*_*rug 0

My good friend Justin pointed me in the right direction. LangChain has "retrieval agents". The idea is that the vector-database-backed retriever is just another tool available to the LLM. So in my example, you would have one tool for retrieving relevant data and another tool for performing the internet search. Based on the user's input, the LLM chooses which tool to use.

More details and sample code can be found here: https://js.langchain.com/docs/use_cases/question_answering/conversational_retrieval_agents
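The linked docs boil down to roughly the following sketch. It assumes langchain's `createRetrieverTool` and `createConversationalRetrievalAgent` helpers from `langchain/agents/toolkits` (available in langchain 0.0.x; treat the exact import paths as version-dependent), and reuses the Chroma store and SerpAPI config from the question:

```typescript
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { OpenAIEmbeddings } from 'langchain/embeddings/openai';
import { Chroma } from 'langchain/vectorstores/chroma';
import { SerpAPI } from 'langchain/tools';
import {
  createRetrieverTool,
  createConversationalRetrievalAgent,
} from 'langchain/agents/toolkits';

const embeddings = new OpenAIEmbeddings({ openAIApiKey: process.env.OPENAI_API_KEY });
const store = await Chroma.fromExistingCollection(embeddings, {
  collectionName: 'langchain',
});

// Tool 1: the vector-store retriever, wrapped as a tool. The name and
// description are what the LLM uses to decide when to pick this tool
// over the search tool.
const retrieverTool = createRetrieverTool(store.asRetriever(), {
  name: 'search_product_docs',
  description:
    'Searches the ingested product PDFs. Use for questions about product features and technical specifications.',
});

// Tool 2: SerpAPI for anything not in the PDFs, e.g. current pricing.
const searchTool = new SerpAPI(process.env.SERPAPI_API_KEY, {
  location: 'Austin,Texas,United States',
  hl: 'en',
  gl: 'us',
});

const model = new ChatOpenAI({ temperature: 0 });

// The resulting executor manages its own chat history, so follow-ups
// like "What does it cost?" resolve against earlier turns regardless
// of which tool produced the earlier answer.
const executor = await createConversationalRetrievalAgent(model, [
  retrieverTool,
  searchTool,
]);

const result = await executor.call({ input: 'What does the X100 cost today?' });
console.log(result.output);
```

This covers the imagined workflow in one place: the agent routes each question to the PDF retriever or to SerpAPI, and both paths share the same conversation memory.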