Chat Engine - ReAct Agent Mode
ReAct is an agent-based chat mode built on top of a query engine over your data.
For each chat interaction, the agent enters a ReAct loop:
- first decide whether to use the query engine tool and come up with an appropriate input
- (optionally) use the query engine tool and observe its output
- decide whether to repeat the loop or give a final response
This approach is flexible, since the agent can choose on its own whether to query the knowledge base.
However, its performance also depends more heavily on the quality of the LLM.
You may need to nudge it more to make sure it chooses to query the knowledge base at the right times, instead of hallucinating an answer.
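The loop above can be sketched in plain Python. Here the LLM and the query engine tool are replaced with stub functions; `fake_llm`, `query_tool`, and `react_chat` are illustrative names for this sketch, not LlamaIndex APIs:

```python
def query_tool(question: str) -> str:
    """Stand-in for the query engine tool over the indexed documents."""
    return "In 1995 Paul Graham started working on Viaweb."


def fake_llm(history: list[str]) -> str:
    """Stand-in for the LLM: request the tool once, then answer from the observation."""
    observations = [line for line in history if line.startswith("Observation:")]
    if not observations:
        return "Action: query_engine_tool\nAction Input: summer of 1995"
    return "Response: " + observations[-1].removeprefix("Observation: ")


def react_chat(user_message: str, max_steps: int = 5) -> str:
    history = [f"User: {user_message}"]
    for _ in range(max_steps):
        step = fake_llm(history)  # 1. decide: call the tool, or answer directly
        if step.startswith("Action:"):
            tool_input = step.split("Action Input:", 1)[1].strip()
            history.append(step)
            # 2. run the tool and record the observation
            history.append("Observation: " + query_tool(tool_input))
        else:
            # 3. the model chose to give a final response
            return step.removeprefix("Response: ")
    return "Gave up after too many steps."


print(react_chat("What did Paul Graham do in the summer of 1995?"))
```

A real agent would also let the LLM answer without the tool at all, which is exactly why the guidance above matters.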
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
In [ ]:
%pip install llama-index-llms-anthropic
%pip install llama-index-llms-openai
In [ ]:
!pip install llama-index
Download Data
In [ ]:
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
Get Started in 5 Lines of Code
Load data and build an index
In [ ]:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI
from llama_index.llms.anthropic import Anthropic
llm = OpenAI()
data = SimpleDirectoryReader(input_dir="./data/paul_graham/").load_data()
index = VectorStoreIndex.from_documents(data)
Configure chat engine
In [ ]:
chat_engine = index.as_chat_engine(chat_mode="react", llm=llm, verbose=True)
Chat with your data
In [ ]:
response = chat_engine.chat(
    "Use the tool to answer what did Paul Graham do in the summer of 1995?"
)
Thought: I need to use a tool to help me answer the question.
Action: query_engine_tool
Action Input: {'input': 'What did Paul Graham do in the summer of 1995?'}
Observation: In the summer of 1995, Paul Graham worked on building a web application for making web applications. He recruited Dan Giffin, who had worked for Viaweb, and two undergrads who wanted summer jobs, and they got to work trying to build what it's now clear is about twenty companies and several open source projects worth of software. The language for defining applications would of course be a dialect of Lisp.
Response: In the summer of 1995, Paul Graham worked on building a web application for making web applications. He recruited Dan Giffin, who had worked for Viaweb, and two undergrads who wanted summer jobs, and they got to work trying to build what it's now clear is about twenty companies and several open source projects worth of software. The language for defining applications would of course be a dialect of Lisp.
In [ ]:
print(response)
In the summer of 1995, Paul Graham worked on building a web application for making web applications. He recruited Dan Giffin, who had worked for Viaweb, and two undergrads who wanted summer jobs, and they got to work trying to build what it's now clear is about twenty companies and several open source projects worth of software. The language for defining applications would of course be a dialect of Lisp.
Customize LLM
Use Anthropic ("claude-2")
In [ ]:
llm = Anthropic()
Configure chat engine
In [ ]:
chat_engine = index.as_chat_engine(llm=llm, chat_mode="react", verbose=True)
In [ ]:
response = chat_engine.chat("what did Paul Graham do in the summer of 1995?")
Thought: I need to use a tool to help me answer the question.
Action: query_engine_tool
Action Input: {'input': 'what did Paul Graham do in the summer of 1995?'}
Observation: Based on the context, in the summer of 1995 Paul Graham:
- Painted a second still life using the same objects he had used for a previous still life painting.
- Looked for an apartment to buy in New York, trying to find a neighborhood similar to Cambridge, MA.
- Realized there wasn't really a "Cambridge of New York" after visiting the actual Cambridge.
The passage does not mention what Paul Graham did in the summer of 1995 specifically. It talks about painting a second still life at some point and looking for an apartment in New York at some point, but it does not connect those events to the summer of 1995.
Response: The passage does not provide enough information to know specifically what Paul Graham did in the summer of 1995. It mentions some activities like painting and looking for an apartment in New York, but does not say these occurred specifically in the summer of 1995.
In [ ]:
print(response)
The passage does not provide enough information to know specifically what Paul Graham did in the summer of 1995. It mentions some activities like painting and looking for an apartment in New York, but does not say these occurred specifically in the summer of 1995.
In [ ]:
response = chat_engine.chat("What did I ask you before?")
Response: You asked me "what did Paul Graham do in the summer of 1995?".
In [ ]:
print(response)
You asked me "what did Paul Graham do in the summer of 1995?".
Reset chat engine
In [ ]:
chat_engine.reset()
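Conceptually, resetting clears the engine's chat history so earlier turns no longer influence later ones. A toy sketch of that behavior (`ToyChatEngine` is illustrative, not the LlamaIndex implementation):

```python
class ToyChatEngine:
    """Minimal stand-in showing how chat memory and reset() interact."""

    def __init__(self):
        self.history: list[str] = []  # stored user messages

    def chat(self, message: str) -> str:
        self.history.append(message)
        if len(self.history) > 1:
            # with memory, the engine can refer back to the previous turn
            return f'You asked me "{self.history[-2]}".'
        return "This seems to be the start of a new conversation."

    def reset(self) -> None:
        # forget everything: the next chat() starts fresh
        self.history.clear()


engine = ToyChatEngine()
engine.chat("what did Paul Graham do in the summer of 1995?")
print(engine.chat("What did I ask you before?"))
engine.reset()
print(engine.chat("What did I ask you before?"))
```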
In [ ]:
response = chat_engine.chat("What did I ask you before?")
Response: I'm afraid I don't have any context about previous questions in our conversation. This seems to be the start of a new conversation between us.
In [ ]:
print(response)
I'm afraid I don't have any context about previous questions in our conversation. This seems to be the start of a new conversation between us.