%pip install llama-index-llms-premai
from llama_index.llms.premai import PremAI
from llama_index.core.llms import ChatMessage
import os
import getpass
if os.environ.get("PREMAI_API_KEY") is None:
    os.environ["PREMAI_API_KEY"] = getpass.getpass("PremAI API Key:")
prem_chat = PremAI(project_id=8)
Chat Completions

Everything is set up now, and we can start interacting with the application. Let's start by building a simple chat request and response using llama-index.
messages = [
    ChatMessage(role="user", content="What is your name"),
    ChatMessage(
        role="user", content="Write an essay about your school in 500 words"
    ),
]
Please note: you can provide a system prompt through ChatMessage, like this:
messages = [
    ChatMessage(role="system", content="Act like a pirate"),
    ChatMessage(role="user", content="What is your name"),
    ChatMessage(
        role="user", content="Where do you live, write an essay in 500 words"
    ),
]
You can also set a system prompt when initializing the client:
chat = PremAI(project_id=8, system_prompt="Act like nemo fish")
In both cases, you override the system prompt that was fixed when the application was deployed from the platform. Note in particular that if you override the system prompt while instantiating the PremAI class, a system message inside ChatMessage will have no effect.

So if you want to override the system prompt for an experimental use case, either provide it while instantiating the client, or write it in a ChatMessage with the system role.
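The precedence rule above can be sketched in plain Python. This is a conceptual illustration only, not Prem's actual implementation; the function name and the `<launchpad default>` placeholder are hypothetical:

```python
# Conceptual sketch (not Prem's code) of the system-prompt precedence
# described above: a system_prompt set on the client wins over any
# system-role message in the message list.
def effective_system_prompt(client_system_prompt, messages):
    # A client-level system prompt takes priority.
    if client_system_prompt is not None:
        return client_system_prompt
    # Otherwise fall back to the first system-role message, if any.
    for role, content in messages:
        if role == "system":
            return content
    # Neither given: the prompt fixed in the launchpad applies.
    return "<launchpad default>"

print(effective_system_prompt("Act like nemo fish", [("system", "Act like a pirate")]))
# → Act like nemo fish
```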
Now let's call the model.
response = prem_chat.chat(messages)
print(response)
[ChatResponse(message=ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content="I'm here to assist you with any questions or tasks you have, but I'm not able to write essays. However, if you need help brainstorming ideas or organizing your thoughts for your essay about your school, I'd be happy to help with that. Just let me know how I can assist you further!", additional_kwargs={}), raw={'role': <RoleEnum.ASSISTANT: 'assistant'>, 'content': "I'm here to assist you with any questions or tasks you have, but I'm not able to write essays. However, if you need help brainstorming ideas or organizing your thoughts for your essay about your school, I'd be happy to help with that. Just let me know how I can assist you further!"}, delta=None, additional_kwargs={})]
You can also turn the chat method into a completion. Here is how it works:
completion = prem_chat.complete("Paul Graham is ")
First, we initialize our repositories by defining some repository ids. Make sure that these are valid repository ids. You can learn more about how to get a repository id through this link.

query = "what is the diameter of individual Galaxy"
repository_ids = [
    1991,
]
repositories = dict(ids=repository_ids, similarity_threshold=0.3, limit=3)

Please note: similar to passing the model_name argument, passing the repositories argument may override the repositories already connected in the launchpad.
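To build intuition for what `similarity_threshold` and `limit` mean, here is a rough, purely illustrative sketch; the real retrieval happens on the Prem side, and the chunks and scores below are made up:

```python
# Illustrative only: what similarity_threshold and limit mean conceptually.
def filter_chunks(scored_chunks, similarity_threshold, limit):
    # Keep chunks scoring at or above the threshold, best score first,
    # capped at `limit` results.
    kept = [c for c in scored_chunks if c[1] >= similarity_threshold]
    kept.sort(key=lambda c: c[1], reverse=True)
    return kept[:limit]

chunks = [
    ("galaxy diameters range widely", 0.82),
    ("stellar ages", 0.41),
    ("unrelated note", 0.27),
    ("the Milky Way spans ~100,000 light-years", 0.64),
]
print(filter_chunks(chunks, similarity_threshold=0.3, limit=3))
```

With a threshold of 0.3 the third chunk is dropped, and the remaining three are returned best-first.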
Next, we connect the repositories with the chat object to enable RAG-based generation.
messages = [
    ChatMessage(role="user", content=query),
]
response = prem_chat.chat(messages, repositories=repositories)
print(response)
This also means that you do not need to build your own RAG pipeline when using the Prem platform. Prem uses its own RAG technology to deliver best-in-class performance for retrieval-augmented generation.

Ideally, you do not need to connect repository ids here to get retrieval-augmented generation. You can still get the same result if you have already connected the repositories in the Prem platform.
Prem Templates
Writing prompt templates can get messy. Prompt templates are often long, hard to manage, and must be continuously tweaked to improve them and keep them consistent throughout the application.

With Prem, writing and managing prompts is super easy. The Templates tab inside the launchpad lets you create as many templates as you need and use them inside your application through the SDK. You can read more about prompt templates in this documentation.

To use Prem templates natively with llama-index, you need to pass a dictionary where the key is id and the value is the template variable. These key-value pairs should be passed in the additional_kwargs argument of ChatMessage.

For example, say your prompt template looks like this:
Say hello to my name and say a feel-good quote
from my age. My name is: {name} and age is {age}
Then your messages should look like this:
messages = [
    ChatMessage(
        role="user", content="Shawn", additional_kwargs={"id": "name"}
    ),
    ChatMessage(role="user", content="22", additional_kwargs={"id": "age"}),
]
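To see how those messages line up with the template, here is a hypothetical illustration using plain `str.format`; the actual substitution is performed by the Prem platform, not locally:

```python
# Illustration only: how each message's additional_kwargs["id"] maps a
# template variable to that message's content. Prem does this server-side.
template = (
    "Say hello to my name and say a feel-good quote "
    "from my age. My name is: {name} and age is {age}"
)

# Each ChatMessage above contributes one pair: id -> content.
variables = {"name": "Shawn", "age": "22"}
print(template.format(**variables))
# → Say hello to my name and say a feel-good quote from my age. My name is: Shawn and age is 22
```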
Pass these messages to the llama-index PremAI client. Please note: do not forget to also pass the template_id argument so that generation uses the Prem template. If you are not familiar with template_id, you can learn more about it in our documentation. Here is an example:
template_id = "78069ce8-xxxxx-xxxxx-xxxx-xxx"
response = prem_chat.chat(messages, template_id=template_id)
Prem templates are also available for streaming.
Streaming

In this section, we will see how to stream tokens using llama-index and PremAI. It works very similarly to the methods above. Here is how you do it:
streamed_response = prem_chat.stream_chat(messages)

for response_delta in streamed_response:
    print(response_delta.delta, end="")
I'm here to assist you with writing tasks, but I don't have personal experiences or attend school. However, I can help you brainstorm ideas, outline your essay, or provide information on various school-related topics. Just let me know how I can assist you further!
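If you want the full reply as well as the live stream, you can accumulate the deltas as they arrive. A minimal sketch with a stand-in generator (`fake_stream` is hypothetical, not part of the SDK):

```python
# Minimal sketch: a fake generator stands in for stream_chat; each item
# is a small text delta, and the deltas concatenate into the full reply.
def fake_stream(parts):
    for part in parts:
        yield part  # a real response delta would wrap text like this

full_reply = ""
for delta in fake_stream(["Hel", "lo", "!"]):
    print(delta, end="")  # print as it streams
    full_reply += delta   # and keep the accumulated text
# full_reply is now "Hello!"
```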
This will stream tokens one after another. Similar to complete, we also have a stream_complete method to stream tokens for completions.
# This will stream tokens one by one
streamed_response = prem_chat.stream_complete("hello how are you")

for response_delta in streamed_response:
    print(response_delta.delta, end="")
Hello! I'm here and ready to assist you. How can I help you today?