
Chat Stores#

A chat store serves as a centralized interface to store your chat history. Chat history is unique compared to other storage formats, since the order of messages is important for maintaining an overall conversation.

Chat stores can organize sequences of chat messages by keys (like user_ids or other unique identifiable strings), and handle delete, insert, and get operations.
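That key-based contract can be sketched with a plain in-memory mapping. This is a simplified stand-in for illustration, not the actual implementation; the method names mirror the chat store interface, but plain strings stand in for ChatMessage objects:

```python
from collections import defaultdict

# Hypothetical sketch of the key-based contract (not the actual
# SimpleChatStore implementation): messages are grouped by a key,
# e.g. a user id, and kept in insertion order.
class InMemoryChatStore:
    def __init__(self) -> None:
        self._store: dict[str, list[str]] = defaultdict(list)

    def set_messages(self, key: str, messages: list[str]) -> None:
        self._store[key] = list(messages)

    def get_messages(self, key: str) -> list[str]:
        return list(self._store.get(key, []))

    def add_message(self, key: str, message: str) -> None:
        self._store[key].append(message)

    def delete_messages(self, key: str) -> None:
        self._store.pop(key, None)

store = InMemoryChatStore()
store.set_messages("user1", ["Hello"])
store.add_message("user1", "Hi there!")
print(store.get_messages("user1"))  # → ['Hello', 'Hi there!']
```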

SimpleChatStore#

The most basic chat store is SimpleChatStore, which stores messages in memory, can save to/from disk, or can be serialized and stored somewhere else.

Usually, you will instantiate a chat store and give it to a memory module. Memory modules that use chat stores will default to using SimpleChatStore if one is not provided.

from llama_index.core.storage.chat_store import SimpleChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = SimpleChatStore()

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

Once you have the memory created, you can include it in an agent or chat engine:

agent = OpenAIAgent.from_tools(tools, memory=chat_memory)
# or
chat_engine = index.as_chat_engine(memory=chat_memory)

To save the chat store for later, you can either save/load from disk:

chat_store.persist(persist_path="chat_store.json")
loaded_chat_store = SimpleChatStore.from_persist_path(
    persist_path="chat_store.json"
)

Or you can convert to/from a string, saving the string somewhere else along the way:

chat_store_string = chat_store.json()
loaded_chat_store = SimpleChatStore.parse_raw(chat_store_string)
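For example, the serialized string can be stashed in any external store you already run. Here is a minimal sketch using stdlib sqlite3; the table name, key, and JSON payload are made-up illustrations standing in for the real output of chat_store.json():

```python
import sqlite3

# Sketch: stash the serialized chat store in a key-value table.
# chat_store_string below is a stand-in for the JSON produced by
# chat_store.json() above.
chat_store_string = '{"store": {"user1": []}, "class_name": "SimpleChatStore"}'

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE chat_stores (key TEXT PRIMARY KEY, payload TEXT)")
conn.execute(
    "INSERT OR REPLACE INTO chat_stores VALUES (?, ?)",
    ("user1", chat_store_string),
)

# Later, read the string back and hand it to SimpleChatStore.parse_raw(...)
(loaded_string,) = conn.execute(
    "SELECT payload FROM chat_stores WHERE key = ?", ("user1",)
).fetchone()
print(loaded_string == chat_store_string)  # → True
```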

UpstashChatStore#

Using UpstashChatStore, you can store your chat history remotely with Upstash Redis, which offers a serverless Redis solution, ideal for applications requiring scalable and efficient chat storage. This chat store supports both synchronous and asynchronous operations.

Installation#

pip install llama-index-storage-chat-store-upstash

Usage#

from llama_index.storage.chat_store.upstash import UpstashChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = UpstashChatStore(
    redis_url="YOUR_UPSTASH_REDIS_URL",
    redis_token="YOUR_UPSTASH_REDIS_TOKEN",
    ttl=300,  # Optional: Time to live in seconds
)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

UpstashChatStore supports both synchronous and asynchronous operations. Here is an example of using the asynchronous methods:

import asyncio
from llama_index.core.llms import ChatMessage


async def main():
    # Add messages
    messages = [
        ChatMessage(content="Hello", role="user"),
        ChatMessage(content="Hi there!", role="assistant"),
    ]
    await chat_store.async_set_messages("conversation1", messages)

    # Retrieve messages
    retrieved_messages = await chat_store.async_get_messages("conversation1")
    print(retrieved_messages)

    # Delete the last message
    deleted_message = await chat_store.async_delete_last_message(
        "conversation1"
    )
    print(f"Deleted message: {deleted_message}")


asyncio.run(main())

RedisChatStore#

Using RedisChatStore, you can store your chat history remotely, without having to worry about manually persisting and loading the chat history.

from llama_index.storage.chat_store.redis import RedisChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = RedisChatStore(redis_url="redis://localhost:6379", ttl=300)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)
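The ttl argument above expires a conversation after the given number of seconds. Those semantics can be sketched as follows; this is a hypothetical stand-in for illustration, not the actual Redis-backed implementation:

```python
import time

# Hypothetical sketch of TTL semantics (not the actual Redis-backed
# implementation): each key records an expiry timestamp, and reads after
# expiry behave as if the history was deleted.
class TTLStore:
    def __init__(self, ttl: float) -> None:
        self.ttl = ttl
        self._data: dict[str, tuple[float, list[str]]] = {}

    def set(self, key: str, messages: list[str]) -> None:
        self._data[key] = (time.monotonic() + self.ttl, list(messages))

    def get(self, key: str) -> list[str]:
        entry = self._data.get(key)
        if entry is None or time.monotonic() >= entry[0]:
            self._data.pop(key, None)
            return []
        return list(entry[1])

store = TTLStore(ttl=300)  # entries expire after 300 seconds
store.set("user1", ["Hello"])
print(store.get("user1"))  # → ['Hello']
```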

AzureChatStore#

Using AzureChatStore, you can store your chat history remotely in Azure Table Storage or CosmosDB, without having to worry about manually persisting and loading the chat history.

pip install llama-index
pip install llama-index-llms-azure-openai
pip install llama-index-storage-chat-store-azure

from llama_index.core import Settings
from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.storage.chat_store.azure import AzureChatStore

chat_store = AzureChatStore.from_account_and_key(
    account_name="",
    account_key="",
    chat_table_name="ChatUser",
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="conversation1",
)

chat_engine = SimpleChatEngine(
    memory=memory, llm=Settings.llm, prefix_messages=[]
)

response = chat_engine.chat("Hello.")

DynamoDBChatStore#

Using DynamoDBChatStore, you can store your chat history in AWS DynamoDB.

Installation#

pip install llama-index-storage-chat-store-dynamodb

Usage#

Ensure you have a DynamoDB table created with the appropriate schema. Here is an example of the default:

import boto3

# Get the service resource
dynamodb = boto3.resource("dynamodb")

# Create the DynamoDB table
table = dynamodb.create_table(
    TableName="EXAMPLE_TABLE",
    KeySchema=[{"AttributeName": "SessionId", "KeyType": "HASH"}],
    AttributeDefinitions=[
        {"AttributeName": "SessionId", "AttributeType": "S"}
    ],
    BillingMode="PAY_PER_REQUEST",
)

Then, you can use the DynamoDBChatStore class to persist and retrieve chat histories:

import os
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.storage.chat_store.dynamodb.base import DynamoDBChatStore

# Initialize DynamoDB chat store
chat_store = DynamoDBChatStore(
    table_name="EXAMPLE_TABLE", profile_name=os.getenv("AWS_PROFILE")
)

# A chat history that doesn't exist yet returns an empty array
print(chat_store.get_messages("123"))
# >>> []

# Initializing a chat history with a key of "SessionID = 123"
messages = [
    ChatMessage(role=MessageRole.USER, content="Who are you?"),
    ChatMessage(
        role=MessageRole.ASSISTANT, content="I am your helpful AI assistant."
    ),
]
chat_store.set_messages(key="123", messages=messages)
print(chat_store.get_messages("123"))
# >>> [ChatMessage(role=<MessageRole.USER: 'user'>, content='Who are you?', additional_kwargs={}),
#      ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='I am your helpful AI assistant.', additional_kwargs={})]

# Appending a message to an existing chat history
message = ChatMessage(role=MessageRole.USER, content="What can you do?")
chat_store.add_message(key="123", message=message)
print(chat_store.get_messages("123"))
# >>> [ChatMessage(role=<MessageRole.USER: 'user'>, content='Who are you?', additional_kwargs={}),
#      ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='I am your helpful AI assistant.', additional_kwargs={}),
#      ChatMessage(role=<MessageRole.USER: 'user'>, content='What can you do?', additional_kwargs={})]

PostgresChatStore#

Using PostgresChatStore, you can store your chat history remotely, without having to worry about manually persisting and loading the chat history.

from llama_index.storage.chat_store.postgres import PostgresChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = PostgresChatStore.from_uri(
    uri="postgresql+asyncpg://postgres:password@127.0.0.1:5432/database",
)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

TablestoreChatStore#

Using TablestoreChatStore, you can store your chat history remotely, without having to worry about manually persisting and loading the chat history.

Installation#

pip install llama-index-storage-chat-store-tablestore

Usage#

from llama_index.storage.chat_store.tablestore import TablestoreChatStore
from llama_index.core.memory import ChatMemoryBuffer

# 1. Create the Tablestore chat store
chat_store = TablestoreChatStore(
    endpoint="<end_point>",
    instance_name="<instance_name>",
    access_key_id="<access_key_id>",
    access_key_secret="<access_key_secret>",
)
# You need to create the table on first use
chat_store.create_table_if_not_exist()

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

Google AlloyDB Chat Store#

Using AlloyDBChatStore, you can store your chat history in AlloyDB, without having to worry about manually persisting and loading the chat history.

This tutorial demonstrates the synchronous interface. All synchronous methods have corresponding asynchronous methods.

Installation#

pip install llama-index
pip install llama-index-alloydb-pg
pip install llama-index-llms-vertex

Usage#

from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index_alloydb_pg import AlloyDBChatStore, AlloyDBEngine
from llama_index.llms.vertex import Vertex
import asyncio

# Replace with your own AlloyDB info
engine = AlloyDBEngine.from_instance(
    project_id=PROJECT_ID,
    region=REGION,
    cluster=CLUSTER,
    instance=INSTANCE,
    database=DATABASE,
    user=USER,
    password=PASSWORD,
)

engine.init_chat_store_table(table_name=TABLE_NAME)

chat_store = AlloyDBChatStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

llm = Vertex(model="gemini-1.5-flash-002", project=PROJECT_ID)

chat_engine = SimpleChatEngine(memory=memory, llm=llm, prefix_messages=[])

response = chat_engine.chat("Hello.")

print(response)

Google Cloud SQL for PostgreSQL Chat Store#

Using PostgresChatStore, you can store your chat history in Cloud SQL for Postgres, without having to worry about manually persisting and loading the chat history.

This tutorial demonstrates the synchronous interface. All synchronous methods have corresponding asynchronous methods.

Installation#

pip install llama-index
pip install llama-index-cloud-sql-pg
pip install llama-index-llms-vertex

Usage#

from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index_cloud_sql_pg import PostgresChatStore, PostgresEngine
from llama_index.llms.vertex import Vertex
import asyncio

# Replace with your own Cloud SQL info
engine = PostgresEngine.from_instance(
    project_id=PROJECT_ID,
    region=REGION,
    instance=INSTANCE,
    database=DATABASE,
    user=USER,
    password=PASSWORD,
)

engine.init_chat_store_table(table_name=TABLE_NAME)

chat_store = PostgresChatStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

llm = Vertex(model="gemini-1.5-flash-002", project=PROJECT_ID)

chat_engine = SimpleChatEngine(memory=memory, llm=llm, prefix_messages=[])

response = chat_engine.chat("Hello.")

print(response)