Databricks

Integrate with the Databricks LLMs API.
Prerequisites

Databricks personal access token, used to query and access Databricks model serving endpoints.
Databricks workspace in a supported region for the pay-per-token Foundation Model APIs.
Installation

If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.

%pip install llama-index-llms-databricks

!pip install llama-index

from llama_index.llms.databricks import Databricks
Set the environment variables:

export DATABRICKS_TOKEN=<your api key>
export DATABRICKS_SERVING_ENDPOINT=<your api serving endpoint>
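With both variables set, the client can pick them up at initialization, so only the model name is needed. A minimal sketch, assuming the Databricks class reads DATABRICKS_TOKEN and DATABRICKS_SERVING_ENDPOINT from the environment:

# Assumes DATABRICKS_TOKEN and DATABRICKS_SERVING_ENDPOINT are set in the environment
llm = Databricks(model="databricks-dbrx-instruct")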
Alternatively, you can pass in your API key and serving endpoint when initializing the LLM:
llm = Databricks(
    model="databricks-dbrx-instruct",
    api_key="your_api_key",
    api_base="https://[your-work-space].cloud.databricks.com/serving-endpoints/",
)
The list of available LLM models can be found here.
response = llm.complete("Explain the importance of open source LLMs")
print(response)
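complete returns a CompletionResponse object, and printing it prints the generated text. The string is also available directly on the response; a minimal sketch:

# The generated string can also be read off the response object
print(response.text)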
Call chat with a list of messages
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.chat(messages)
print(resp)
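chat returns a ChatResponse that wraps the assistant's reply as a ChatMessage; a minimal sketch of reading the role and content off it:

# The assistant reply is a ChatMessage on the response
print(resp.message.role)
print(resp.message.content)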
Streaming

Using stream_complete endpoint
response = llm.stream_complete("Explain the importance of open source LLMs")
for r in response:
    print(r.delta, end="")
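Each streamed chunk exposes the newly generated tokens as delta, so the full completion can be assembled by concatenating them. A minimal sketch:

# Accumulate the streamed deltas into the full completion
full_text = ""
for r in llm.stream_complete("Explain the importance of open source LLMs"):
    full_text += r.delta
print(full_text)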
Using stream_chat endpoint
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
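LlamaIndex LLMs also expose async counterparts of these calls (acomplete, achat, astream_complete, astream_chat). A minimal sketch, assuming the Databricks integration inherits the standard async LLM interface and that you're in a notebook where top-level await is available:

# Async completion mirrors llm.complete
response = await llm.acomplete("Explain the importance of open source LLMs")
print(response)

# Async streaming: awaiting returns an async generator of chunks
gen = await llm.astream_chat(messages)
async for r in gen:
    print(r.delta, end="")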