Azure OpenAI
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-azure-openai
!pip install llama-index
Prerequisites
Environment Setup
Find your setup information: API base, API key, deployment name (i.e. engine), etc.
To find the setup information necessary, do the following:
- Go to the Azure OpenAI Studio here
- Go to the chat or completions playground (depending on which LLM you are setting up)
- Click "View code" (shown in the image below)
from IPython.display import Image

Image(filename="./azure_playground.png")
- Note down the api_type, api_base, api_version, engine (should be the same as the "deployment name" from before), and the key
from IPython.display import Image

Image(filename="./azure_env.png")
Configure environment variables
Using Azure deployments of OpenAI models is very similar to using regular OpenAI. You just need to configure a couple of extra environment variables:
- OPENAI_API_VERSION: set this to 2023-07-01-preview. This may change in the future.
- AZURE_OPENAI_ENDPOINT: your endpoint should look like the following: https://YOUR_RESOURCE_NAME.openai.azure.com/
- AZURE_OPENAI_API_KEY: your API key
import os

os.environ["AZURE_OPENAI_API_KEY"] = "<your-api-key>"
os.environ[
    "AZURE_OPENAI_ENDPOINT"
] = "https://<your-resource-name>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"
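As a quick sanity check (not part of the original notebook), you can verify that all three variables are set before constructing the client. The helper below is a minimal sketch; it assumes only the three variable names listed above:

```python
import os

# The three variables the Azure OpenAI integration expects (see above).
REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "OPENAI_API_VERSION",
]


def missing_azure_vars(env=os.environ):
    """Return the names of any required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Calling `missing_azure_vars()` before building the LLM lets you fail fast with a clear message instead of a confusing authentication error later.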
Use your LLM
from llama_index.llms.azure_openai import AzureOpenAI
Unlike regular OpenAI, you need to pass an engine argument in addition to model. The engine is the name of your model deployment you selected in Azure OpenAI Studio. See the "Find your setup information" section above for more details.
llm = AzureOpenAI(
    engine="simon-llm", model="gpt-35-turbo-16k", temperature=0.0
)
Alternatively, you can also skip setting environment variables and pass the parameters in directly via the constructor.
llm = AzureOpenAI(
    engine="my-custom-llm",
    model="gpt-35-turbo-16k",
    temperature=0.0,
    azure_endpoint="https://<your-resource-name>.openai.azure.com/",
    api_key="<your-api-key>",
    api_version="2023-07-01-preview",
)
Use the complete endpoint for text completion
response = llm.complete("The sky is a beautiful blue and")
print(response)
the sun is shining brightly. Fluffy white clouds float lazily across the sky, creating a picturesque scene. The vibrant blue color of the sky brings a sense of calm and tranquility. It is a perfect day to be outside, enjoying the warmth of the sun and the gentle breeze. The sky seems to stretch endlessly, reminding us of the vastness and beauty of the world around us. It is a reminder to appreciate the simple pleasures in life and to take a moment to admire the natural wonders that surround us.
response = llm.stream_complete("The sky is a beautiful blue and")
for r in response:
    print(r.delta, end="")
the sun is shining brightly. Fluffy white clouds float lazily across the sky, creating a picturesque scene. The vibrant blue color of the sky brings a sense of calm and tranquility. It is a perfect day to be outside, enjoying the warmth of the sun and the gentle breeze. The sky seems to stretch endlessly, reminding us of the vastness and beauty of the world around us. It is a reminder to appreciate the simple pleasures in life and to take a moment to pause and admire the natural wonders that surround us.
Use the chat endpoint for conversation
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with colorful personality."
    ),
    ChatMessage(role="user", content="Hello"),
]

response = llm.chat(messages)
print(response)
assistant: Ahoy there, matey! How be ye on this fine day? I be Captain Jolly Roger, the most colorful pirate ye ever did lay eyes on! What brings ye to me ship?
response = llm.stream_chat(messages)
for r in response:
    print(r.delta, end="")
Ahoy there, matey! How be ye on this fine day? I be Captain Jolly Roger, the most colorful pirate ye ever did lay eyes on! What brings ye to me ship?
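If you want the full response text in addition to the incremental prints, the deltas can be joined into one string. The helper below is our own sketch (not part of LlamaIndex); it assumes only that each streamed item carries a `.delta` attribute, as shown in the loops above, so it works for both `stream_complete` and `stream_chat`:

```python
def collect_stream(stream):
    """Print each incremental delta and return the concatenated full text."""
    parts = []
    for r in stream:
        print(r.delta, end="")
        parts.append(r.delta)
    return "".join(parts)
```

Usage would look like `full_text = collect_stream(llm.stream_chat(messages))`.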
Rather than adding the same parameters to each chat or completion call, you can set them at a per-instance level with additional_kwargs.
llm = AzureOpenAI(
    engine="simon-llm",
    model="gpt-35-turbo-16k",
    temperature=0.0,
    additional_kwargs={"user": "your_user_id"},
)