OctoAI¶
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-octoai
%pip install llama-index
%pip install octoai-sdk
OCTOAI_API_KEY = ""
OCTOAI_API_KEY = ""
Initialize the Integration with the Default Model¶
from llama_index.llms.octoai import OctoAI
octoai = OctoAI(token=OCTOAI_API_KEY)
Call the complete method with a prompt¶
response = octoai.complete("Paul Graham is ")
print(response)
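complete returns a completion response object; printing it shows the generated text. If you need the text as a plain string, it should also be exposed on the response, a sketch assuming the standard LlamaIndex CompletionResponse attributes:

# Assumes the standard CompletionResponse API from llama_index.core
completion_text = response.text
print(completion_text)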
Call the chat method with a list of messages¶
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system",
        content="Below is an instruction that describes a task. Write a response that appropriately completes the request.",
    ),
    ChatMessage(role="user", content="Write a blog about Seattle"),
]
response = octoai.chat(messages)
print(response)
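chat returns a chat response that wraps the assistant's ChatMessage. A sketch of pulling out just the reply text, assuming the standard LlamaIndex ChatResponse attributes:

# Access the assistant's reply and its text content
assistant_message = response.message
print(assistant_message.role, assistant_message.content)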
Streaming¶
Using the stream_complete endpoint
response = octoai.stream_complete("Paul Graham is ")
for r in response:
    print(r.delta, end="")
Using stream_chat with a list of messages
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system",
        content="Below is an instruction that describes a task. Write a response that appropriately completes the request.",
    ),
    ChatMessage(role="user", content="Write a blog about Seattle"),
]
response = octoai.stream_chat(messages)
for r in response:
    print(r.delta, end="")
Configure Model¶
# Pass your API token explicitly; otherwise the client looks up
# OCTOAI_TOKEN from your environment variables
octoai = OctoAI(
    model="mistral-7b-instruct", max_tokens=128, token=OCTOAI_API_KEY
)
response = octoai.complete("Paul Graham is ")
print(response)
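Once configured, the same OctoAI instance can be plugged into the rest of LlamaIndex. A minimal sketch, assuming you want it as the global default LLM via Settings:

from llama_index.core import Settings

# Use the configured OctoAI LLM as the default for indexes and query engines
Settings.llm = octoai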